
Commit 22aab8f

docs: add installation guide
1 parent 0a97e25 commit 22aab8f

File tree

3 files changed: +210 −4 lines changed

nx/guides/getting_started/installation.md

Lines changed: 186 additions & 0 deletions
@@ -0,0 +1,186 @@
# Installation

The only prerequisite for installing Nx is Elixir itself. If you don't have Elixir installed
on your machine, you can visit this [installation page](https://elixir-lang.org/install.html).

There are several ways to install Nx (Numerical Elixir), depending on your project type and needs.

## Using Mix in a Standard Elixir Project

If you are working inside a Mix project, the recommended way to install Nx is by adding it to your mix.exs dependencies:

1. Open mix.exs and modify the deps function:

```elixir
defp deps do
  [
    {:nx, "~> 0.5"} # Install the latest stable version
  ]
end
```

2. Fetch the dependencies by running in the terminal:

```sh
mix deps.get
```
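Once the dependency is fetched and compiled, a quick sanity check confirms the install worked. A minimal sketch, run inside an IEx session started with `iex -S mix`:

```elixir
# Build a small tensor and reduce it.
t = Nx.tensor([1, 2, 3])

# Nx.sum/1 returns a scalar tensor; Nx.to_number/1 converts it
# back to a plain Elixir number.
t |> Nx.sum() |> Nx.to_number()
# => 6
```

If this returns 6 without raising, Nx is installed and usable in your project.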
## Installing Nx from GitHub (Latest Development Version)

If you need the latest, unreleased features, install Nx directly from the GitHub repository.

1. Modify mix.exs:

```elixir
defp deps do
  [
    {:nx, github: "elixir-nx/nx", branch: "main"}
  ]
end
```

2. Fetch dependencies:

```sh
mix deps.get
```
## Installing Nx in a Standalone Script (Without a Mix Project)

If you don't have a Mix project and just want to run a standalone script, use Mix.install/1 to dynamically fetch and install Nx.

```elixir
Mix.install([:nx])

tensor = Nx.tensor([1, 2, 3])
IO.inspect(tensor)
```

Run the script with:

```sh
elixir my_script.exs
```

Best for: Quick experiments, small scripts, or one-off computations.
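Mix.install/1 also accepts full dependency tuples, so a script can pin a version range the same way mix.exs does. A small sketch, assuming the same `~> 0.5` requirement used above:

```elixir
# Pin the Nx version instead of always taking the latest release.
Mix.install([
  {:nx, "~> 0.5"}
])

# Nx.shape/1 returns the tensor's shape as a tuple.
tensor = Nx.tensor([[1, 2], [3, 4]])
IO.inspect(Nx.shape(tensor))
# => {2, 2}
```

Pinning keeps standalone scripts reproducible even after new Nx releases ship.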
## Installing the Latest Nx from GitHub in a Standalone Script

To use the latest development version in a script (without a Mix project):

```elixir
Mix.install([
  {:nx, github: "elixir-nx/nx", branch: "main"}
])

tensor = Nx.tensor([1, 2, 3])
IO.inspect(tensor)
```

Run:

```sh
elixir my_script.exs
```

Best for: Trying new features from Nx without creating a full project.
## Installing Nx with EXLA for GPU Acceleration

To enable GPU/TPU acceleration with Google's XLA backend, install Nx along with EXLA:

1. Modify mix.exs:

```elixir
defp deps do
  [
    {:nx, "~> 0.5"},
    {:exla, "~> 0.5"} # EXLA (Google XLA backend)
  ]
end
```

2. Fetch dependencies:

```sh
mix deps.get
```

3. Set EXLA as the default backend:

```elixir
Nx.default_backend(EXLA.Backend)
```

Best for: Running Nx on GPUs or TPUs using Google's XLA compiler.
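Besides setting a process-wide default, a backend can also be chosen for a single tensor through the `:backend` option. A sketch, assuming EXLA compiled successfully on your machine:

```elixir
# Make EXLA the default backend for tensors created from here on.
Nx.default_backend(EXLA.Backend)

# Or place only this tensor on the EXLA backend, leaving the
# default untouched.
t = Nx.tensor([1.0, 2.0, 3.0], backend: EXLA.Backend)
Nx.sum(t)
```

The per-tensor option is handy when you want to compare backends side by side in the same session.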
## Installing Nx with Torchx for PyTorch Acceleration

To run Nx operations on PyTorch's backend (LibTorch):

1. Modify mix.exs:

```elixir
defp deps do
  [
    {:nx, "~> 0.5"},
    {:torchx, "~> 0.5"} # PyTorch backend
  ]
end
```

2. Fetch dependencies:

```sh
mix deps.get
```

3. Set Torchx as the default backend:

```elixir
Nx.default_backend(Torchx.Backend)
```

Best for: Deep learning applications with PyTorch acceleration.
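As with EXLA, the Torchx backend can be selected globally or per tensor via the `:backend` option. A minimal sketch, assuming Torchx and its LibTorch dependency built successfully:

```elixir
# Route all newly created tensors through LibTorch.
Nx.default_backend(Torchx.Backend)

# Or create a single tensor on the Torchx backend and
# multiply it with itself (2-D Nx.dot/2 is a matrix product).
t = Nx.tensor([[1, 2], [3, 4]], backend: Torchx.Backend)
Nx.dot(t, t)
```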
## Installing Nx with OpenBLAS for CPU Optimization

To optimize CPU performance with OpenBLAS:

1. Install OpenBLAS (libopenblas):

   - Ubuntu/Debian:

     ```sh
     sudo apt install libopenblas-dev
     ```

   - macOS (using Homebrew):

     ```sh
     brew install openblas
     ```

2. Modify mix.exs:

```elixir
defp deps do
  [
    {:nx, "~> 0.5"},
    {:openblas, "~> 0.5"} # CPU-optimized BLAS backend
  ]
end
```

3. Fetch dependencies:

```sh
mix deps.get
```

Best for: Optimizing CPU-based tensor computations.
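One rough way to sanity-check CPU performance after installation is to time a matrix product with Erlang's `:timer.tc/1`. This is a quick sketch, not a rigorous benchmark:

```elixir
# Two 100x100 float tensors filled with increasing values.
a = Nx.iota({100, 100}, type: :f32)
b = Nx.iota({100, 100}, type: :f32)

# :timer.tc/1 returns {elapsed_microseconds, result}.
{micros, _result} = :timer.tc(fn -> Nx.dot(a, b) end)
IO.puts("Nx.dot/2 took #{micros} µs")
```

Comparing this number before and after switching backends gives a crude feel for the speedup on your machine.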

nx/guides/getting_started/quickstart.livemd

Lines changed: 22 additions & 3 deletions
@@ -106,6 +106,8 @@ Names make your code more expressive:
 Nx.tensor([[[1, 2, 3], [4, 5, 6], [7, 8, 9]]], names: [:batch, :height, :width])
 ```
 
+We created a tensor of shape `{1, 3, 3}`, with three axes named `batch`, `height`, and `width`.
+
 You can also leave dimension names as `nil` (which is the default):
 
 ```elixir
@@ -128,7 +130,7 @@ tensor = Nx.tensor([[1, 2], [3, 4]], names: [:y, :x])
 tensor[[0, 1]]
 ```
 
-Negative indices will start counting from the end of the axis.
+Negative indices will start counting from the end of the axis.
 `-1` is the last entry, `-2` the second to last and so on.
 
 ```elixir
@@ -167,6 +169,23 @@ Now,
 # ...your code here...
 ```
 
+### Tensor shape and reshape
+
+```elixir
+Nx.shape(tensor)
+```
+
+We can also create a new tensor with a new shape using `Nx.reshape/2`:
+
+```elixir
+Nx.reshape(tensor, {1, 4}, names: [:batches, :values])
+```
+
+This operation reuses all of the tensor data and simply
+changes the metadata, so it has no notable cost.
+
+The new tensor has the same type, but a new shape.
+
 ### Floats and Complex numbers
 
 Besides single-precision (32 bits), floats can have other kinds of precision, such as half-precision (16) or
@@ -177,13 +196,13 @@ Nx.tensor([0.0, 0.2, 0.4, 1.0], type: :f16)
 ```
 
 ```elixir
-Nx.tensor([0.0, 0.2, 0.4, 1.0, type: :f64)
+Nx.tensor([0.0, 0.2, 0.4, 1.0], type: :f64)
 ```
 
 Brain floats are also supported:
 
 ```elixir
-Nx.tensor([0.0, 0.2, 0.4, 1.0, type: :bf16)
+Nx.tensor([0.0, 0.2, 0.4, 1.0], type: :bf16)
 ```
 
 Certain backends and compilers support 8-bit floats. The precision

nx/mix.exs

Lines changed: 2 additions & 1 deletion
@@ -59,6 +59,7 @@ defmodule Nx.MixProject do
       "CHANGELOG.md",
       "guides/intro-to-nx.livemd",
       "guides/getting_started/introduction.md",
+      "guides/getting_started/installation.md",
       "guides/getting_started/quickstart.livemd",
       "guides/advanced/vectorization.livemd",
       "guides/advanced/aggregation.livemd",
@@ -114,7 +115,7 @@ defmodule Nx.MixProject do
       ]
     ],
     groups_for_extras: [
-      Getting_Started: ~r"^guides/getting_started/",
+      "Getting Started": ~r"^guides/getting_started/",
       Exercises: ~r"^guides/exercises/",
       Advanced: ~r"^guides/advanced/"
     ]
