
Commit f9428e6

docs: add a Getting Started Section (#1577)
Co-authored-by: Paulo Valente <16843419+polvalente@users.noreply.github.com>
1 parent 1980af9 commit f9428e6

File tree: 13 files changed, +979 −657 lines changed

exla/README.md

Lines changed: 2 additions & 2 deletions
@@ -17,7 +17,7 @@ Then you can add `EXLA` as dependency in your `mix.exs`:
 ```elixir
 def deps do
   [
-    {:exla, "~> 0.5"}
+    {:exla, "~> 0.9"}
   ]
 end
 ```
@@ -26,7 +26,7 @@ If you are using Livebook or IEx, you can instead run:
 
 ```elixir
 Mix.install([
-  {:exla, "~> 0.5"}
+  {:exla, "~> 0.9"}
 ])
 ```

exla/guides/rotating-image.livemd

Lines changed: 1 addition & 1 deletion
@@ -6,7 +6,7 @@ Mix.install(
     {:nx, github: "elixir-nx/nx", override: true, sparse: "nx"},
     {:req, "~> 0.3.5"},
     {:kino, "~> 0.8.1"},
-    {:exla, "~> 0.5"},
+    {:exla, "~> 0.9"},
     {:stb_image, "~> 0.6"}
   ],
   config: [

nx/README.md

Lines changed: 2 additions & 2 deletions
@@ -59,7 +59,7 @@ Then you can add `Nx` as dependency in your `mix.exs`:
 ```elixir
 def deps do
   [
-    {:nx, "~> 0.5"}
+    {:nx, "~> 0.9"}
   ]
 end
 ```
@@ -68,7 +68,7 @@ If you are using Livebook or IEx, you can instead run:
 
 ```elixir
 Mix.install([
-  {:nx, "~> 0.5"}
+  {:nx, "~> 0.9"}
 ])
 ```

nx/guides/advanced/aggregation.livemd

Lines changed: 2 additions & 6 deletions
@@ -4,7 +4,7 @@
 
 ```elixir
 Mix.install([
-  {:nx, "~> 0.5"}
+  {:nx, "~> 0.9"}
 ])
 ```
 
@@ -93,7 +93,7 @@ m = ~MAT[
 >
 ```
 
-First, we'll compute the full-tensor aggregation. The calculations are developed below. We calculate an "array product" (aka [Hadamard product](https://en.wikipedia.org/wiki/Hadamard_product_(matrices)#:~:text=In%20mathematics%2C%20the%20Hadamard%20product,elements%20i%2C%20j%20of%20the), an element-wise product) of our tensor with the tensor of weights, then sum all the elements and divide by the sum of the weights.
+First, we'll compute the full-tensor aggregation. The calculations are developed below. We calculate an "array product" (aka [Hadamard product](<https://en.wikipedia.org/wiki/Hadamard_product_(matrices)#:~:text=In%20mathematics%2C%20the%20Hadamard%20product,elements%20i%2C%20j%20of%20the>), an element-wise product) of our tensor with the tensor of weights, then sum all the elements and divide by the sum of the weights.
 
 ```elixir
 w = ~MAT[
@@ -689,8 +689,6 @@ $$
 
 <!-- livebook:{"break_markdown":true} -->
 
-
-
 ```elixir
 Nx.argmax(t, axis: :z)
 ```
@@ -785,8 +783,6 @@ $$
 
 <!-- livebook:{"break_markdown":true} -->
 
-
-
 ```elixir
 Nx.argmin(t, axis: 3)
 ```
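The weighted aggregation the guide describes (element-wise product, sum, divide by the sum of the weights) can be sketched with small stand-in tensors. Note that `t` and `w` below are hypothetical values for illustration, not the guide's actual `~MAT` matrices:

```elixir
Mix.install([
  {:nx, "~> 0.9"}
])

# Hypothetical stand-ins for the guide's tensor and weights
t = Nx.tensor([[1.0, 2.0], [3.0, 4.0]])
w = Nx.tensor([[3.0, 1.0], [1.0, 1.0]])

# Hadamard (element-wise) product, then sum, then divide by the sum of the weights
t
|> Nx.multiply(w)
|> Nx.sum()
|> Nx.divide(Nx.sum(w))
# (3*1 + 1*2 + 1*3 + 1*4) / 6 = 2.0
```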
Lines changed: 153 additions & 0 deletions
@@ -0,0 +1,153 @@ (new file)

# Broadcasting

The dimensions of the tensors in an operation don't always match.
For example, you might want to subtract `1` from every
element of a `{2, 2}`-shaped tensor, like this:

$$
\begin{bmatrix}
1 & 2 \\\\
3 & 4
\end{bmatrix} - 1 =
\begin{bmatrix}
0 & 1 \\\\
2 & 3
\end{bmatrix}
$$

Mathematically, this is the same as:

$$
\begin{bmatrix}
1 & 2 \\\\
3 & 4
\end{bmatrix} -
\begin{bmatrix}
1 & 1 \\\\
1 & 1
\end{bmatrix} =
\begin{bmatrix}
0 & 1 \\\\
2 & 3
\end{bmatrix}
$$

This means we need a way to convert `1` to a `{2, 2}`-shaped tensor.
`Nx.broadcast/2` solves that problem. This function takes
a tensor or a scalar and a shape:

```elixir
Mix.install([
  {:nx, "~> 0.9"}
])

Nx.broadcast(1, {2, 2})
```

This call takes the scalar `1` and translates it
to a compatible shape by copying it. Sometimes, it's easier
to provide a tensor as the second argument and let `broadcast/2`
extract its shape:

```elixir
tensor = Nx.tensor([[1, 2], [3, 4]])
Nx.broadcast(1, tensor)
```

The code broadcasts `1` to the shape of `tensor`. In many operations
and functions, the broadcast happens automatically:

```elixir
Nx.subtract(tensor, 1)
```

This result is possible because Nx broadcasts _both tensors_
in `subtract/2` to compatible shapes. That means you can provide
scalar values as either argument:

```elixir
Nx.subtract(10, tensor)
```

Or subtract a row or column. Mathematically, it would look like this:

$$
\begin{bmatrix}
1 & 2 \\\\
3 & 4
\end{bmatrix} -
\begin{bmatrix}
1 & 2
\end{bmatrix} =
\begin{bmatrix}
0 & 0 \\\\
2 & 2
\end{bmatrix}
$$

which is the same as this:

$$
\begin{bmatrix}
1 & 2 \\\\
3 & 4
\end{bmatrix} -
\begin{bmatrix}
1 & 2 \\\\
1 & 2
\end{bmatrix} =
\begin{bmatrix}
0 & 0 \\\\
2 & 2
\end{bmatrix}
$$

This rewrite happens in Nx as well, through a broadcast operation. We want to
broadcast the tensor `[1, 2]` to match the `{2, 2}` shape:

```elixir
Nx.broadcast(Nx.tensor([1, 2]), {2, 2})
```

The `subtract` function in `Nx` takes care of that broadcast
implicitly, as discussed above:

```elixir
Nx.subtract(tensor, Nx.tensor([1, 2]))
```

The broadcast worked as expected, copying the `[1, 2]` row
enough times to fill a `{2, 2}`-shaped tensor. A tensor with a
dimension of size `1` is repeated along that dimension to fill
out the broadcast shape:

```elixir
[[1], [2]] |> Nx.tensor() |> Nx.broadcast({1, 2, 2})
```

```elixir
[[[1, 2, 3]]]
|> Nx.tensor()
|> Nx.broadcast({4, 2, 3})
```

Both of these examples copy parts of the tensor enough
times to fill out the broadcast shape. You can check out the
Nx broadcasting documentation for more details:

<!-- livebook:{"disable_formatting":true} -->

```elixir
h Nx.broadcast
```

Much of the time, you won't have to broadcast yourself: many of
the functions and operators Nx supports do so automatically,
because they are tensor-aware and implicitly broadcast their arguments.

Throughout this section, we have been invoking `Nx.subtract/2`;
our code would be more expressive if we could use its equivalent
mathematical operator. Fortunately, Nx provides a way. Next, we'll
dive into numerical definitions using `defn`.
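As a small preview of where that section is heading, here is a minimal `defn` sketch (the module name `MyMath` is ours, not from the guide) in which the `-` operator is tensor-aware and broadcasts implicitly:

```elixir
Mix.install([
  {:nx, "~> 0.9"}
])

defmodule MyMath do
  import Nx.Defn

  # Inside defn, `-` is tensor-aware: the row is broadcast across the matrix
  defn subtract_row(t, row), do: t - row
end

tensor = Nx.tensor([[1, 2], [3, 4]])
MyMath.subtract_row(tensor, Nx.tensor([1, 2]))
# Same result as Nx.subtract(tensor, Nx.tensor([1, 2]))
```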
Lines changed: 150 additions & 0 deletions
@@ -0,0 +1,150 @@ (new file)

# Installation

The only prerequisite for installing Nx is Elixir itself. If you don't have Elixir installed
on your machine, visit the [installation page](https://elixir-lang.org/install.html).

There are several ways to install Nx (Numerical Elixir), depending on your project type and needs.

## Using Mix in a Standard Elixir Project

If you are working inside a Mix project, the recommended way to install Nx is by adding it to the dependencies in your `mix.exs`:

1. Open `mix.exs` and modify the `deps` function:

```elixir
defp deps do
  [
    {:nx, "~> 0.9"} # Install the latest stable version
  ]
end
```

2. Fetch the dependencies by running this in your terminal:

```sh
mix deps.get
```

## Installing Nx from GitHub (Latest Development Version)

If you need the latest, unreleased features, install Nx directly from the GitHub repository.

1. Modify `mix.exs`:

```elixir
defp deps do
  [
    {:nx, github: "elixir-nx/nx", branch: "main", sparse: "nx"}
  ]
end
```

2. Fetch dependencies:

```sh
mix deps.get
```

## Installing Nx in a Standalone Script (Without a Mix Project)

If you don't have a Mix project and just want to run a standalone script, use `Mix.install/1` to dynamically fetch and install Nx:

```elixir
Mix.install([:nx])

tensor = Nx.tensor([1, 2, 3])
IO.inspect(tensor)
```

Run the script with:

```sh
elixir my_script.exs
```

Best for: quick experiments, small scripts, or one-off computations.

## Installing the Latest Nx from GitHub in a Standalone Script

To use the latest development version in a script (without a Mix project):

```elixir
Mix.install([
  {:nx, github: "elixir-nx/nx", branch: "main", sparse: "nx"}
])

tensor = Nx.tensor([1, 2, 3])
IO.inspect(tensor)
```

Run:

```sh
elixir my_script.exs
```

Best for: trying new features from Nx without creating a full project.

## Installing Nx with EXLA for GPU Acceleration

To enable GPU/TPU acceleration with Google's XLA backend, install Nx along with EXLA:

1. Modify `mix.exs`:

```elixir
defp deps do
  [
    {:nx, "~> 0.9"},
    {:exla, "~> 0.9"} # EXLA (Google XLA backend)
  ]
end
```

2. Fetch dependencies:

```sh
mix deps.get
```

3. Run with EXLA enabled:

```elixir
Nx.default_backend(EXLA.Backend)
Nx.Defn.default_options(compiler: EXLA)
```

Best for: running Nx on GPUs or TPUs using Google's XLA compiler.

## Installing Nx with Torchx for PyTorch Acceleration

To run Nx operations on PyTorch's backend (LibTorch):

1. Modify `mix.exs`:

```elixir
defp deps do
  [
    {:nx, "~> 0.9"},
    {:torchx, "~> 0.9"} # PyTorch backend
  ]
end
```

2. Fetch dependencies:

```sh
mix deps.get
```

3. Run with Torchx enabled:

```elixir
Nx.default_backend(Torchx.Backend)
```

Best for: deep learning applications with PyTorch acceleration.
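For standalone scripts, the backend can also be selected at install time via the `config:` option of `Mix.install/2`, rather than calling `Nx.default_backend/1` afterwards. A sketch, assuming EXLA compiles on your machine:

```elixir
# Equivalent to calling Nx.default_backend(EXLA.Backend) at startup
Mix.install(
  [
    {:nx, "~> 0.9"},
    {:exla, "~> 0.9"}
  ],
  config: [nx: [default_backend: EXLA.Backend]]
)

Nx.tensor([1, 2, 3]) |> Nx.sum() |> IO.inspect()
```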
