Merged
29 commits
20c9ac0
docs: add a Getting Started Section
TomasPegado Feb 3, 2025
ba529c6
Update introduction.md
polvalente Feb 4, 2025
0a97e25
Update quickstart.livemd
polvalente Feb 4, 2025
22aab8f
docs: add installation guide
TomasPegado Feb 10, 2025
ffe8caf
Fixed to_pointer/2 docs :kind -> :mode (#1578)
pejrich Feb 4, 2025
f252073
Refactor EXLA NIFs to use Fine (#1581)
jonatanklosko Feb 19, 2025
de77fd0
Change Nx.to_pointer/2 and Nx.from_pointer/5 to raise on errors (#1582)
jonatanklosko Feb 19, 2025
8955ff2
Lu matrix decomposition (#1587)
TomasPegado Mar 5, 2025
6f3b58e
docs: autodiff docs (#1580)
polvalente Mar 5, 2025
dacb96f
chore: hide Nx.LinAlg.LU module
polvalente Mar 5, 2025
b192288
Hide symbols from the NIF shared library (#1589)
jonatanklosko Mar 6, 2025
162d60b
feat(exla): take advantage of the new LU impl (#1590)
polvalente Mar 6, 2025
d5518fd
feat: allow explicitly disabling CUDA step (#1588)
polvalente Mar 6, 2025
6cc84d1
fix(exla): batched eigh (#1591)
polvalente Mar 12, 2025
6c33108
fix(exla): respect device id when automatic transfers are disabled (#…
polvalente Mar 13, 2025
b397939
test(exla): add more tests for LinAlg functions (#1594)
polvalente Mar 17, 2025
16f07f9
fix(exla): vectorized gather (#1595)
polvalente Mar 17, 2025
a48f578
feat: Nx.Defn.Graph (#1544)
polvalente Mar 18, 2025
a57c065
Fix(exla): triangular_solve with batched matrix input (#1596)
TomasPegado Mar 19, 2025
f4cedfd
Clarify composite docs
josevalim Mar 20, 2025
5e8abee
docs: getting started section
TomasPegado Mar 26, 2025
2eef046
docs: removing intro-to-nx.livemd
TomasPegado Mar 29, 2025
2129801
docs: removing intro-to-nx.livemd references
TomasPegado Mar 29, 2025
7905414
docs: improving deftransform example
TomasPegado Mar 29, 2025
2a7df18
Merge branch 'elixir-nx:main' into docs_refact
TomasPegado Mar 31, 2025
f4c2e66
chore: minor doc changes
polvalente Apr 2, 2025
28fb432
chore: more changes due to code review
polvalente Apr 2, 2025
fb879b5
chore: even more changes due to code review
polvalente Apr 2, 2025
b8f794e
chore: final set of more changes due to code review
polvalente Apr 2, 2025
4 changes: 2 additions & 2 deletions exla/README.md
@@ -17,7 +17,7 @@ Then you can add `EXLA` as dependency in your `mix.exs`:
```elixir
def deps do
[
- {:exla, "~> 0.5"}
+ {:exla, "~> 0.9"}
]
end
```
@@ -26,7 +26,7 @@ If you are using Livebook or IEx, you can instead run:

```elixir
Mix.install([
- {:exla, "~> 0.5"}
+ {:exla, "~> 0.9"}
])
```

2 changes: 1 addition & 1 deletion exla/guides/rotating-image.livemd
@@ -6,7 +6,7 @@ Mix.install(
{:nx, github: "elixir-nx/nx", override: true, sparse: "nx"},
{:req, "~> 0.3.5"},
{:kino, "~> 0.8.1"},
- {:exla, "~> 0.5"},
+ {:exla, "~> 0.9"},
{:stb_image, "~> 0.6"}
],
config: [
4 changes: 2 additions & 2 deletions nx/README.md
@@ -59,7 +59,7 @@ Then you can add `Nx` as dependency in your `mix.exs`:
```elixir
def deps do
[
- {:nx, "~> 0.5"}
+ {:nx, "~> 0.9"}
]
end
```
@@ -68,7 +68,7 @@ If you are using Livebook or IEx, you can instead run:

```elixir
Mix.install([
- {:nx, "~> 0.5"}
+ {:nx, "~> 0.9"}
])
```

8 changes: 2 additions & 6 deletions nx/guides/advanced/aggregation.livemd
@@ -4,7 +4,7 @@

```elixir
Mix.install([
- {:nx, "~> 0.5"}
+ {:nx, "~> 0.9"}
])
```

@@ -93,7 +93,7 @@ m = ~MAT[
>
```

- First, we'll compute the full-tensor aggregation. The calculations are developed below. We calculate an "array product" (aka [Hadamard product](https://en.wikipedia.org/wiki/Hadamard_product_(matrices)#:~:text=In%20mathematics%2C%20the%20Hadamard%20product,elements%20i%2C%20j%20of%20the), an element-wise product) of our tensor with the tensor of weights, then sum all the elements and divide by the sum of the weights.
+ First, we'll compute the full-tensor aggregation. The calculations are developed below. We calculate an "array product" (aka [Hadamard product](<https://en.wikipedia.org/wiki/Hadamard_product_(matrices)#:~:text=In%20mathematics%2C%20the%20Hadamard%20product,elements%20i%2C%20j%20of%20the>), an element-wise product) of our tensor with the tensor of weights, then sum all the elements and divide by the sum of the weights.

```elixir
w = ~MAT[
@@ -689,8 +689,6 @@ $$

<!-- livebook:{"break_markdown":true} -->

-
-
```elixir
Nx.argmax(t, axis: :z)
```
@@ -785,8 +783,6 @@ $$

<!-- livebook:{"break_markdown":true} -->

-
-
```elixir
Nx.argmin(t, axis: 3)
```
153 changes: 153 additions & 0 deletions nx/guides/getting_started/broadcasting.livemd
@@ -0,0 +1,153 @@
# Broadcasting

The shapes of the tensors given to an operator don't always match.
For example, you might want to subtract `1` from every
element of a `{2, 2}`-shaped tensor, like this:

$$
\begin{bmatrix}
1 & 2 \\\\
3 & 4
\end{bmatrix} - 1 =
\begin{bmatrix}
0 & 1 \\\\
2 & 3
\end{bmatrix}
$$

Mathematically, this is the same as:

$$
\begin{bmatrix}
1 & 2 \\\\
3 & 4
\end{bmatrix} -
\begin{bmatrix}
1 & 1 \\\\
1 & 1
\end{bmatrix} =
\begin{bmatrix}
0 & 1 \\\\
2 & 3
\end{bmatrix}
$$

This means we need a way to convert `1` to a `{2, 2}`-shaped tensor.
`Nx.broadcast/2` solves that problem. This function takes
a tensor or a scalar and a shape.

```elixir
Mix.install([
{:nx, "~> 0.9"}
])

Nx.broadcast(1, {2, 2})
```

This call takes the scalar `1` and translates it
to a compatible shape by copying it. Sometimes, it's easier
to provide a tensor as the second argument, and let `broadcast/2`
extract its shape:

```elixir
tensor = Nx.tensor([[1, 2], [3, 4]])
Nx.broadcast(1, tensor)
```

The code broadcasts `1` to the shape of `tensor`. In many operators
and functions, the broadcast happens automatically:

```elixir
Nx.subtract(tensor, 1)
```

This result is possible because Nx broadcasts _both tensors_
in `subtract/2` to compatible shapes. That means you can provide
scalar values as either argument:

```elixir
Nx.subtract(10, tensor)
```

Or subtract a row or column. Mathematically, it would look like this:

$$
\begin{bmatrix}
1 & 2 \\\\
3 & 4
\end{bmatrix} -
\begin{bmatrix}
1 & 2
\end{bmatrix} =
\begin{bmatrix}
0 & 0 \\\\
2 & 2
\end{bmatrix}
$$

which is the same as this:

$$
\begin{bmatrix}
1 & 2 \\\\
3 & 4
\end{bmatrix} -
\begin{bmatrix}
1 & 2 \\\\
1 & 2
\end{bmatrix} =
\begin{bmatrix}
0 & 0 \\\\
2 & 2
\end{bmatrix}
$$

This rewrite happens in Nx as well, through a broadcast operation. We want to
broadcast the tensor `[1, 2]` to match the `{2, 2}` shape:

```elixir
Nx.broadcast(Nx.tensor([1, 2]), {2, 2})
```

The `subtract` function in `Nx` takes care of that broadcast
implicitly, as discussed above:

```elixir
Nx.subtract(tensor, Nx.tensor([1, 2]))
```

The broadcast worked as expected, copying the `[1, 2]` row
enough times to fill a `{2, 2}`-shaped tensor. Axes of size
`1` are also copied as many times as needed to fill the broadcast shape:

```elixir
[[1], [2]] |> Nx.tensor() |> Nx.broadcast({1, 2, 2})
```

```elixir
[[[1, 2, 3]]]
|> Nx.tensor()
|> Nx.broadcast({4, 2, 3})
```

Both of these examples copy parts of the tensor enough
times to fill out the broadcast shape. You can check out the
Nx broadcasting documentation for more details:

<!-- livebook:{"disable_formatting":true} -->

```elixir
h Nx.broadcast
```
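Broadcasting only works when every pair of corresponding dimensions either matches or has size `1`. As a small sketch beyond the examples above (hedged: the exact error message depends on your Nx version), an incompatible pair of shapes raises:

```elixir
# Shapes are compatible when each axis pair matches or one of them is 1.
# A {3}-shaped tensor has no axis that fits a {2, 2} target, so this raises.
try do
  Nx.broadcast(Nx.tensor([1, 2, 3]), {2, 2})
rescue
  error -> IO.puts("broadcast failed: #{Exception.message(error)}")
end
```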

Much of the time, you won't have to broadcast yourself: many of
the functions and operators Nx supports are tensor-aware and
broadcast their arguments implicitly.
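As a quick illustration of implicit broadcasting between two non-scalar tensors (a sketch going one step beyond the scalar cases above), a `{2, 1}` column and a `{1, 2}` row are both broadcast to `{2, 2}` before the operation runs:

```elixir
column = Nx.tensor([[1], [2]])  # shape {2, 1}
row = Nx.tensor([[10, 20]])     # shape {1, 2}

# Both operands are broadcast to {2, 2}, then added element-wise.
Nx.add(column, row)
# => [[11, 21], [12, 22]]
```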

Throughout this section, we have been invoking `Nx.subtract/2` and
our code would be more expressive if we could use its equivalent
mathematical operator. Fortunately, Nx provides a way. Next, we'll
dive into numerical definitions using `defn`.
150 changes: 150 additions & 0 deletions nx/guides/getting_started/installation.md
@@ -0,0 +1,150 @@
# Installation

The only prerequisite for installing Nx is Elixir itself. If you don't have Elixir installed
on your machine, follow the [installation page](https://elixir-lang.org/install.html).

There are several ways to install Nx (Numerical Elixir), depending on your project type and needs.

## Using Nx in a Standard Elixir Project

If you are working inside a Mix project, the recommended way to install Nx is by adding it to your `mix.exs` dependencies:

1. Open `mix.exs` and modify the `deps` function:

```elixir
defp deps do
[
{:nx, "~> 0.9"} # Install the latest stable version
]
end
```

2. Fetch the dependencies by running this in your terminal:

```sh
mix deps.get
```
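To confirm the dependency is available, you can start a session with `iex -S mix` and create a tensor (a quick smoke test; the printed tensor formatting may differ between Nx versions):

```elixir
# Inside an `iex -S mix` session:
tensor = Nx.tensor([[1, 2], [3, 4]])

# Summing all elements returns a scalar tensor holding 10.
Nx.sum(tensor)
```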

## Installing Nx from GitHub (Latest Development Version)

If you need the latest, unreleased features, install Nx directly from the GitHub repository.

1. Modify `mix.exs`:

```elixir
defp deps do
[
{:nx, github: "elixir-nx/nx", branch: "main", sparse: "nx"}
]
end
```

2. Fetch dependencies:

```sh
mix deps.get
```

## Installing Nx in a Standalone Script (Without a Mix Project)

If you don’t have a Mix project and just want to run a standalone script, use `Mix.install/1` to dynamically fetch and install Nx.

```elixir
Mix.install([:nx])

tensor = Nx.tensor([1, 2, 3])
IO.inspect(tensor)
```

Run the script with:

```sh
elixir my_script.exs
```

Best for: Quick experiments, small scripts, or one-off computations.

## Installing the Latest Nx from GitHub in a Standalone Script

To use the latest development version in a script (without a Mix project):

```elixir
Mix.install([
{:nx, github: "elixir-nx/nx", branch: "main", sparse: "nx"}
])

tensor = Nx.tensor([1, 2, 3])
IO.inspect(tensor)
```

Run:

```sh
elixir my_script.exs
```

Best for: Trying new features from Nx without creating a full project.

## Installing Nx with EXLA for GPU Acceleration

To enable GPU/TPU acceleration with Google’s XLA backend, install Nx along with EXLA:

1. Modify `mix.exs`:

```elixir
defp deps do
[
{:nx, "~> 0.9"},
{:exla, "~> 0.9"} # EXLA (Google XLA Backend)
]
end
```

2. Fetch dependencies:

```sh
mix deps.get
```

3. Configure EXLA as the default backend and `defn` compiler:

```elixir
Nx.default_backend(EXLA.Backend)
Nx.Defn.default_options(compiler: EXLA)
```

Best for: Running Nx on GPUs or TPUs using Google’s XLA compiler.
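Instead of calling these functions at runtime, you can set the same defaults through application configuration in `config/config.exs` (a sketch of the equivalent config keys; double-check the EXLA docs for your version):

```elixir
import Config

# Make EXLA the default backend for tensor operations
config :nx, :default_backend, EXLA.Backend

# Make EXLA the default compiler for defn-compiled functions
config :nx, :default_defn_options, compiler: EXLA
```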

## Installing Nx with Torchx for PyTorch Acceleration

To run Nx operations on PyTorch’s backend (LibTorch):

1. Modify `mix.exs`:

```elixir
defp deps do
[
{:nx, "~> 0.9"},
{:torchx, "~> 0.9"} # PyTorch Backend
]
end
```

2. Fetch dependencies:

```sh
mix deps.get
```

3. Configure Torchx as the default backend:

```elixir
Nx.default_backend(Torchx.Backend)
```

Best for: Deep learning applications with PyTorch acceleration.