Merged
Commits (29)
20c9ac0
docs: add a Getting Started Section
TomasPegado Feb 3, 2025
ba529c6
Update introduction.md
polvalente Feb 4, 2025
0a97e25
Update quickstart.livemd
polvalente Feb 4, 2025
22aab8f
docs: add installation guide
TomasPegado Feb 10, 2025
ffe8caf
Fixed to_pointer/2 docs :kind -> :mode (#1578)
pejrich Feb 4, 2025
f252073
Refactor EXLA NIFs to use Fine (#1581)
jonatanklosko Feb 19, 2025
de77fd0
Change Nx.to_pointer/2 and Nx.from_pointer/5 to raise on errors (#1582)
jonatanklosko Feb 19, 2025
8955ff2
Lu matrix decomposition (#1587)
TomasPegado Mar 5, 2025
6f3b58e
docs: autodiff docs (#1580)
polvalente Mar 5, 2025
dacb96f
chore: hide Nx.LinAlg.LU module
polvalente Mar 5, 2025
b192288
Hide symbols from the NIF shared library (#1589)
jonatanklosko Mar 6, 2025
162d60b
feat(exla): take advantage of the new LU impl (#1590)
polvalente Mar 6, 2025
d5518fd
feat: allow explicitly disabling CUDA step (#1588)
polvalente Mar 6, 2025
6cc84d1
fix(exla): batched eigh (#1591)
polvalente Mar 12, 2025
6c33108
fix(exla): respect device id when automatic transfers are disabled (#…
polvalente Mar 13, 2025
b397939
test(exla): add more tests for LinAlg functions (#1594)
polvalente Mar 17, 2025
16f07f9
fix(exla): vectorized gather (#1595)
polvalente Mar 17, 2025
a48f578
feat: Nx.Defn.Graph (#1544)
polvalente Mar 18, 2025
a57c065
Fix(exla): triangular_solve with batched matrix input (#1596)
TomasPegado Mar 19, 2025
f4cedfd
Clarify composite docs
josevalim Mar 20, 2025
5e8abee
docs: getting started section
TomasPegado Mar 26, 2025
2eef046
docs: removing intro-to-nx.livemd
TomasPegado Mar 29, 2025
2129801
docs: removing intro-to-nx.livemd references
TomasPegado Mar 29, 2025
7905414
docs: improving deftransform example
TomasPegado Mar 29, 2025
2a7df18
Merge branch 'elixir-nx:main' into docs_refact
TomasPegado Mar 31, 2025
f4c2e66
chore: minor doc changes
polvalente Apr 2, 2025
28fb432
chore: more changes due to code review
polvalente Apr 2, 2025
fb879b5
chore: even more changes due to code review
polvalente Apr 2, 2025
b8f794e
chore: final set of more changes due to code review
polvalente Apr 2, 2025
186 changes: 186 additions & 0 deletions nx/guides/getting_started/installation.md
@@ -0,0 +1,186 @@
# Installation

The only prerequisite for installing Nx is Elixir itself. If you don't have Elixir installed
on your machine, see the [installation page](https://elixir-lang.org/install.html).

There are several ways to install Nx (Numerical Elixir), depending on your project type and needs.

## Using Mix in a standardElixir Project
**Suggested change:**

```diff
- ## Using Mix in a standardElixir Project
+ ## Using Nx in a Standard Elixir Project
```

If you are working inside a Mix project, the recommended way to install Nx is by adding it to your mix.exs dependencies:

1. Open mix.exs and modify the deps function:

```elixir
defp deps do
  [
    {:nx, "~> 0.5"} # Install the latest stable version
  ]
end
```

2. Fetch the dependencies by running in the terminal:

```sh
mix deps.get
```
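
With the dependency fetched, a quick sanity check from IEx (`iex -S mix`) confirms Nx is available — a minimal sketch, assuming the project compiles:

```elixir
# In IEx, create a small tensor and inspect its shape.
tensor = Nx.tensor([1, 2, 3])
Nx.shape(tensor)
#=> {3}
```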

## Installing Nx from GitHub (Latest Development Version)

If you need the latest, unreleased features, install Nx directly from the GitHub repository.

1. Modify mix.exs:

```elixir
defp deps do
  [
    {:nx, github: "elixir-nx/nx", branch: "main"}
  ]
end
```

**Suggested change:**

```diff
- {:nx, github: "elixir-nx/nx", branch: "main"}
+ {:nx, github: "elixir-nx/nx", branch: "main", sparse: "nx"}
```

2. Fetch dependencies:

```sh
mix deps.get

```

## Installing Nx in a Standalone Script (Without a Mix Project)

If you don’t have a Mix project and just want to run a standalone script, use Mix.install/1 to dynamically fetch and install Nx.
**Suggested change:**

```diff
- If you don’t have a Mix project and just want to run a standalone script, use Mix.install/1 to dynamically fetch and install Nx.
+ If you don’t have a Mix project and just want to run a standalone script, use `Mix.install/1` to dynamically fetch and install Nx.
```

```elixir
Mix.install([:nx])

require Nx

tensor = Nx.tensor([1, 2, 3])
IO.inspect(tensor)

```

Run the script with:

```sh
elixir my_script.exs

```

Best for: Quick experiments, small scripts, or one-off computations.

## Installing the Latest Nx from GitHub in a Standalone Script

To use the latest development version in a script (without a Mix project):

```elixir
Mix.install([
  {:nx, github: "elixir-nx/nx", branch: "main"}
])

require Nx

tensor = Nx.tensor([1, 2, 3])
IO.inspect(tensor)
```

**Suggested change:**

```diff
- {:nx, github: "elixir-nx/nx", branch: "main"}
+ {:nx, github: "elixir-nx/nx", branch: "main", sparse: "nx"}
```

Run:

```sh
elixir my_script.exs

```

Best for: Trying new features from Nx without creating a full project.

## Installing Nx with EXLA for GPU Acceleration

To enable GPU/TPU acceleration with Google’s XLA backend, install Nx along with EXLA:

1. Modify mix.exs:

```elixir
defp deps do
  [
    {:nx, "~> 0.5"},
    {:exla, "~> 0.5"} # EXLA (Google XLA Backend)
  ]
end
```

2. Fetch dependencies:

```sh
mix deps.get
```

3. Set EXLA as the default backend:

```elixir
Nx.default_backend(EXLA.Backend)
```

Best for: Running Nx on GPUs or TPUs using Google’s XLA compiler.
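
The backend can also be set project-wide at compile time; a minimal sketch of the usual `config/config.exs` entry, assuming the standard `:nx` configuration key:

```elixir
# config/config.exs — make EXLA the default backend for all Nx operations
import Config

config :nx, default_backend: EXLA.Backend
```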

## Installing Nx with Torchx for PyTorch Acceleration

To run Nx operations on PyTorch’s backend (LibTorch):

1. Modify mix.exs:

```elixir
defp deps do
  [
    {:nx, "~> 0.5"},
    {:torchx, "~> 0.5"} # PyTorch Backend
  ]
end
```
**Review comment on lines +123 to +136:**

> Let's not mention Torchx. We could maybe reference EMLX, but it's not even released yet, so let's leave this for later.

2. Fetch dependencies:

```sh
mix deps.get
```

3. Set Torchx as the default backend:

```elixir
Nx.default_backend(Torchx.Backend)
```

Best for: Deep learning applications with PyTorch acceleration.

## Installing Nx with OpenBLAS for CPU Optimization

To optimize CPU performance with OpenBLAS:

1. Install OpenBLAS (libopenblas):
- Ubuntu/Debian:
```sh
sudo apt install libopenblas-dev
```
- macOS (using Homebrew):
```sh
brew install openblas
```
2. Modify mix.exs:

```elixir
defp deps do
  [
    {:nx, "~> 0.5"},
    {:openblas, "~> 0.5"} # CPU-optimized BLAS backend
  ]
end
```

3. Fetch dependencies:

```sh
mix deps.get
```

Best for: Optimizing CPU-based tensor computations.
**Suggested change** (reviewer): remove the Torchx and OpenBLAS sections.

25 changes: 22 additions & 3 deletions nx/guides/getting_started/quickstart.livemd
@@ -106,6 +108,8 @@ Names make your code more expressive:
Nx.tensor([[[1, 2, 3], [4, 5, 6], [7, 8, 9]]], names: [:batch, :height, :width])
```

We created a tensor of shape `{1, 3, 3}`, with three named axes: `batch`, `height`, and `width`.

You can also leave dimension names as `nil` (which is the default):

```elixir
@@ -128,7 +130,7 @@ tensor = Nx.tensor([[1, 2], [3, 4]], names: [:y, :x])
tensor[[0, 1]]
```

Negative indices will start counting from the end of the axis.
`-1` is the last entry, `-2` the second to last and so on.

```elixir
@@ -167,6 +169,23 @@ Now,
# ...your code here...
```

### Tensor shape and reshape

We can inspect a tensor's shape with `Nx.shape/1`:

```elixir
Nx.shape(tensor)
```

We can also create a new tensor with a new shape using `Nx.reshape/2`:

```elixir
Nx.reshape(tensor, {1, 4}, names: [:batches, :values])
```

This operation reuses all of the tensor data and simply
changes the metadata, so it has no notable cost.

The new tensor has the same type, but a new shape.
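
Since `Nx.reshape/2` only rewrites metadata, the underlying data is identical before and after — a small check using `Nx.to_flat_list/1`:

```elixir
t = Nx.tensor([[1, 2], [3, 4]])
r = Nx.reshape(t, {4})

# Same flattened data, different shape
Nx.to_flat_list(t) == Nx.to_flat_list(r)
#=> true
```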

### Floats and Complex numbers

Besides single-precision (32 bits), floats can have other kinds of precision, such as half-precision (16) or
@@ -177,13 +196,13 @@ Nx.tensor([0.0, 0.2, 0.4, 1.0], type: :f16)
```

```elixir
Nx.tensor([0.0, 0.2, 0.4, 1.0, type: :f64)
Nx.tensor([0.0, 0.2, 0.4, 1.0], type: :f64)
```

Brain floats are also supported:

```elixir
Nx.tensor([0.0, 0.2, 0.4, 1.0, type: :bf16)
Nx.tensor([0.0, 0.2, 0.4, 1.0], type: :bf16)
```

Certain backends and compilers support 8-bit floats. The precision
3 changes: 2 additions & 1 deletion nx/mix.exs
@@ -59,6 +59,7 @@ defmodule Nx.MixProject do
"CHANGELOG.md",
"guides/intro-to-nx.livemd",
"guides/getting_started/introduction.md",
"guides/getting_started/installation.md",
"guides/getting_started/quickstart.livemd",
"guides/advanced/vectorization.livemd",
"guides/advanced/aggregation.livemd",
@@ -114,7 +115,7 @@ ]
]
],
groups_for_extras: [
Getting_Started: ~r"^guides/getting_started/",
"Getting Started": ~r"^guides/getting_started/",
Exercises: ~r"^guides/exercises/",
Advanced: ~r"^guides/advanced/"
]