Merged
Commits
29 commits
20c9ac0
docs: add a Getting Started Section
TomasPegado Feb 3, 2025
ba529c6
Update introduction.md
polvalente Feb 4, 2025
0a97e25
Update quickstart.livemd
polvalente Feb 4, 2025
22aab8f
docs: add installation guide
TomasPegado Feb 10, 2025
ffe8caf
Fixed to_pointer/2 docs :kind -> :mode (#1578)
pejrich Feb 4, 2025
f252073
Refactor EXLA NIFs to use Fine (#1581)
jonatanklosko Feb 19, 2025
de77fd0
Change Nx.to_pointer/2 and Nx.from_pointer/5 to raise on errors (#1582)
jonatanklosko Feb 19, 2025
8955ff2
Lu matrix decomposition (#1587)
TomasPegado Mar 5, 2025
6f3b58e
docs: autodiff docs (#1580)
polvalente Mar 5, 2025
dacb96f
chore: hide Nx.LinAlg.LU module
polvalente Mar 5, 2025
b192288
Hide symbols from the NIF shared library (#1589)
jonatanklosko Mar 6, 2025
162d60b
feat(exla): take advantage of the new LU impl (#1590)
polvalente Mar 6, 2025
d5518fd
feat: allow explicitly disabling CUDA step (#1588)
polvalente Mar 6, 2025
6cc84d1
fix(exla): batched eigh (#1591)
polvalente Mar 12, 2025
6c33108
fix(exla): respect device id when automatic transfers are disabled (#…
polvalente Mar 13, 2025
b397939
test(exla): add more tests for LinAlg functions (#1594)
polvalente Mar 17, 2025
16f07f9
fix(exla): vectorized gather (#1595)
polvalente Mar 17, 2025
a48f578
feat: Nx.Defn.Graph (#1544)
polvalente Mar 18, 2025
a57c065
Fix(exla): triangular_solve with batched matrix input (#1596)
TomasPegado Mar 19, 2025
f4cedfd
Clarify composite docs
josevalim Mar 20, 2025
5e8abee
docs: getting started section
TomasPegado Mar 26, 2025
2eef046
docs: removing intro-to-nx.livemd
TomasPegado Mar 29, 2025
2129801
docs: removing intro-to-nx.livemd references
TomasPegado Mar 29, 2025
7905414
docs: improving deftransform example
TomasPegado Mar 29, 2025
2a7df18
Merge branch 'elixir-nx:main' into docs_refact
TomasPegado Mar 31, 2025
f4c2e66
chore: minor doc changes
polvalente Apr 2, 2025
28fb432
chore: more changes due to code review
polvalente Apr 2, 2025
fb879b5
chore: even more changes due to code review
polvalente Apr 2, 2025
b8f794e
chore: final set of more changes due to code review
polvalente Apr 2, 2025
82 changes: 82 additions & 0 deletions nx/guides/getting_started/introduction.md
@@ -0,0 +1,82 @@
# What is Nx?
Contributor:
@josevalim This is the start of our revamped docs.

We're taking the current getting started guide, splitting it up, and going into more detail.
I'm thinking we should merge these onto a new-docs branch and only merge that onto main after the getting started is fully done.

WDYT?

Contributor:

I think it is fine to push to main directly, given the plan is for continuous work on it, right?

Contributor:

We'll make it so that this PR is merged when it fully replaces the previous introduction to nx guide.


Nx is the numerical computing library for Elixir. Because Elixir's built-in numerical datatypes and structures are not optimized for numerical programming, Nx is the foundational package that bridges this gap.

[Elixir Nx](https://github.com/elixir-nx/nx) smoothly integrates typed, multidimensional data called [tensors](introduction.html#what-are-tensors).
Nx has four primary capabilities:

- In Nx, tensors hold typed data in multiple, optionally named dimensions.
- Numerical definitions, known as `defn`, support custom code with
tensor-aware operators and functions.
- [Automatic differentiation](https://arxiv.org/abs/1502.05767), also known as
autograd or autodiff, supports common computational scenarios
such as machine learning, simulations, curve fitting, and probabilistic models.
- Broadcasting, which implicitly expands tensor shapes so that element-by-element
  operations can be applied to tensors of different shapes. Most Nx operations
  broadcast automatically. You can read more about broadcasting
  [here](intro-to-nx.html#broadcasts).
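
Broadcasting can be sketched in a couple of lines. This is an illustrative example, assuming Nx (here pinned to `~> 0.9` via `Mix.install/1`, as you would in a Livebook or script) is available:

```elixir
Mix.install([{:nx, "~> 0.9"}])

t = Nx.tensor([[1, 2], [3, 4]])

# The scalar 10 is implicitly broadcast across every element of the
# 2x2 tensor, so no explicit loop or shape manipulation is needed.
Nx.add(t, 10)
# returns a tensor with values [[11, 12], [13, 14]]
```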

Nx tensors can hold unsigned integers (u2, u4, u8, u16, u32, u64),
signed integers (s2, s4, s8, s16, s32, s64),
floats (f8, f16, f32, f64), brain floats (bf16), and complex (c64, c128).
Tensors support backends implemented outside of Elixir, such as Google's
Accelerated Linear Algebra (XLA) and PyTorch.
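
For instance (a sketch, assuming Nx is installed), the `:type` option selects one of these representations explicitly:

```elixir
Mix.install([{:nx, "~> 0.9"}])

# Without :type, the type is inferred from the data.
Nx.tensor([1, 2, 3], type: :f32)  # an f32[3] tensor
Nx.tensor([1, 2, 3], type: :u8)   # a u8[3] tensor
```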

Numerical definitions provide compiler support for just-in-time compilation
targeting specialized processors, such as GPUs and TPUs, to speed up numeric
computation.
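
A minimal numerical definition might look like the following sketch (the module and function names are illustrative, not part of Nx):

```elixir
Mix.install([{:nx, "~> 0.9"}])

defmodule MyDefn do
  import Nx.Defn

  # defn builds a computation graph from the whole function body, which
  # a compiler such as EXLA can then JIT-compile for CPU, GPU, or TPU.
  defn softplus(x) do
    Nx.log(Nx.add(1, Nx.exp(x)))
  end
end

MyDefn.softplus(Nx.tensor(0.0))
# returns a scalar tensor close to log(2), i.e. about 0.6931
```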

## What are Tensors?

In Nx, we express multi-dimensional data using typed tensors. Simply put,
a tensor is a multi-dimensional array with a predetermined shape and
type. To interact with them, Nx relies on tensor-aware operators rather
than `Enum.map/2` and `Enum.reduce/3`.
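
To make the contrast concrete, here is a sketch (assuming Nx is installed) of doubling every element with `Enum` versus a tensor-aware operator:

```elixir
Mix.install([{:nx, "~> 0.9"}])

list = [[1, 2], [3, 4]]

# Plain Elixir: walk the nested lists by hand.
Enum.map(list, fn row -> Enum.map(row, &(&1 * 2)) end)
# returns [[2, 4], [6, 8]]

# Nx: a single tensor-aware operation over the whole array.
Nx.multiply(Nx.tensor(list), 2)
# returns a tensor with values [[2, 4], [6, 8]]
```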

Tensors give us a natural way to work with the central theme of numerical computing,
systems of equations, which are often expressed and solved with multidimensional arrays.

For example, this is a two dimensional array:

$$
\begin{bmatrix}
1 & 2 \\\\
3 & 4
\end{bmatrix}
$$

As Elixir programmers, we would typically express a similar data structure using a list of lists,
like this:

```elixir
[
  [1, 2],
  [3, 4]
]
```

This data structure works fine within many functional programming
algorithms, but breaks down with deep nesting and random access.

On top of that, Elixir's numeric types lack optimization for many numerical
applications. They work fine when programs need hundreds or even thousands of
calculations, but they tend to break down in traditional STEM applications,
where a typical problem needs millions of calculations.

To solve this, we can use Nx tensors, for example:

```elixir
Nx.tensor([[1, 2], [3, 4]])
```

Output:

```
#Nx.Tensor<
  s32[2][2]
  [
    [1, 2],
    [3, 4]
  ]
>
```
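
A few inspection functions show a tensor's shape, type, and (optional) dimension names. This is a sketch assuming Nx `~> 0.9`, where the default integer type is s32:

```elixir
Mix.install([{:nx, "~> 0.9"}])

# The :names option labels dimensions, the "optionally named
# dimensions" feature mentioned earlier.
t = Nx.tensor([[1, 2], [3, 4]], names: [:rows, :cols])

Nx.shape(t)  # {2, 2}
Nx.type(t)   # {:s, 32} on recent Nx versions
Nx.names(t)  # [:rows, :cols]
```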

To learn Nx, we'll get to know tensors first. The following overview touches
on the major features. The advanced section of the documentation takes a deep dive
into working with tensors, autodiff, and backends.