# docs: add a Getting Started Section #1577
Merged

## Commits (29)
- `20c9ac0` docs: add a Getting Started Section (TomasPegado)
- `ba529c6` Update introduction.md (polvalente)
- `0a97e25` Update quickstart.livemd (polvalente)
- `22aab8f` docs: add installation guide (TomasPegado)
- `ffe8caf` Fixed to_pointer/2 docs :kind -> :mode (#1578) (pejrich)
- `f252073` Refactor EXLA NIFs to use Fine (#1581) (jonatanklosko)
- `de77fd0` Change Nx.to_pointer/2 and Nx.from_pointer/5 to raise on errors (#1582) (jonatanklosko)
- `8955ff2` Lu matrix decomposition (#1587) (TomasPegado)
- `6f3b58e` docs: autodiff docs (#1580) (polvalente)
- `dacb96f` chore: hide Nx.LinAlg.LU module (polvalente)
- `b192288` Hide symbols from the NIF shared library (#1589) (jonatanklosko)
- `162d60b` feat(exla): take advantage of the new LU impl (#1590) (polvalente)
- `d5518fd` feat: allow explicitly disabling CUDA step (#1588) (polvalente)
- `6cc84d1` fix(exla): batched eigh (#1591) (polvalente)
- `6c33108` fix(exla): respect device id when automatic transfers are disabled (#… (polvalente)
- `b397939` test(exla): add more tests for LinAlg functions (#1594) (polvalente)
- `16f07f9` fix(exla): vectorized gather (#1595) (polvalente)
- `a48f578` feat: Nx.Defn.Graph (#1544) (polvalente)
- `a57c065` Fix(exla): triangular_solve with batched matrix input (#1596) (TomasPegado)
- `f4cedfd` Clarify composite docs (josevalim)
- `5e8abee` docs: getting started section (TomasPegado)
- `2eef046` docs: removing intro-to-nx.livemd (TomasPegado)
- `2129801` docs: removing intro-to-nx.livemd references (TomasPegado)
- `7905414` docs: improving deftransform example (TomasPegado)
- `2a7df18` Merge branch 'elixir-nx:main' into docs_refact (TomasPegado)
- `f4c2e66` chore: minor doc changes (polvalente)
- `28fb432` chore: more changes due to code review (polvalente)
- `fb879b5` chore: even more changes due to code review (polvalente)
- `b8f794e` chore: final set of more changes due to code review (polvalente)
# Broadcasting

The dimensions of tensors in an operation don't always match.
For example, you might want to subtract `1` from every
element of a `{2, 2}`-shaped tensor, like this:

$$
\begin{bmatrix}
1 & 2 \\\\
3 & 4
\end{bmatrix} - 1 =
\begin{bmatrix}
0 & 1 \\\\
2 & 3
\end{bmatrix}
$$

Mathematically, this is the same as:

$$
\begin{bmatrix}
1 & 2 \\\\
3 & 4
\end{bmatrix} -
\begin{bmatrix}
1 & 1 \\\\
1 & 1
\end{bmatrix} =
\begin{bmatrix}
0 & 1 \\\\
2 & 3
\end{bmatrix}
$$

This means we need a way to convert `1` to a `{2, 2}`-shaped tensor.
`Nx.broadcast/2` solves that problem. This function takes
a tensor or a scalar and a shape:

```elixir
Mix.install([
  {:nx, "~> 0.9"}
])

Nx.broadcast(1, {2, 2})
```

This call takes the scalar `1` and translates it
to a compatible shape by copying it. Sometimes, it's easier
to provide a tensor as the second argument, and let `broadcast/2`
extract its shape:

```elixir
tensor = Nx.tensor([[1, 2], [3, 4]])
Nx.broadcast(1, tensor)
```

The code broadcasts `1` to the shape of `tensor`. In many operators
and functions, the broadcast happens automatically:

```elixir
Nx.subtract(tensor, 1)
```

This result is possible because Nx broadcasts _both tensors_
in `subtract/2` to compatible shapes. That means you can provide
scalar values as either argument:

```elixir
Nx.subtract(10, tensor)
```

Or subtract a row or column. Mathematically, it would look like this:

$$
\begin{bmatrix}
1 & 2 \\\\
3 & 4
\end{bmatrix} -
\begin{bmatrix}
1 & 2
\end{bmatrix} =
\begin{bmatrix}
0 & 0 \\\\
2 & 2
\end{bmatrix}
$$

which is the same as this:

$$
\begin{bmatrix}
1 & 2 \\\\
3 & 4
\end{bmatrix} -
\begin{bmatrix}
1 & 2 \\\\
1 & 2
\end{bmatrix} =
\begin{bmatrix}
0 & 0 \\\\
2 & 2
\end{bmatrix}
$$

This rewrite happens in Nx as well, through a broadcast operation. We want to
broadcast the tensor `[1, 2]` to match the `{2, 2}` shape:

```elixir
Nx.broadcast(Nx.tensor([1, 2]), {2, 2})
```

The `subtract` function in `Nx` takes care of that broadcast
implicitly, as discussed above:

```elixir
Nx.subtract(tensor, Nx.tensor([1, 2]))
```

The broadcast worked as expected, copying the `[1, 2]` row
enough times to fill a `{2, 2}`-shaped tensor. Any dimension
of size `1` will be copied as many times as needed to fill
the broadcast shape:

```elixir
[[1], [2]] |> Nx.tensor() |> Nx.broadcast({1, 2, 2})
```

```elixir
[[[1, 2, 3]]]
|> Nx.tensor()
|> Nx.broadcast({4, 2, 3})
```

Both of these examples copy parts of the tensor enough
times to fill out the broadcast shape. You can check out the
Nx broadcasting documentation for more details:

<!-- livebook:{"disable_formatting":true} -->

```elixir
h Nx.broadcast
```
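
Broadcasting is only possible when the shapes are compatible: each dimension must either match or be `1`. As a small sketch (assuming current Nx versions raise an `ArgumentError` here), an attempt to broadcast incompatible shapes fails:

```elixir
Mix.install([{:nx, "~> 0.9"}])

# A {3}-shaped tensor cannot be stretched into {2, 2}:
# neither dimension matches and none is 1, so Nx raises.
try do
  Nx.broadcast(Nx.tensor([1, 2, 3]), {2, 2})
rescue
  e in ArgumentError -> IO.puts(Exception.message(e))
end
```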

Much of the time, you won't have to broadcast yourself: many of
the tensor-aware functions and operators in `Nx` broadcast their
arguments implicitly.
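
As a sketch of that implicit behavior (assuming the usual Nx broadcasting rules, where size-1 dimensions are expanded to match), a column and a row combine into a full matrix:

```elixir
Mix.install([{:nx, "~> 0.9"}])

column = Nx.tensor([[1], [2]])  # shape {2, 1}
row = Nx.tensor([10, 20])       # shape {2}

# Both operands are broadcast to {2, 2} before the addition:
# the column is repeated across, the row repeated down.
Nx.add(column, row)
# => [[11, 21], [12, 22]]
```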

Throughout this section, we have been invoking `Nx.subtract/2`, and
our code would be more expressive if we could use its equivalent
mathematical operator. Fortunately, Nx provides a way. Next, we'll
dive into numerical definitions using `defn`.
# Installation

The only prerequisite for installing Nx is Elixir itself. If you don't have Elixir installed
on your machine, you can visit the [installation page](https://elixir-lang.org/install.html).

There are several ways to install Nx (Numerical Elixir), depending on your project type and needs.

## Using Mix in a Standard Elixir Project

If you are working inside a Mix project, the recommended way to install Nx is by adding it to your `mix.exs` dependencies:

1. Open `mix.exs` and modify the `deps` function:

```elixir
defp deps do
  [
    {:nx, "~> 0.9"} # Install the latest stable version
  ]
end
```

2. Fetch the dependencies by running the following in your terminal:

```sh
mix deps.get
```
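
To confirm the dependency is available, you can start an IEx session with `iex -S mix` and build a tensor. This is just a quick smoke test, not part of the installation itself:

```elixir
# Build a small tensor and reduce it; if Nx was fetched
# correctly, this returns a scalar tensor holding 10.
tensor = Nx.tensor([[1, 2], [3, 4]])
Nx.sum(tensor)
```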

## Installing Nx from GitHub (Latest Development Version)

If you need the latest, unreleased features, install Nx directly from the GitHub repository.

1. Modify `mix.exs`:

```elixir
defp deps do
  [
    {:nx, github: "elixir-nx/nx", branch: "main", sparse: "nx"}
  ]
end
```

2. Fetch dependencies:

```sh
mix deps.get
```

## Installing Nx in a Standalone Script (Without a Mix Project)

If you don't have a Mix project and just want to run a standalone script, use `Mix.install/1` to dynamically fetch and install Nx:

```elixir
Mix.install([:nx])

tensor = Nx.tensor([1, 2, 3])
IO.inspect(tensor)
```

Run the script with:

```sh
elixir my_script.exs
```

Best for: Quick experiments, small scripts, or one-off computations.

## Installing the Latest Nx from GitHub in a Standalone Script

To use the latest development version in a script (without a Mix project):

```elixir
Mix.install([
  {:nx, github: "elixir-nx/nx", branch: "main", sparse: "nx"}
])

tensor = Nx.tensor([1, 2, 3])
IO.inspect(tensor)
```

Run:

```sh
elixir my_script.exs
```

Best for: Trying new features from Nx without creating a full project.

## Installing Nx with EXLA for GPU Acceleration

To enable GPU/TPU acceleration with Google's XLA backend, install Nx along with EXLA:

1. Modify `mix.exs`:

```elixir
defp deps do
  [
    {:nx, "~> 0.9"},
    {:exla, "~> 0.9"} # EXLA (Google XLA backend)
  ]
end
```

2. Fetch dependencies:

```sh
mix deps.get
```

3. Configure Nx to use EXLA:

```elixir
Nx.default_backend(EXLA.Backend)
Nx.Defn.default_options(compiler: EXLA)
```

Best for: Running Nx on GPUs or TPUs using Google's XLA compiler.
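
With that configuration in place, numerical definitions written with `defn` are compiled through EXLA. A minimal sketch, assuming EXLA installed as above (the `MyMath` module name and `softplus` function are illustrative, not part of Nx):

```elixir
defmodule MyMath do
  import Nx.Defn

  # defn functions are compiled by the configured Nx.Defn
  # compiler (EXLA here); operators inside defn are tensor-aware.
  defn softplus(x) do
    Nx.log(1 + Nx.exp(x))
  end
end

Nx.Defn.default_options(compiler: EXLA)
MyMath.softplus(Nx.tensor([0.0, 1.0]))
```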

## Installing Nx with Torchx for PyTorch Acceleration

To run Nx operations on PyTorch's backend (LibTorch):

1. Modify `mix.exs`:

```elixir
defp deps do
  [
    {:nx, "~> 0.9"},
    {:torchx, "~> 0.9"} # PyTorch backend
  ]
end
```

2. Fetch dependencies:

```sh
mix deps.get
```

3. Run with Torchx enabled:

```elixir
Nx.default_backend(Torchx.Backend)
```

Best for: Deep learning applications with PyTorch acceleration.
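
Once the default backend is set, ordinary Nx calls allocate and compute through the configured backend. A brief sketch, assuming Torchx compiled successfully; the same code runs unchanged on the default binary backend:

```elixir
# After Nx.default_backend(Torchx.Backend), tensors created
# from here on are backed by LibTorch storage.
t = Nx.tensor([1.0, 2.0, 3.0])
Nx.multiply(t, 2)
```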