docs: add a Getting Started Section #1577
Conversation
```diff
@@ -0,0 +1,82 @@
+# What is Nx?
```
@josevalim This is the start of our revamped docs.
We're taking the current getting started guide, splitting it up, and going into more detail.
I'm thinking we should merge these onto a new-docs branch and only merge that onto main once the getting started guide is fully done.
WDYT?
I think it is fine to push to main directly, given the plan is for continuous work on it, right?
We'll make it so that this PR is merged once it fully replaces the previous "Introduction to Nx" guide.
> There are several ways to install Nx (Numerical Elixir), depending on your project type and needs.
> ## Using Mix in a standardElixir Project

Suggested change:

```diff
- ## Using Mix in a standardElixir Project
+ ## Using Nx in a Standard Elixir Project
```
> ```sh
> mix deps.get
> ```
>
> ```elixir
> defp deps do
>   [
>     {:nx, github: "elixir-nx/nx", branch: "main"}
>   ]
> end
> ```
Suggested change:

```diff
- {:nx, github: "elixir-nx/nx", branch: "main"}
+ {:nx, github: "elixir-nx/nx", branch: "main", sparse: "nx"}
```
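For readers following this thread: with the suggested `sparse:` option applied, the full `deps/0` would look roughly like the sketch below. The `sparse: "nx"` flag makes Mix check out only the `nx/` subdirectory of the `elixir-nx/nx` monorepo rather than the whole repository.

```elixir
defp deps do
  [
    # Track Nx from the main branch of the elixir-nx/nx monorepo;
    # sparse: "nx" fetches only the nx/ subdirectory.
    {:nx, github: "elixir-nx/nx", branch: "main", sparse: "nx"}
  ]
end
```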
> ```elixir
> IO.inspect(tensor)
> ```
> ## Installing Nx in a Standalone Script (Without a Mix Project)
> If you don’t have a Mix project and just want to run a standalone script, use Mix.install/1 to dynamically fetch and install Nx.

Suggested change:

```diff
- If you don’t have a Mix project and just want to run a standalone script, use Mix.install/1 to dynamically fetch and install Nx.
+ If you don’t have a Mix project and just want to run a standalone script, use `Mix.install/1` to dynamically fetch and install Nx.
```
> ```sh
> elixir my_script.exs
> ```
> ```elixir
> Mix.install([
>   {:nx, github: "elixir-nx/nx", branch: "main"}
> ])
> ```
Suggested change:

```diff
- {:nx, github: "elixir-nx/nx", branch: "main"}
+ {:nx, github: "elixir-nx/nx", branch: "main", sparse: "nx"}
```
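For what it's worth, a complete `my_script.exs` along these lines might look like the sketch below (the file name and the tensor operations are illustrative; a Hex release such as `{:nx, "~> 0.5"}` would work equally well in place of the git dependency):

```elixir
# my_script.exs — run with: elixir my_script.exs
Mix.install([
  {:nx, github: "elixir-nx/nx", branch: "main", sparse: "nx"}
])

# Build a 2x2 tensor and reduce it to a scalar sum.
tensor = Nx.tensor([[1, 2], [3, 4]])
IO.inspect(Nx.sum(tensor))
```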
> Best for: Running Nx on GPUs or TPUs using Google’s XLA compiler.
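As context for the "Best for" note above (not part of the diff itself): once EXLA is a dependency, pointing Nx at it is a one-line backend switch. A sketch, assuming `{:exla, "~> 0.5"}` is already listed in `deps`:

```elixir
# Make EXLA (Google XLA) the default backend for tensor operations
# in this process. Assumes {:exla, "~> 0.5"} is in mix.exs deps.
Nx.default_backend(EXLA.Backend)

t = Nx.iota({3, 3})        # tensor allocated on the EXLA backend
Nx.sum(Nx.multiply(t, t))  # element-wise square, then reduce
```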
> ## Installing Nx with Torchx for PyTorch Acceleration
> To run Nx operations on PyTorch’s backend (LibTorch):
>
> 1. Modify mix.exs:
> ```elixir
> defp deps do
>   [
>     {:nx, "~> 0.5"},
>     {:torchx, "~> 0.5"} # PyTorch Backend
>   ]
> end
> ```
Let's not mention Torchx. We could maybe reference EMLX, but it's not even released yet, so let's leave this for later.
> 2. Fetch dependencies:
> ```sh
> mix deps.get
> ```
> 3. Run with Torchx enabled:
>
> ```elixir
> Torchx.set_preferred_backend()
> ```
> Best for: Deep learning applications with PyTorch acceleration.
>
> ## Installing Nx with OpenBLAS for CPU Optimization
> To optimize CPU performance with OpenBLAS:
> 1. Install OpenBLAS (libopenblas):
>    - Ubuntu/Debian:
>      ```sh
>      sudo apt install libopenblas-dev
>      ```
>    - macOS (using Homebrew):
>      ```sh
>      brew install openblas
>      ```
> 2. Modify mix.exs:
>
> ```elixir
> defp deps do
>   [
>     {:nx, "~> 0.5"},
>     {:openblas, "~> 0.5"} # CPU-optimized BLAS backend
>   ]
> end
> ```
> 3. Fetch dependencies:
>
> ```sh
> mix deps.get
> ```
> Best for: Optimizing CPU-based tensor computations.
> ```elixir
> deftransform compute_tensor_from_list(list) do
>   tensor = Nx.tensor(list)
>   double_tensor(tensor)
> end
> ```
Let's add an example that does shape validation and manipulation, such as adding a new axis and changing things around
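Something along these lines could work as a starting point (a sketch, not final wording for the guide; the module name and error message are illustrative). It validates that the input produces a rank-2 tensor, then adds a leading axis and swaps the remaining two:

```elixir
defmodule MyTensorOps do
  import Nx.Defn

  # Build a tensor from a list, validate its rank, then manipulate
  # its shape: add a new leading axis and swap the last two axes.
  deftransform validate_and_reshape(list) do
    tensor = Nx.tensor(list)

    case Nx.shape(tensor) do
      {_rows, _cols} ->
        tensor
        |> Nx.new_axis(0)                 # {r, c} -> {1, r, c}
        |> Nx.transpose(axes: [0, 2, 1])  # {1, r, c} -> {1, c, r}

      shape ->
        raise ArgumentError,
              "expected a 2D list, got tensor of shape #{inspect(shape)}"
    end
  end
end
```

Because `deftransform` runs ordinary Elixir, the `case`/`raise` validation happens at tensor-construction time, while the `Nx.new_axis/2` and `Nx.transpose/2` calls demonstrate shape manipulation.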
Co-authored-by: José Valim <jose.valim@dashbit.co>
Co-authored-by: Paulo Valente <16843419+polvalente@users.noreply.github.com>
polvalente left a comment
Congrats! The new documentation covers quite a bit more surface area, and will probably allow for more growth in the future!
This pull request enhances the documentation by introducing a "Getting Started" section to help new users quickly understand and begin working with the Nx library in Elixir.
Key Additions: