docs: add a Getting Started Section #1577
New issue
Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.
By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.
Already on GitHub? Sign in to your account
Merged
Changes from 3 commits (29 commits total):
- 20c9ac0 docs: add a Getting Started Section (TomasPegado)
- ba529c6 Update introduction.md (polvalente)
- 0a97e25 Update quickstart.livemd (polvalente)
- 22aab8f docs: add installation guide (TomasPegado)
- ffe8caf Fixed to_pointer/2 docs :kind -> :mode (#1578) (pejrich)
- f252073 Refactor EXLA NIFs to use Fine (#1581) (jonatanklosko)
- de77fd0 Change Nx.to_pointer/2 and Nx.from_pointer/5 to raise on errors (#1582) (jonatanklosko)
- 8955ff2 Lu matrix decomposition (#1587) (TomasPegado)
- 6f3b58e docs: autodiff docs (#1580) (polvalente)
- dacb96f chore: hide Nx.LinAlg.LU module (polvalente)
- b192288 Hide symbols from the NIF shared library (#1589) (jonatanklosko)
- 162d60b feat(exla): take advantage of the new LU impl (#1590) (polvalente)
- d5518fd feat: allow explicitly disabling CUDA step (#1588) (polvalente)
- 6cc84d1 fix(exla): batched eigh (#1591) (polvalente)
- 6c33108 fix(exla): respect device id when automatic transfers are disabled (#… (polvalente)
- b397939 test(exla): add more tests for LinAlg functions (#1594) (polvalente)
- 16f07f9 fix(exla): vectorized gather (#1595) (polvalente)
- a48f578 feat: Nx.Defn.Graph (#1544) (polvalente)
- a57c065 Fix(exla): triangular_solve with batched matrix input (#1596) (TomasPegado)
- f4cedfd Clarify composite docs (josevalim)
- 5e8abee docs: getting started section (TomasPegado)
- 2eef046 docs: removing intro-to-nx.livemd (TomasPegado)
- 2129801 docs: removing intro-to-nx.livemd references (TomasPegado)
- 7905414 docs: improving deftransform example (TomasPegado)
- 2a7df18 Merge branch 'elixir-nx:main' into docs_refact (TomasPegado)
- f4c2e66 chore: minor doc changes (polvalente)
- 28fb432 chore: more changes due to code review (polvalente)
- fb879b5 chore: even more changes due to code review (polvalente)
- b8f794e chore: final set of more changes due to code review (polvalente)
@@ -0,0 +1,82 @@
# What is Nx?

Nx is the numerical computing library for Elixir. Since Elixir's primary numerical datatypes and structures are not optimized for numerical programming, Nx is the fundamental package built to bridge this gap.

[Elixir Nx](https://github.com/elixir-nx/nx) smoothly integrates typed, multidimensional data called [tensors](introduction.html#what-are-tensors).
Nx has four primary capabilities:

- In Nx, tensors hold typed data in multiple, optionally named dimensions.
- Numerical definitions, known as `defn`, support custom code with
  tensor-aware operators and functions.
- [Automatic differentiation](https://arxiv.org/abs/1502.05767), also known as
  autograd or autodiff, supports common computational scenarios
  such as machine learning, simulations, curve fitting, and probabilistic models.
- Broadcasting applies element-by-element operations across tensors of
  different shapes. Most Nx operations perform automatic, implicit
  broadcasting. You can read more about broadcasting
  [here](intro-to-nx.html#broadcasts).

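As a small sketch of implicit broadcasting (assuming Nx is available, for example via `Mix.install` in a script or Livebook), adding a scalar to a matrix applies the operation to every element:

```elixir
# Assumes Nx is installed, e.g. Mix.install([{:nx, "~> 0.9"}]).
t = Nx.tensor([[1, 2], [3, 4]])

# The scalar 10 is broadcast across every element of the 2x2 tensor,
# yielding [[11, 12], [13, 14]].
Nx.add(t, 10)

# A row vector is broadcast against each row of the matrix,
# yielding [[10, 200], [30, 400]].
Nx.multiply(t, Nx.tensor([10, 100]))
```

No explicit loops are needed; the shapes are reconciled automatically following Nx's broadcasting rules.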
Nx tensors can hold unsigned integers (u2, u4, u8, u16, u32, u64),
signed integers (s2, s4, s8, s16, s32, s64),
floats (f8, f16, f32, f64), brain floats (bf16), and complex numbers (c64, c128).
Tensors support backends implemented outside of Elixir, such as Google's
Accelerated Linear Algebra (XLA) and PyTorch.

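For instance (a minimal sketch, assuming Nx is installed), the `:type` option of `Nx.tensor/2` selects one of the types above, and `Nx.type/1` reads it back as a `{kind, bits}` tuple:

```elixir
# Assumes Nx is installed, e.g. Mix.install([{:nx, "~> 0.9"}]).

# A 32-bit float tensor.
floats = Nx.tensor([1, 2, 3], type: :f32)
Nx.type(floats)
#=> {:f, 32}

# Unsigned 8-bit integers, a common choice for image data.
pixels = Nx.tensor([0, 127, 255], type: :u8)
Nx.type(pixels)
#=> {:u, 8}
```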
Numerical definitions provide compiler support for just-in-time compilation
targeting specialized processors, such as GPUs and TPUs, to speed up numeric
computation.

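A minimal `defn` sketch (the module and function names here are our own, and Nx is assumed to be installed): inside `defn`, ordinary operators work element-wise on whole tensors, and a compiler such as EXLA can JIT-compile the function for CPU or GPU:

```elixir
# Assumes Nx is installed, e.g. Mix.install([{:nx, "~> 0.9"}]).
defmodule MySketch do
  # Importing Nx.Defn gives us the defn macro.
  import Nx.Defn

  # Operators inside defn are tensor-aware: `t * factor`
  # multiplies every element, and Nx.sum/1 reduces to a scalar.
  defn scale_and_sum(t, factor) do
    Nx.sum(t * factor)
  end
end

MySketch.scale_and_sum(Nx.tensor([1, 2, 3]), 2)
# returns a scalar tensor containing 12
```

The same function runs unchanged under different compilers; for example, configuring EXLA as the default `defn` compiler would JIT this code through XLA.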
## What are Tensors?

In Nx, we express multi-dimensional data using typed tensors. Simply put,
a tensor is a multi-dimensional array with a predetermined shape and
type. To interact with them, Nx relies on tensor-aware operators rather
than `Enum.map/2` and `Enum.reduce/3`.

Tensors let us work with a central theme in numerical computing: systems
of equations, which are often expressed and solved with multidimensional
arrays.

For example, this is a two-dimensional array:

$$
\begin{bmatrix}
1 & 2 \\\\
3 & 4
\end{bmatrix}
$$

As Elixir programmers, we can typically express a similar data structure using
a list of lists, like this:

```elixir
[
  [1, 2],
  [3, 4]
]
```

This data structure works fine within many functional programming
algorithms, but it breaks down with deep nesting and random access.

On top of that, Elixir's numeric types lack optimization for many numerical
applications. They work fine when programs need hundreds or even thousands
of calculations, but they tend to break down in traditional STEM applications
where a typical problem needs millions of calculations.

To solve this, we can use Nx tensors, for example:

```elixir
Nx.tensor([[1, 2], [3, 4]])
```

which produces:

```
#Nx.Tensor<
  s32[2][2]
  [
    [1, 2],
    [3, 4]
  ]
>
```

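Once data lives in a tensor, tensor-aware functions replace `Enum`-style traversal. A small sketch of a few such functions (assuming Nx is installed):

```elixir
# Assumes Nx is installed, e.g. Mix.install([{:nx, "~> 0.9"}]).
t = Nx.tensor([[1, 2], [3, 4]])

Nx.shape(t)      #=> {2, 2}
Nx.sum(t)        # scalar tensor containing 10
Nx.mean(t)       # scalar tensor containing 2.5
Nx.transpose(t)  # swaps the two axes
```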
To learn Nx, we'll get to know tensors first. The following overview touches
on their major features. The advanced section of the documentation takes a
deep dive into working with tensors, autodiff, and backends.
@josevalim This is the start of our revamped docs.
We're taking the current getting started guide, splitting it up, and going into more detail.
I'm thinking we should merge these onto a `new-docs` branch and only merge that onto main after the getting started section is fully done. WDYT?
I think it is fine to push to main directly, given the plan is for continuous work on it, right?
We'll make it so that this PR is merged once it fully replaces the previous Introduction to Nx guide.