README.md: 4 additions & 2 deletions
@@ -34,8 +34,10 @@ and running the following command:
 ```
 make install_conda
 ```
-Optionally, if you work with multi-GPU environment and want to have Nvidia's collective communication calls
-[(NCCL)](https://developer.nvidia.com/nccl>) enabled, please visit the [installation guide](https://pylops.github.io/pylops-mpi/installation.html) for further detail
+Optionally, if you work in a multi-GPU environment and want to have Nvidia's collective communication calls (NCCL) enabled, install your environment with
+```
+make install_conda_nccl
+```
 
 ## Run Pylops-MPI
 
 Once you have installed the prerequisites and pylops-mpi, you can run pylops-mpi using the `mpiexec` command.
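As a hedged illustration of the `mpiexec` invocation mentioned above (the script name is a placeholder, not part of the project):

```shell
# Launch a PyLops-MPI script on 4 MPI processes.
# "run_inversion.py" is a hypothetical script name; substitute your own.
mpiexec -n 4 python run_inversion.py
```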
 Under the hood, PyLops-MPI uses both the MPI Communicator and the NCCL Communicator to manage distributed operations. Each GPU is logically bound to
-one MPI process. Generally speaking, the small operation like array-related shape and size remain using MPI while the collective calls
-like AllReduce will be carried through NCCL.
+one MPI process. In fact, minor communications, such as those dealing with array-related shapes and sizes, are still performed using MPI, while collective calls on arrays, like AllReduce, are carried out through NCCL.
 
 .. note::
 
    The CuPy and NCCL backend is in active development, with many examples not yet in the docs.
    You can find many `other examples <https://github.com/PyLops/pylops_notebooks/tree/master/developement-mpi/Cupy_MPI>`_ from the `PyLops Notebooks repository <https://github.com/PyLops/pylops_notebooks>`_.
 
 Supports for NCCL Backend
--------------------
-In the following, we provide a list of modules in which operates on :class:`pylops_mpi.DistributedArray`
-that can leverage NCCL backend
+----------------------------
+In the following, we provide a list of modules (i.e., operators and solvers) where we plan to support NCCL and the current status: