Commit daac6a6

authored
doc: Rephrase some text referring to DArray (JuliaLang#53244)
Some parts of `distributed-computing.md` were somewhat unclear/confusing after `DArray` was moved to `DistributedArrays.jl`.
1 parent 36b7d3b commit daac6a6

File tree

1 file changed (+17 −12 lines)

doc/src/manual/distributed-computing.md

Lines changed: 17 additions & 12 deletions
@@ -813,16 +813,18 @@ will always operate on copies of arguments.
 
 ## [Shared Arrays](@id man-shared-arrays)
 
-Shared Arrays use system shared memory to map the same array across many processes. While there
-are some similarities to a [`DArray`](https://github.com/JuliaParallel/DistributedArrays.jl), the
-behavior of a [`SharedArray`](@ref) is quite different. In a [`DArray`](https://github.com/JuliaParallel/DistributedArrays.jl),
-each process has local access to just a chunk of the data, and no two processes share the same
-chunk; in contrast, in a [`SharedArray`](@ref) each "participating" process has access to the
-entire array. A [`SharedArray`](@ref) is a good choice when you want to have a large amount of
-data jointly accessible to two or more processes on the same machine.
-
-Shared Array support is available via module `SharedArrays` which must be explicitly loaded on
-all participating workers.
+Shared Arrays use system shared memory to map the same array across many processes. A
+[`SharedArray`](@ref) is a good choice when you want to have a large amount of data jointly
+accessible to two or more processes on the same machine. Shared Array support is available via the
+module `SharedArrays`, which must be explicitly loaded on all participating workers.
+
+A complementary data structure is provided by the external package
+[`DistributedArrays.jl`](https://github.com/JuliaParallel/DistributedArrays.jl) in the form of a
+`DArray`. While there are some similarities to a [`SharedArray`](@ref), the behavior of a
+[`DArray`](https://github.com/JuliaParallel/DistributedArrays.jl) is quite different. In a
+[`SharedArray`](@ref), each "participating" process has access to the entire array; in contrast, in
+a [`DArray`](https://github.com/JuliaParallel/DistributedArrays.jl), each process has local access
+to just a chunk of the data, and no two processes share the same chunk.
 
 [`SharedArray`](@ref) indexing (assignment and accessing values) works just as with regular arrays,
 and is efficient because the underlying memory is available to the local process. Therefore,
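The semantics described in this hunk can be illustrated with a minimal `SharedArray` sketch (assumptions: local workers are added with `addprocs`, so all processes share one machine's memory; the worker count and array size are arbitrary):

```julia
using Distributed
addprocs(3)  # workers must be on the same machine to share memory

# As the manual notes, SharedArrays must be loaded on all participating workers.
@everywhere using SharedArrays

# Every participating process sees the *entire* array, unlike a DArray,
# where each process would own only a local chunk.
S = SharedArray{Int}(10)

# Each worker records its own id; any worker could write to any index.
@sync @distributed for i in 1:10
    S[i] = myid()
end

# Every entry was written by some worker process.
println(S)
```

Running this prints a length-10 vector whose entries are worker ids, showing that writes from different processes all landed in the same underlying memory.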
@@ -1326,8 +1328,11 @@ in future releases.
 ## Noteworthy external packages
 
 Outside of Julia parallelism there are plenty of external packages that should be mentioned.
-For example [MPI.jl](https://github.com/JuliaParallel/MPI.jl) is a Julia wrapper for the `MPI` protocol, [Dagger.jl](https://github.com/JuliaParallel/Dagger.jl) provides functionality similar to Python's [Dask](https://dask.org/), and
-[DistributedArrays.jl](https://github.com/JuliaParallel/Distributedarrays.jl) provides array operations distributed across workers, as presented in [Shared Arrays](@ref).
+For example, [`MPI.jl`](https://github.com/JuliaParallel/MPI.jl) is a Julia wrapper for the `MPI`
+protocol, [`Dagger.jl`](https://github.com/JuliaParallel/Dagger.jl) provides functionality similar to
+Python's [Dask](https://dask.org/), and
+[`DistributedArrays.jl`](https://github.com/JuliaParallel/Distributedarrays.jl) provides array
+operations distributed across workers, as [outlined above](@ref man-shared-arrays).
 
 A mention must be made of Julia's GPU programming ecosystem, which includes: