Commit f79ca95

Update README.md

Additional improvements to code indentation

1 parent a47be19

File tree

1 file changed: +13 −13 lines changed

README.md

Lines changed: 13 additions & 13 deletions
@@ -34,11 +34,11 @@ Common kinds of arrays can be constructed with functions beginning with
 `d`:
 
 ```julia
-dzeros(100,100,10)
-dones(100,100,10)
-drand(100,100,10)
-drandn(100,100,10)
-dfill(x,100,100,10)
+dzeros(100,100,10)
+dones(100,100,10)
+drand(100,100,10)
+drandn(100,100,10)
+dfill(x,100,100,10)
 ```
 
 In the last case, each element will be initialized to the specified
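
The constructors touched by this hunk can be illustrated with a minimal sketch (assumptions: a Julia session with worker processes added and the DistributedArrays package installed; `0.5` stands in for the fill value `x`):

```julia
using Distributed
addprocs(4)                        # start 4 worker processes
@everywhere using DistributedArrays

A = dzeros(100, 100, 10)           # distributed array of zeros
B = dones(100, 100, 10)            # distributed array of ones
C = drand(100, 100, 10)            # uniform random entries
D = drandn(100, 100, 10)           # standard-normal entries
E = dfill(0.5, 100, 100, 10)       # every element initialized to 0.5
size(A)                            # global size: (100, 100, 10)
```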
@@ -47,7 +47,7 @@ For more control, you can specify which processes to use, and how the
 data should be distributed:
 
 ```julia
-dzeros((100,100), workers()[1:4], [1,4])
+dzeros((100,100), workers()[1:4], [1,4])
 ```
 
 The second argument specifies that the array should be created on the first
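
The call in this hunk can be read as the following sketch (hedged: it assumes at least four workers have been started):

```julia
using Distributed
addprocs(4)
@everywhere using DistributedArrays

# Create a 100x100 distributed zero matrix on the first four workers,
# split into 1 piece along the first dimension and 4 along the second:
# each worker then holds a 100x25 column block.
d = dzeros((100, 100), workers()[1:4], [1, 4])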
@@ -89,7 +89,7 @@ Constructing Distributed Arrays
 The primitive `DArray` constructor has the following somewhat elaborate signature:
 
 ```julia
-DArray(init, dims[, procs, dist])
+DArray(init, dims[, procs, dist])
 ```
 
 `init` is a function that accepts a tuple of index ranges. This function should
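
The primitive signature in this hunk can be illustrated with a small sketch (assumptions: workers added, DistributedArrays loaded everywhere; `procs` and `dist` are left to their defaults):

```julia
using Distributed
addprocs(4)
@everywhere using DistributedArrays

# init maps a tuple of index ranges I to the local block; here each
# block is filled with the first global row index that block owns.
d = DArray(I -> fill(first(I[1]), map(length, I)), (100, 100))
```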
@@ -106,7 +106,7 @@ As an example, here is how to turn the local array constructor `fill`
 into a distributed array constructor:
 
 ```julia
-dfill(v, args...) = DArray(I->fill(v, map(length,I)), args...)
+dfill(v, args...) = DArray(I->fill(v, map(length,I)), args...)
 ```
 
 In this case the `init` function only needs to call `fill` with the
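
Following the same pattern as `dfill`, other local constructors can be lifted to distributed ones. A hypothetical `dtrues`, built on the primitive `DArray` constructor, is sketched below (this name is an illustration, not part of the package API):

```julia
using Distributed
addprocs(4)
@everywhere using DistributedArrays

# init receives a tuple of index ranges and must return the local block;
# here the block is a local BitArray of trues with the matching shape.
dtrues(args...) = DArray(I -> trues(map(length, I)), args...)

t = dtrues(100, 100)   # distributed Bool array, all true
```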
@@ -268,7 +268,7 @@ SPMD, i.e., a Single Program Multiple Data mode is implemented by submodule `Dis
 
 The same block of code is executed concurrently on all workers using the `spmd` function.
 
-```julia
+```
 # define foo() on all workers
 @everywhere function foo(arg1, arg2)
     ....
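
A hedged sketch of how such a block is then driven with `spmd` (assuming the `SPMD` submodule named in this section of the README; `foo` and its arguments are placeholders):

```julia
using Distributed
addprocs(4)
@everywhere using DistributedArrays
@everywhere using DistributedArrays.SPMD

# define foo() on all workers; each worker runs the same body
@everywhere function foo(arg1, arg2)
    # ... worker-local computation over arg1 and arg2 ...
end

# execute foo concurrently on all workers
spmd(foo, 1, 2)
```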
@@ -314,7 +314,7 @@ Example
 
 This toy example exchanges data with each of its neighbors `n` times.
 
-```julia
+```
 using Distributed
 using DistributedArrays
 addprocs(8)
@@ -383,7 +383,7 @@ Nested `spmd` calls
 As `spmd` executes the specified function on all participating nodes, we need to be careful with nesting `spmd` calls.
 
 An example of an unsafe (wrong) way:
-```julia
+```
 function foo(.....)
     ......
     spmd(bar, ......)
@@ -401,7 +401,7 @@ spmd(foo,....)
 In the above example, `foo`, `bar` and `baz` are all functions wishing to leverage distributed computation. However, they themselves may currently be part of a `spmd` call. A safe way to handle such a scenario is to only drive parallel computation from the master process.
 
 The correct way (only have the driver process initiate `spmd` calls):
-```julia
+```
 function foo()
     ......
     myid()==1 && spmd(bar, ......)
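
The driver-only pattern shown in this hunk can be fleshed out as a sketch (hedged: `foo`, `bar`, and the argument `n` are hypothetical names filling in the README's placeholders):

```julia
using Distributed
addprocs(4)
@everywhere using DistributedArrays
@everywhere using DistributedArrays.SPMD

@everywhere function bar(n)
    # ... worker-local work ...
end

@everywhere function foo(n)
    # Only the driver (process 1) initiates the nested spmd call,
    # so workers never issue spmd from inside an spmd context.
    myid() == 1 && spmd(bar, n)
end

spmd(foo, 10)
```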
@@ -418,7 +418,7 @@ spmd(foo,....)
 ```
 
 This is also true of functions which automatically distribute computation on DArrays.
-```julia
+```
 function foo(d::DArray)
     ......
     myid()==1 && map!(bar, d)
