Commit ca020b2

Merge pull request #52 from randyzwitch/patch-1
Update README.md
2 parents a592677 + 5934d7d

File tree

1 file changed: +2 −2 lines


README.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -35,7 +35,7 @@ This allows GPUArrays to offer a lot of functionality with minimal code.
 
 Also, when compiling Julia for the GPU, we can use all the cool features from Julia, e.g.
 higher order functions, multiple dispatch, meta programming and generated functions.
-Checkout the examples, to see how this can be used to emit specialized code while not loosing flexibility:
+Checkout the examples, to see how this can be used to emit specialized code while not losing flexibility:
 [unrolling](https://github.com/JuliaGPU/GPUArrays.jl/blob/master/examples/juliaset.jl),
 [vector loads/stores](https://github.com/JuliaGPU/GPUArrays.jl/blob/master/examples/vectorload.jl)
```

```diff
@@ -44,7 +44,7 @@ In theory, we could go as far as inspecting user defined callbacks (we can get t
 
 ### Automatic Differentiation
 
-Because of neuronal netorks, automatic differentiation is super hyped right now!
+Because of neural networks, automatic differentiation is super hyped right now!
 Julia offers a couple of packages for that, e.g. [ReverseDiff](https://github.com/JuliaDiff/ReverseDiff.jl).
 It heavily relies on Julia's strength to specialize generic code and dispatch to different implementations depending on the Array type, allowing an almost overheadless automatic differentiation.
 Making this work with GPUArrays will be a bit more involved, but the
```
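The hunk above mentions dispatching to different implementations depending on the array type. A minimal sketch of that idea (the wrapper type and function names here are hypothetical, not GPUArrays' or ReverseDiff's real API):

```julia
# Sketch: multiple dispatch selects an implementation from the array's type,
# so callers write one generic call site. `FakeGPUVector` is a stand-in type.
struct FakeGPUVector{T}
    data::Vector{T}
end

# Generic fallback for ordinary arrays
total(xs::AbstractArray) = sum(xs)

# "Specialized" method for the GPU-like wrapper; a real backend
# would launch a reduction kernel here instead
total(xs::FakeGPUVector) = sum(xs.data)

total([1, 2, 3])                  # dispatches to the AbstractArray method
total(FakeGPUVector([1, 2, 3]))   # dispatches to the specialized method
```

Because dispatch happens at compile time on concrete types, the specialized path adds no runtime branching, which is the "almost overheadless" property the README refers to.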

0 commit comments
