
Commit e7c036b

refine readme
1 parent: a5b7ea0

1 file changed (+2 −2 lines)

README.md

Lines changed: 2 additions & 2 deletions
```diff
@@ -21,7 +21,7 @@ Results on test set (please refer to the paper for detailed results and experime
 
 The effect of *k*, selection (top-*k* vs. random), and network dimension (top-*k* vs. *k*-dimensional):
 
-![effect-of-k](./docs/effect-k.png)
+![Effect of k](./docs/effect-k.png)
 
 To achieve speedups on GPUs, a slight change is made to unify the top-_k_ pattern across the mini-batch. The original meProp produces a different top-_k_ pattern for each example in a mini-batch, which requires sparse matrix multiplication; sparse matrix multiplication is not very efficient on GPUs compared to dense matrix multiplication. Hence, by unifying the top-_k_ pattern, we can extract the parts of the matrices that need computation as dense matrices, compute the results, and reconstruct them to the appropriate size for further computation. This leads to actual speedups on GPUs, although we believe a better-designed method could yield even larger speedups.
 
```
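The unified top-_k_ step described in the hunk above can be sketched in a few lines. This is a minimal, framework-agnostic NumPy sketch, not the repository's actual implementation: the function names and the assumption that the gradient arrives as a `(batch, dim)` array are illustrative only. One index set is chosen for the whole mini-batch (here by ranking batch-summed gradient magnitudes), so the extracted block stays dense.

```python
import numpy as np

def unified_topk(grad_output, k):
    """Pick ONE top-k column set shared by the whole mini-batch.

    grad_output: (batch, dim) gradient w.r.t. a layer's output.
    Returns the dense (batch, k) block and the chosen column indices.
    """
    score = np.abs(grad_output).sum(axis=0)   # (dim,) batch-wide magnitude
    idx = np.argsort(-score)[:k]              # unified top-k indices
    return grad_output[:, idx], idx           # small dense block, same for all examples

def scatter_back(small, idx, dim):
    """Reconstruct the (batch, dim) gradient for further computation."""
    full = np.zeros((small.shape[0], dim), dtype=small.dtype)
    full[:, idx] = small                      # zeros everywhere outside the top-k columns
    return full
```

Because every example shares the same `idx`, the backward pass multiplies small dense matrices instead of per-example sparse ones, which is what makes the GPU version fast.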

````diff
@@ -77,7 +77,7 @@ or
 ```
 mono nnmnist.exe <config.json>
 ```
-where <config.json> is a configuration file. There is [an example configuration file](./meprop (CSharp)/nnmnist/default.json) in the source code. The output will be written to a file in the same location as the executable. The code also supports random top-_k_ selection.
+where <config.json> is a configuration file. There is [an example configuration file]("./meprop (CSharp)/nnmnist/default.json") in the source code. The output will be written to a file in the same location as the executable. The code also supports random top-_k_ selection.
 ### PyTorch
 ```bash
 python3.5 meprop (PyTorch).py
````
