Commit 97db3fc

[Distributed] Upgrade the SOK version to v4.2. (#551)
1 parent fa8553c commit 97db3fc

4 files changed: +8 −4 lines changed

addons/sparse_operation_kit/README.md

5 additions, 1 deletion

@@ -77,14 +77,18 @@ There are 2 additional steps to use SOK embedding lookup with `Embedding Variable`
 git clone --recursive https://github.com/alibaba/DeepRec.git /DeepRec
 ```
 2. Build DeepRec from source code. You can follow the instructions in [DeepRec Build](https://deeprec.readthedocs.io/zh/latest/DeepRec-Compile-And-Install.html).
+3. Enter HugeCTR and initialize its submodules:
+```bash
+cd /DeepRec/addons/sparse_operation_kit/hugectr/ && git submodule update --init --recursive && cd sparse_operation_kit && mkdir build && cd build
+```
 3. Build and install SOK. You need to specify ${DeepRecBuild}, the directory where you store the DeepRec build's intermediate results.
 ```
 DeepRecBuild=${DeepRecBuild} cmake -DENABLE_DEEPREC=ON -DSM=75 .. && make -j && make install;
 export PYTHONPATH=/DeepRec/addons/sparse_operation_kit/hugectr/sparse_operation_kit
 ```
 4. Run utest.
 ```
-cd /DeepRec/addons/sparse_operation_kit/adapter && horovodrun -np ${NUM_GPU} -H localhost:${NUM_GPU} python3 embedding_var_lookup_utest.py
+cd /DeepRec/addons/sparse_operation_kit/python && horovodrun -np ${NUM_GPU} -H localhost:${NUM_GPU} python3 embedding_var_lookup_utest.py
 ```
 ## Benchmark
 1. Download the Kaggle Display Advertising Challenge Dataset (Criteo Dataset) from https://storage.googleapis.com/dataset-uploader/criteo-kaggle/large_version/train.csv
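The numbered steps in the README hunk above can be consolidated into one script. Paths, the `SM=75` value, `${DeepRecBuild}`, and the `horovodrun` invocation come from the README; the `DRY_RUN` guard and the `run` helper are this sketch's own additions so the commands can be inspected before anything is cloned or built — a sketch, not an official script.

```shell
#!/usr/bin/env bash
# Consolidated sketch of the SOK-on-DeepRec build steps from the README.
# Dry run by default: commands are printed, not executed. Set DRY_RUN=0
# to run them for real.
set -euo pipefail

DRY_RUN=${DRY_RUN:-1}
NUM_GPU=${NUM_GPU:-1}
# Assumed default; point this at your actual DeepRec build directory.
DeepRecBuild=${DeepRecBuild:-/DeepRec/build}

run() {
    CMDS="${CMDS-}; $*"   # record each command for inspection
    echo "+ $*"
    [ "$DRY_RUN" = "1" ] || "$@"
}

# 1. Get the source.
run git clone --recursive https://github.com/alibaba/DeepRec.git /DeepRec
# 2. Build DeepRec itself first -- follow the DeepRec build documentation.
# 3. Enter HugeCTR and initialize its submodules.
run cd /DeepRec/addons/sparse_operation_kit/hugectr/
run git submodule update --init --recursive
run mkdir -p sparse_operation_kit/build
run cd sparse_operation_kit/build
# 4. Build and install SOK (SM=75 is the README's example target arch).
run env DeepRecBuild="$DeepRecBuild" cmake -DENABLE_DEEPREC=ON -DSM=75 ..
run make -j
run make install
export PYTHONPATH=/DeepRec/addons/sparse_operation_kit/hugectr/sparse_operation_kit
# 5. Run the unit test under Horovod.
run horovodrun -np "$NUM_GPU" -H "localhost:$NUM_GPU" python3 \
    /DeepRec/addons/sparse_operation_kit/python/embedding_var_lookup_utest.py
```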

addons/sparse_operation_kit/example/demo.py

1 addition, 1 deletion

@@ -23,7 +23,7 @@
     values=tf.convert_to_tensor([1, 1, 3, 4, 5], dtype=tf.int64),
     dense_shape=[2, 3]
 )
-emb = sok.lookup_sparse([var], [indices], hotness=[3], combiners=['sum'])
+emb = sok.lookup_sparse([var], [indices], combiners=['sum'])
 fun = tf.multiply(emb, 2.0, name='multiply')
 loss = tf.reduce_sum(fun, name='reduce_sum')
 opt = tf.train.AdagradOptimizer(0.1)
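The change above drops the explicit `hotness=[3]` argument: the per-sample row lengths can be read off the sparse input itself. What a `'sum'` combiner lookup computes can be illustrated in plain Python (this is not SOK's implementation, just the semantics, with a toy 2-dimensional table):

```python
# Plain-Python illustration (not SOK code) of a 'sum'-combiner sparse
# lookup: for each sample, gather the embedding row of every id and sum
# them element-wise. Row lengths vary per sample, which is why the
# hotness can be inferred from the sparse input instead of being passed.

def lookup_sparse_sum(table, ids_per_sample):
    """table: id -> embedding vector; ids_per_sample: list of id lists."""
    return [
        [sum(col) for col in zip(*(table[i] for i in ids))]
        for ids in ids_per_sample
    ]

# Toy table; ids grouped per sample as in the demo's SparseTensor
# (values [1, 1, 3, 4, 5], dense_shape [2, 3] -> rows [1, 1, 3] and [4, 5]).
table = {1: [1.0, 0.0], 3: [0.0, 2.0], 4: [3.0, 0.0], 5: [0.0, 1.0]}
emb = lookup_sparse_sum(table, [[1, 1, 3], [4, 5]])
print(emb)  # [[2.0, 2.0], [3.0, 1.0]]
```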
Submodule hugectr updated 219 files

addons/sparse_operation_kit/python/embedding_var_lookup_utest.py

1 addition, 1 deletion

@@ -81,7 +81,7 @@
 optimizer = tf.train.AdagradOptimizer(0.1)
 
 def step(params):
-    embeddings = sok.lookup_sparse(params, indices, hotness, combiners)
+    embeddings = sok.lookup_sparse(params, indices, combiners)
     loss = 0
     for i in range(len(embeddings)):
        loss = loss + tf.reduce_sum(embeddings[i])
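Both the demo and the unit test drop the `hotness` argument, so downstream code written against the pre-v4.2 positional signature breaks at import-time-adjacent call sites. A hypothetical compatibility shim (the wrapper name and keyword layout are illustrative assumptions, not SOK API) could bridge old-style calls during a migration:

```python
# Hypothetical compatibility shim for pre-v4.2 call sites that still pass
# hotness positionally; SOK v4.2's lookup_sparse infers hotness from the
# sparse indices and no longer accepts it. Names here are illustrative.

def drop_hotness(lookup_sparse):
    """Wrap a v4.2-style lookup_sparse so old-style calls keep working."""
    def wrapper(params, indices, hotness=None, combiners=None, **kwargs):
        # hotness is accepted and silently discarded.
        return lookup_sparse(params, indices, combiners=combiners, **kwargs)
    return wrapper

# Demonstration with a stand-in lookup that just records its arguments.
def fake_lookup_sparse(params, indices, combiners=None):
    return {"params": params, "indices": indices, "combiners": combiners}

compat = drop_hotness(fake_lookup_sparse)
out = compat(["var"], ["idx"], [3], ["sum"])   # old positional style still works
print(out["combiners"])  # ['sum']
```

New-style keyword calls (`combiners=[...]`) pass through unchanged; only old positional call sites need the shim.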
