
Commit 4979bf2

saberkun authored and allenwang28 committed
Port multi host gpu training instructions.
PiperOrigin-RevId: 303779613
1 parent 4129eb5 commit 4979bf2

File tree: 1 file changed (+16, −2 lines)

  • official/vision/image_classification


official/vision/image_classification/README.md

Lines changed: 16 additions & 2 deletions
@@ -29,11 +29,25 @@ provide a few options.
 Note: These models will **not** work with TPUs on Colab.
 
 You can train image classification models on Cloud TPUs using
-`tf.distribute.TPUStrategy`. If you are not familiar with Cloud TPUs, it is
-strongly recommended that you go through the
+[tf.distribute.experimental.TPUStrategy](https://www.tensorflow.org/api_docs/python/tf/distribute/experimental/TPUStrategy?version=nightly).
+If you are not familiar with Cloud TPUs, it is strongly recommended that you go
+through the
 [quickstart](https://cloud.google.com/tpu/docs/quickstart) to learn how to
 create a TPU and GCE VM.
 
+### Running on multiple GPU hosts
+
+You can also train these models on multiple hosts, each with GPUs, using
+[tf.distribute.experimental.MultiWorkerMirroredStrategy](https://www.tensorflow.org/api_docs/python/tf/distribute/experimental/MultiWorkerMirroredStrategy).
+
+The easiest way to run multi-host benchmarks is to set the
+[`TF_CONFIG`](https://www.tensorflow.org/guide/distributed_training#TF_CONFIG)
+appropriately at each host. E.g., to run using `MultiWorkerMirroredStrategy` on
+2 hosts, the `cluster` in `TF_CONFIG` should have 2 `host:port` entries, and
+host `i` should have the `task` in `TF_CONFIG` set to `{"type": "worker",
+"index": i}`. `MultiWorkerMirroredStrategy` will automatically use all the
+available GPUs at each host.
+
 ## MNIST
 
 To download the data and run the MNIST sample model locally for the first time,
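The `TF_CONFIG` layout described in the added section can be sketched as below. The hostnames and port are placeholder assumptions, and `tf_config_for_host` is a hypothetical helper, not part of the model code in this commit; only the `cluster`/`task` structure comes from the diff above.

```python
# Sketch of the TF_CONFIG each host exports before launching training.
# Hostnames ("host0.example.com", "host1.example.com") and port 2222 are
# illustrative assumptions, not values from the commit.
import json
import os

CLUSTER = {"worker": ["host0.example.com:2222", "host1.example.com:2222"]}


def tf_config_for_host(index):
    """Build the TF_CONFIG JSON string for worker `index` (hypothetical helper)."""
    return json.dumps({
        "cluster": CLUSTER,
        "task": {"type": "worker", "index": index},
    })


# On host 0 (host 1 would use tf_config_for_host(1)):
os.environ["TF_CONFIG"] = tf_config_for_host(0)

# With TF_CONFIG set, each host would then create the strategy:
#   strategy = tf.distribute.experimental.MultiWorkerMirroredStrategy()
# which, per the README text above, automatically uses all GPUs on that host.
print(os.environ["TF_CONFIG"])
```

Each host runs the same training script; only the `task.index` in its `TF_CONFIG` differs.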
