Each TPU v3 chip contains two TensorCores, and a TPU v3-8 includes 4 chips, totaling 8 TensorCores. However, given that each TPU v4 chip also contains two TensorCores, I would expect a TPU v4-32 setup (theoretically comprising 16 chips) to report 32 TensorCores.
-
TensorCores on TPU v4 chips can actually be used in two modes: either each core works independently, in which case you see 32 devices, or both cores work together, in which case you see 16, equal to the number of chips. If I remember right we default to the latter mode, since it's usually what you want. (This "working together" capability is not available on TPU v3 or earlier.)
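The chip/core arithmetic above can be sketched as a small helper. The function name and the `megacore` keyword are hypothetical, chosen for illustration; the only facts assumed are the ones stated in the thread (two TensorCores per chip, and the "-N" in a slice name counting TensorCores):

```python
def expected_device_count(num_chips, megacore=True):
    """Devices a TPU v4 runtime would report for a slice of num_chips chips.

    megacore=True: the two TensorCores on each chip work together,
    so one device is reported per chip (the default on v4, per the
    answer above).
    megacore=False: each TensorCore is exposed as its own device.
    """
    cores_per_chip = 2
    return num_chips if megacore else num_chips * cores_per_chip

# A TPU v4-32 slice has 16 chips (the "-32" counts TensorCores):
print(expected_device_count(16))                  # combined mode: 16 devices
print(expected_device_count(16, megacore=False))  # independent mode: 32 devices
```

This is why the question's expectation of 32 devices is reasonable but doesn't match the default: with the cores combined, the reported device count equals the chip count.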