Commit 5b2ccad
Fix ZeRO stage to choose BF16 optimizer in test (deepspeedai#7803)
Use ZeRO stage 1 so that the BF16 optimizer is selected.
(We should have switched to ZeRO1 in deepspeedai#7788, but I missed the change.
@sfc-gh-truwase)
- deepspeedai#7790 removed the fallback that allowed a bf16 model with fp32 grad
accumulation without ZeRO, so that combination now raises NotImplementedError.
- deepspeedai#7788 changed test_bf16_optimizer_fragments to force BF16_Optimizer by
setting grad_accum_dtype=fp32, but it kept ZeRO stage 0, which is now
invalid after deepspeedai#7790.
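The interaction described above can be sketched as a minimal DeepSpeed config. This is an illustration only, not the test's actual config: the batch size is a hypothetical value, and while the `bf16`, `data_types.grad_accum_dtype`, and `zero_optimization.stage` keys follow DeepSpeed's config schema, the surrounding values are assumptions.

```python
# Hedged sketch of the config combination this commit fixes in the test.
# bf16 weights + fp32 gradient accumulation forces the BF16_Optimizer path,
# and after deepspeedai#7790 that path requires ZeRO, so the test must
# request stage 1 rather than stage 0.
ds_config = {
    "train_batch_size": 8,                       # hypothetical value
    "bf16": {"enabled": True},                   # bf16 model weights
    "data_types": {"grad_accum_dtype": "fp32"},  # forces BF16_Optimizer
    "zero_optimization": {"stage": 1},           # stage 0 would now raise
}

# Stage 0 with this dtype combination raises NotImplementedError post-#7790.
assert ds_config["zero_optimization"]["stage"] >= 1
```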
Signed-off-by: Masahiro Tanaka <mtanaka@anyscale.com>
1 file changed (+1, −1): line 182 replaced.