Does DeepSpeed of Pytorch-Lightning support BFloat16 now? #12481
Unanswered
ShaneTian asked this question in DDP / multi-GPU / multi-node
Replies: 1 comment
will be added
How do I configure BFloat16 in the DeepSpeed strategy of PyTorch Lightning? Is it simply:

Trainer(..., precision="bf16", ...)

One more thing: I did not find anything about bf16 in the DeepSpeed strategy code; is it not supported yet? https://github.com/PyTorchLightning/pytorch-lightning/blob/2e5728a4841cb1d3b75123ac941cd36f681f9a11/pytorch_lightning/strategies/deepspeed.py#L652-L667
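For context (this sketch is not from the thread itself): besides the `precision="bf16"` Trainer flag, DeepSpeed has its own documented `"bf16": {"enabled": true}` config key, which Lightning can forward via `DeepSpeedStrategy(config=...)`. The exact stage and device numbers below are illustrative assumptions, not values from the discussion.

```python
# Hedged sketch: enabling bfloat16 both at the Lightning level and in a
# DeepSpeed config dict. Requires GPUs and the deepspeed package at runtime.
from pytorch_lightning import Trainer
from pytorch_lightning.strategies import DeepSpeedStrategy

ds_config = {
    "bf16": {"enabled": True},          # DeepSpeed-native bfloat16 switch
    "zero_optimization": {"stage": 2},  # illustrative ZeRO stage choice
}

trainer = Trainer(
    accelerator="gpu",
    devices=1,                          # illustrative device count
    strategy=DeepSpeedStrategy(config=ds_config),
    precision="bf16",                   # Lightning-level precision flag
)
```

This is a configuration fragment only; whether it runs depends on the installed Lightning/DeepSpeed versions and available hardware.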