DDP Sharded with Lightning CLI setup does not work: lightning.fabric.utilities.exceptions.MisconfigurationException: Found invalid type for plugin ddp_sharded. Expected one of: Precision, CheckpointIO, ClusterEnviroment, or LayerSync.
#19127
-
I don't know much about the plugins. But from what I can see in the code, the `plugins` argument in 2.x only accepts Precision, CheckpointIO, ClusterEnvironment, or LayerSync objects, which is exactly what the exception is telling you. `ddp_sharded` was a strategy (the old fairscale-based sharded DDP), not a plugin, and it was removed in Lightning 2.0. On 2.1.1 you would select sharded training through the `strategy` flag instead, e.g. `strategy: fsdp`.
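A minimal sketch of the change, assuming a plain 2.x `Trainer`; `fsdp` is the strategy name that 2.x registers for sharded training, and the YAML fragment in the comment shows how the same setting would look in a LightningCLI config:

```python
from lightning.pytorch import Trainer

# In 2.x, passing a strategy name through `plugins` raises the
# MisconfigurationException quoted in the title; sharding is selected
# via `strategy` instead, e.g. FSDP:
trainer = Trainer(accelerator="gpu", devices=2, strategy="fsdp")

# With LightningCLI the equivalent goes in config.yaml under the trainer section:
#   trainer:
#     strategy: fsdp
```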
-
I am trying to run my model in a distributed way, but setting the flag `plugins: ddp_sharded` fails with the traceback quoted in the title. I am using a CLI setup, meaning I have a main.py that wraps everything in LightningCLI, and a config.yaml where I set all the Lightning flags; that config.yaml is where I added the `plugins: ddp_sharded` entry.
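For context, a minimal sketch of what such an entry point might look like; `MyModel`, `MyDataModule`, and the `my_project` imports are placeholders, not the actual code from this setup:

```python
# main.py -- minimal LightningCLI entry point (sketch with placeholder modules)
from lightning.pytorch.cli import LightningCLI

from my_project.model import MyModel        # hypothetical LightningModule
from my_project.data import MyDataModule    # hypothetical LightningDataModule


def main():
    # LightningCLI builds the Trainer from config.yaml, so a `plugins: ddp_sharded`
    # entry under `trainer:` ends up as Trainer(plugins="ddp_sharded"), which is
    # what triggers the MisconfigurationException on 2.1.
    LightningCLI(MyModel, MyDataModule)


if __name__ == "__main__":
    main()
```

Run with something like `python main.py fit --config config.yaml`.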
Does anyone have any thoughts on what is happening here? I am using Lightning version 2.1.1.