TypeError: DeepSpeedZeroOptimizer is not an Optimizer #6992

Open

Tengxf opened this issue Feb 3, 2025 · 1 comment

Tengxf commented Feb 3, 2025

Hello, I encountered an issue while integrating DeepSpeed. My code is written as follows:
```python
deepspeed_config_path = './deepspeed.json'
args = parse_arguments()

self.model, self.optimizer, _, _ = deepspeed.initialize(
    args=args,
    model=self.model,
    training_data=trainset,  # pass the training dataset
    config=deepspeed_config_path
)
```

and the content of my JSON file is as follows:

```json
{
    "train_batch_size": 4,
    "gradient_accumulation_steps": 1,
    "optimizer": {
        "type": "AdamW",
        "params": {
            "lr": 5.0e-4,
            "weight_decay": 1.0e-5
        }
    },
    "_comment": "The scheduler defines the learning rate scheduler.",
    "scheduler": {
        "type": "CosineAnnealingLR",
        "params": {
            "T_max": 10,
            "eta_min": 5e-6
        }
    },
    "fp16": {
        "enabled": true
    },
    "_comment2": "Use the ZeRO stage 2 strategy.",
    "zero_optimization": {
        "stage": 2
    },
    "gradient_clipping": 1.0,
    "wall_clock_breakdown": true
}
```
However, when I run it, the TypeError in the title occurs. Could you please advise me on how to resolve this issue?
Thank you very much for your help!!!

loadams (Collaborator) commented Feb 3, 2025

Can you share a full repro script please?
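A minimal self-contained script in the spirit of the snippet above might look like the following. Note this is only a sketch: `SimpleNet`, the random `TensorDataset`, and this `parse_arguments` are illustrative placeholders rather than code from the issue, and it assumes the `./deepspeed.json` config shown above.

```python
# Illustrative repro sketch (not the issue author's actual code): SimpleNet,
# the random dataset, and parse_arguments are placeholders; the DeepSpeed
# config is the JSON shown in the issue, saved as ./deepspeed.json.
import argparse

import torch
import torch.nn as nn
from torch.utils.data import TensorDataset

import deepspeed


class SimpleNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(16, 2)

    def forward(self, x):
        return self.fc(x)


def parse_arguments():
    parser = argparse.ArgumentParser()
    parser.add_argument("--local_rank", type=int, default=-1)
    parser = deepspeed.add_config_arguments(parser)
    return parser.parse_args()


if __name__ == "__main__":
    args = parse_arguments()
    model = SimpleNet()
    # Dummy dataset standing in for `trainset` from the issue.
    trainset = TensorDataset(torch.randn(64, 16), torch.randint(0, 2, (64,)))

    # Same call pattern as in the issue, pointing at the JSON config above.
    model_engine, optimizer, trainloader, lr_scheduler = deepspeed.initialize(
        args=args,
        model=model,
        training_data=trainset,
        config="./deepspeed.json",
    )
```

Launched with `deepspeed repro.py` (the filename is arbitrary), this goes through the same `deepspeed.initialize` call with the same config, so it should exercise the code path that raises the reported TypeError.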
