This repository was archived by the owner on Jun 3, 2025. It is now read-only.

Commit 77166e4

propagate distillation teacher for fp32 transformers recipes (#637) (#638)
1 parent fc208cb

File tree

1 file changed (+4, −1 lines):

  • src/sparseml/transformers/sparsification

src/sparseml/transformers/sparsification/trainer.py

Lines changed: 4 additions & 1 deletion
@@ -231,7 +231,10 @@ def create_optimizer(self):
             self.manager,
             steps_per_epoch=self.manager_steps_per_epoch,
             loggers=self.manager_loggers,
-            initialize_kwargs={"grad_sampler": self.grad_sampler},
+            initialize_kwargs={
+                "grad_sampler": self.grad_sampler,
+                "distillation_teacher": self.teacher,
+            },
         )
         if not self.manager.initialized:
             self.manager.initialize(
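
For context, the change follows a common kwargs-propagation pattern: optional objects such as the distillation teacher are collected into a single initialize_kwargs dict and handed to the recipe manager, which forwards them to its modifiers during initialization. Below is a minimal sketch of that pattern; RecipeManager and DistillationModifier are hypothetical stand-ins for illustration, not the actual SparseML classes.

# Sketch of the kwargs-propagation pattern this commit relies on.
# RecipeManager and DistillationModifier are hypothetical stand-ins,
# not the actual SparseML API.

class DistillationModifier:
    def initialize(self, **kwargs):
        # Each modifier picks out only the kwargs it understands.
        self.teacher = kwargs.get("distillation_teacher")

class RecipeManager:
    def __init__(self, modifiers):
        self.modifiers = modifiers
        self.initialized = False

    def initialize(self, **initialize_kwargs):
        # Forward the whole dict; the distillation modifier grabs the
        # teacher, other modifiers grab the grad sampler, and so on.
        for modifier in self.modifiers:
            modifier.initialize(**initialize_kwargs)
        self.initialized = True

manager = RecipeManager([DistillationModifier()])
manager.initialize(
    grad_sampler=None,               # placeholder for a gradient sampler
    distillation_teacher="teacher",  # placeholder for the teacher model
)

Before this commit only the grad sampler was passed through this dict; afterwards the teacher model travels along the same path, so fp32 transformers recipes that include distillation receive it as well.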

0 commit comments
