handle finetuning errors gracefully #8194


Merged 1 commit into main on May 12, 2025
Conversation

arnavsinghvi11 (Collaborator)
Small change to ensure that if finetuning in BootstrapFinetune fails for any reason, the error is raised instead of being saved and returned as part of the optimized program.

With the current behavior, running the following:

optimizer = dspy.BootstrapFinetune(num_threads=16)  # if you *do* have labels, pass metric=your_metric here!
classify_ft = optimizer.compile(student_classify, teacher=teacher_classify, trainset=unlabeled_trainset)

classify_ft.get_lm().launch()

logs the error during compilation (ERROR dspy.clients.lm: Failed to ....) but doesn't actually exit; instead it sets the student program's LM to the error object itself.

As a result, classify_ft.get_lm() returns the error trace, and the exception only surfaces when launch() is called.

Instead, the error should be raised earlier, during compilation, and should never be set on the returned optimized program.
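The fail-fast pattern described above can be sketched as follows. This is a minimal illustration, not DSPy's actual implementation; the names compile_with_failfast, run_finetune, and FinetuneError are hypothetical.

```python
class FinetuneError(Exception):
    """Raised when a fine-tuning job fails during compile()."""


def compile_with_failfast(run_finetune, student):
    """Run fine-tuning and raise on failure instead of storing the error.

    Old (buggy) behavior: catch the exception, log it, and assign the
    exception object as the student's LM, deferring the crash until
    launch() is called. Fixed behavior: re-raise immediately so that
    compile() itself fails and no broken program is returned.
    """
    try:
        lm = run_finetune()
    except Exception as exc:
        # Old behavior (removed): student.lm = exc; return student
        raise FinetuneError("finetuning failed during compile()") from exc
    student.lm = lm
    return student
```

With this pattern, a failed finetune surfaces at compile time rather than leaving the caller with a program whose LM slot holds an exception.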

@arnavsinghvi11 arnavsinghvi11 requested a review from dilarasoylu May 7, 2025 20:13
@arnavsinghvi11 arnavsinghvi11 merged commit 9e019c6 into main May 12, 2025
3 checks passed