fix: Support response_format parameter in completion -> responses bridge #16844
+198 −0
Title
fix: Support response_format parameter in completion → responses bridge
Relevant issues
Fixes #16810
Pre-Submission checklist
Please complete all items before asking a LiteLLM maintainer to review your PR
- Added at least 1 test in the tests/litellm/ directory; adding at least 1 test is a hard requirement - see details
- make test-unit passes

Type
🐛 Bug Fix
Changes
Summary
When using `completion()` with models that have `mode: "responses"` (such as `o3-pro` or `gpt-5-codex`), the `response_format` parameter with JSON schemas was handled incorrectly, producing:

"Invalid 'metadata.schema_dict_json': string too long. Expected maximum 512, but got 1203"
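For scale, a `response_format` carrying even a modest JSON schema easily exceeds that 512-character cap once serialized (the schema below is illustrative, not taken from the issue):

```python
import json

# Illustrative JSON schema of the kind passed via response_format.
schema = {
    "type": "object",
    "properties": {f"field_{i}": {"type": "string"} for i in range(20)},
    "required": [f"field_{i}" for i in range(20)],
}
response_format = {
    "type": "json_schema",
    "json_schema": {"name": "example_schema", "schema": schema},
}

# Serialized, this schema is far longer than the 512-character metadata limit,
# which is why stashing it in metadata (rather than text.format) failed.
print(len(json.dumps(schema)) > 512)  # True
```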
Details

The `completion` → `responses` bridge in `litellm/completion_extras/litellm_responses_transformation/transformation.py` was missing the conversion from `response_format` (Chat Completions API format) to `text.format` (Responses API format). The inverse bridge (`responses` → `completion`) already had this conversion, implemented in commit 29f0ed2, but the `completion` → `responses` direction was incomplete.
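The two wire formats the bridge must translate between look like this (a sketch following the public OpenAI Chat Completions and Responses APIs, with an illustrative weather schema):

```python
# Chat Completions style, as accepted by completion():
chat_response_format = {
    "type": "json_schema",
    "json_schema": {
        "name": "weather",
        "schema": {"type": "object", "properties": {"temp": {"type": "number"}}},
        "strict": True,
    },
}

# Responses API style, as passed under the 'text' parameter: the name,
# schema, and strict fields are flattened out of the json_schema wrapper.
responses_text_param = {
    "format": {
        "type": "json_schema",
        "name": "weather",
        "schema": {"type": "object", "properties": {"temp": {"type": "number"}}},
        "strict": True,
    }
}
```

The schema payload is identical in both; only the nesting around it differs.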
Solution

Added a `_transform_response_format_to_text_format()` method that converts:

- `response_format` with `json_schema` → `text.format` with `json_schema`
- `response_format` with `json_object` → `text.format` with `json_object`
- `response_format` with `text` → `text.format` with `text`

Updated `transform_request()` to convert the `response_format` parameter before sending the request to `litellm.responses()`.
Files Changed

Modified: `litellm/completion_extras/litellm_responses_transformation/transformation.py`

- Added `_transform_response_format_to_text_format()` method (lines 592-647)
- Updated `transform_request()` to handle `response_format` (lines 199-203)

Added: `tests/test_litellm/completion_extras/test_litellm_responses_transformation_transformation.py`

Testing ✅