
Conversation

@Chesars (Contributor) commented Nov 19, 2025

Summary

PR #16766 changed mode: "chat" → mode: "responses" for multiple models (gpt-5, o1, o3, etc.). This causes severe issues:

1. Breaks old LiteLLM versions

The model_prices_and_context_window.json file is fetched dynamically from GitHub, so this change affects all LiteLLM versions, not just new ones. Users running older versions saw unexpected breakage even though they had not updated their library.
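As a side note for anyone affected: LiteLLM exposes an escape hatch that pins the map to the copy bundled with the installed package instead of the GitHub copy. A minimal sketch (the flag must be set before litellm is imported):

```python
import os

# LITELLM_LOCAL_MODEL_COST_MAP tells LiteLLM to use the
# model_prices_and_context_window.json copy bundled with the installed
# package instead of fetching the latest one from GitHub at import time.
# It must be set before `import litellm`.
os.environ["LITELLM_LOCAL_MODEL_COST_MAP"] = "True"

# import litellm  # would now read the local, version-pinned map
```

This trades live pricing/mode updates for reproducibility, which is exactly the property the dynamic fetch broke here.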

2. Incomplete bridge implementation

The completion → responses bridge is missing response_format → text.format conversion, causing failures for requests that set response_format.

3. Forces unnecessary API migration

Users who don't need reasoning_content are forced to use the responses API, which has different limitations.

What this PR does

Reverts the mode field back to "chat" for models that support both endpoints:

  • gpt-5, gpt-5-mini, gpt-5-nano
  • o1, o1-mini, o1-preview
  • o3, o3-mini, o4-mini
  • etc.

Keeps mode: "responses" only for models that require it:

  • o1-pro, o3-pro (only work with /v1/responses)
  • gpt-5-codex, gpt-5.1-codex
  • deep-research variants
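To make the split concrete, here is an illustrative sketch of the two kinds of entries in model_prices_and_context_window.json after this revert. The field names follow the real file; the values shown are assumptions for illustration, not a verbatim excerpt:

```python
# Illustrative entries only; field names match
# model_prices_and_context_window.json, values are assumed for this sketch.
GPT5_ENTRY = {
    "mode": "chat",  # reverted: default routing via /v1/chat/completions
    "supported_endpoints": ["/v1/chat/completions", "/v1/responses"],
}

O1_PRO_ENTRY = {
    "mode": "responses",  # kept: o1-pro only works with /v1/responses
    "supported_endpoints": ["/v1/responses"],
}
```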

No breaking changes

The supported_endpoints field already documents which endpoints each model supports. Models keep working with chat/completions without losing functionality; users who want reasoning output can explicitly call litellm.responses().
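One way endpoint selection could be driven by supported_endpoints rather than by mode alone is sketched below; pick_endpoint is a hypothetical helper for illustration, not LiteLLM code:

```python
def pick_endpoint(model_info: dict, want_reasoning: bool = False) -> str:
    """Hypothetical helper: pick an endpoint from supported_endpoints.

    Prefer /v1/chat/completions so existing callers keep working; fall
    back to /v1/responses when the model supports nothing else, or when
    the caller explicitly asks for reasoning output.
    """
    endpoints = model_info.get("supported_endpoints", ["/v1/chat/completions"])
    if "/v1/chat/completions" not in endpoints:
        return "/v1/responses"
    if want_reasoning and "/v1/responses" in endpoints:
        return "/v1/responses"
    return "/v1/chat/completions"
```

Under this scheme, o1-pro (responses-only) routes to /v1/responses automatically, while gpt-5 defaults to chat/completions unless the caller opts into reasoning.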

Next Steps

  1. Implement proper response_format → text.format conversion in the bridge
  2. Add intelligent endpoint selection based on request parameters
  3. Address the architectural issue of mixing operation type with endpoint selection (#TBD)
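For step 1, the missing conversion could look roughly like the sketch below. The target shape follows the public OpenAI Responses API (text.format); the function name and exact edge-case handling are assumptions, not LiteLLM internals:

```python
def response_format_to_text_format(response_format: dict) -> dict:
    """Hedged sketch of the missing bridge conversion.

    Chat Completions accepts response_format={"type": "json_schema",
    "json_schema": {...}}; the Responses API expects the equivalent
    under text={"format": {...}} with the schema fields flattened.
    """
    if response_format.get("type") == "json_schema":
        js = response_format["json_schema"]
        return {
            "format": {
                "type": "json_schema",
                "name": js.get("name"),
                "schema": js.get("schema"),
                "strict": js.get("strict", False),
            }
        }
    # {"type": "json_object"} and {"type": "text"} pass through unchanged
    return {"format": {"type": response_format.get("type", "text")}}
```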

Related Issues


vercel bot commented Nov 19, 2025

@Chesars is attempting to deploy a commit to the CLERKIEAI Team on Vercel.

A member of the Team first needs to authorize it.

Contributor Author

Chesars commented Nov 19, 2025

Closing, as the revert was made in #16849.

@Chesars Chesars closed this Nov 19, 2025
@Chesars Chesars deleted the revert-16766-litellm_gpt_5_responses branch November 19, 2025 21:55


Development

Successfully merging this pull request may close these issues.

[Bug]: Sudden switch to responses from completions causing failures