
fix(model) #902

Merged
TimeBomb2018 merged 1 commit into release/v0.3.0 from fix/Timebomb_030
Apr 15, 2026

Conversation

@TimeBomb2018 (Collaborator) commented Apr 15, 2026

conditionally apply thinking parameters based on model support

Summary by Sourcery

Bug Fixes:

  • Avoid sending thinking parameters to models that do not support deep thinking by guarding their inclusion with a support flag in the configuration.

sourcery-ai bot (Contributor) commented Apr 15, 2026

Reviewer's guide (collapsed on small PRs)

Reviewer's Guide

Adds a capability flag check so that deep-thinking parameters are only added to model requests when the model configuration explicitly supports thinking, while preserving the previous provider-specific behavior for Volcano and other providers during streaming calls.

Updated class diagram for model config thinking support

classDiagram
    class RedBearModelConfig {
        bool support_thinking
        bool deep_thinking
        int thinking_budget_tokens
        bool is_omni
        dict extra_params
    }

    class BaseModel {
        +get_model_params(config RedBearModelConfig) Dict
    }

    class ModelProvider {
        <<enumeration>>
        OPENAI
        VOLCANO
        DASHSCOPE
        OTHER
    }

    BaseModel --> ModelProvider : uses
    BaseModel --> RedBearModelConfig : reads

    BaseModel : +params dict
    BaseModel : +conditional thinking handling
    BaseModel : +respect config.support_thinking
    BaseModel : +Volcano streaming thinking_config in params.extra_body
    BaseModel : +Other providers thinking flags in params.model_kwargs
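As a rough illustration only (the actual definitions live in `api/app/core/models/base.py` and the real field types, defaults, and enum values may differ), the config and provider shapes in the class diagram could be sketched as:

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import Any, Dict


class ModelProvider(Enum):
    """Providers from the diagram; the string values here are assumptions."""
    OPENAI = "openai"
    VOLCANO = "volcano"
    DASHSCOPE = "dashscope"
    OTHER = "other"


@dataclass
class RedBearModelConfig:
    support_thinking: bool = False        # capability flag added by this PR
    deep_thinking: bool = False           # whether to enable deep thinking
    thinking_budget_tokens: int = 0       # optional token budget for thinking
    is_omni: bool = False                 # omni requests never get thinking params
    extra_params: Dict[str, Any] = field(default_factory=dict)  # e.g. {"streaming": True}
```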

Flow diagram for conditional thinking parameters in get_model_params

flowchart TD
    A[Start get_model_params] --> B[Read config.support_thinking]
    B -->|False| Z[Return params without thinking parameters]
    B -->|True| C[Read is_streaming from config.extra_params.streaming]
    C -->|False| Z
    C -->|True| D[Check config.is_omni]
    D -->|True| Z
    D -->|False| E[Check provider]
    E -->|ModelProvider.VOLCANO| F[Build thinking_config]
    E -->|Other provider| K[Build model_kwargs]

    F --> G[Set thinking_config.type to enabled or disabled based on config.deep_thinking]
    G --> H{config.deep_thinking and config.thinking_budget_tokens}
    H -->|True| I[Set thinking_config.budget_tokens]
    H -->|False| J[Skip budget_tokens]
    I --> L[Set params.extra_body.thinking to thinking_config]
    J --> L

    K --> M[Set model_kwargs.enable_thinking from config.deep_thinking]
    M --> N{config.deep_thinking and config.thinking_budget_tokens}
    N -->|True| O[Set model_kwargs.thinking_budget]
    N -->|False| P[Skip thinking_budget]
    O --> Q[Set params.model_kwargs]
    P --> Q

    L --> Z
    Q --> Z
    Z[Return params]

File-Level Changes

Change: Gate the application of deep-thinking-related request parameters behind an explicit support_thinking flag on the model configuration.
Details:
  • Wrap the logic that sets thinking-related params in a new conditional checking config.support_thinking.
  • Retain the existing requirement that thinking params are only set for streaming, non-omni requests.
  • For the Volcano provider, continue to populate params['extra_body']['thinking'] with type and an optional budget_tokens when deep_thinking is enabled.
  • For non-Volcano providers, continue to inject enable_thinking and an optional thinking_budget into model_kwargs and assign them to params['model_kwargs'].
  • Ensure that when support_thinking is false, no thinking-related parameters are attached to the request, regardless of streaming or deep_thinking settings.
Files: api/app/core/models/base.py
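The gating described in the flow diagram and file-level changes above could look roughly like the following sketch. This is an illustration under assumptions, not the repository's actual code: the real method signature, provider representation, and config access in `api/app/core/models/base.py` may differ.

```python
from types import SimpleNamespace


def get_model_params(provider: str, config) -> dict:
    """Sketch of the guarded parameter construction described in the review guide."""
    params: dict = {}

    # New guard from this PR: no thinking parameters at all unless the
    # model configuration declares support for them.
    if not config.support_thinking:
        return params

    # Existing constraints: only streaming, non-omni requests get thinking params.
    if not config.extra_params.get("streaming", False) or config.is_omni:
        return params

    if provider == "volcano":
        thinking_config = {
            "type": "enabled" if config.deep_thinking else "disabled",
        }
        if config.deep_thinking and config.thinking_budget_tokens:
            thinking_config["budget_tokens"] = config.thinking_budget_tokens
        params["extra_body"] = {"thinking": thinking_config}
    else:
        model_kwargs = {"enable_thinking": config.deep_thinking}
        if config.deep_thinking and config.thinking_budget_tokens:
            model_kwargs["thinking_budget"] = config.thinking_budget_tokens
        params["model_kwargs"] = model_kwargs

    return params


# Example: a model without support_thinking gets no thinking params,
# even with streaming and deep_thinking enabled.
unsupported = SimpleNamespace(
    support_thinking=False, deep_thinking=True,
    thinking_budget_tokens=1024, is_omni=False,
    extra_params={"streaming": True},
)
print(get_model_params("volcano", unsupported))  # {}
```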

Tips and commands

Interacting with Sourcery

  • Trigger a new review: Comment @sourcery-ai review on the pull request.
  • Continue discussions: Reply directly to Sourcery's review comments.
  • Generate a GitHub issue from a review comment: Ask Sourcery to create an
    issue from a review comment by replying to it. You can also reply to a
    review comment with @sourcery-ai issue to create an issue from it.
  • Generate a pull request title: Write @sourcery-ai anywhere in the pull
    request title to generate a title at any time. You can also comment
    @sourcery-ai title on the pull request to (re-)generate the title at any time.
  • Generate a pull request summary: Write @sourcery-ai summary anywhere in
    the pull request body to generate a PR summary at any time exactly where you
    want it. You can also comment @sourcery-ai summary on the pull request to
    (re-)generate the summary at any time.
  • Generate reviewer's guide: Comment @sourcery-ai guide on the pull
    request to (re-)generate the reviewer's guide at any time.
  • Resolve all Sourcery comments: Comment @sourcery-ai resolve on the
    pull request to resolve all Sourcery comments. Useful if you've already
    addressed all the comments and don't want to see them anymore.
  • Dismiss all Sourcery reviews: Comment @sourcery-ai dismiss on the pull
    request to dismiss all existing Sourcery reviews. Especially useful if you
    want to start fresh with a new review - don't forget to comment
    @sourcery-ai review to trigger a new review!

Customizing Your Experience

Access your dashboard to:

  • Enable or disable review features such as the Sourcery-generated pull request
    summary, the reviewer's guide, and others.
  • Change the review language.
  • Add, remove or edit custom review instructions.
  • Adjust other review settings.

Getting Help

@sourcery-ai sourcery-ai bot (Contributor) left a comment


Hey - I've left some high level feedback:

  • Consider flattening the nested if config.support_thinking / is_streaming / provider checks (e.g., early returns or separate helper functions) to make the control flow around when thinking parameters are applied easier to follow.
Prompt for AI Agents
Please address the comments from this code review:

## Overall Comments
- Consider flattening the nested `if config.support_thinking` / `is_streaming` / `provider` checks (e.g., early returns or separate helper functions) to make the control flow around when thinking parameters are applied easier to follow.

Sourcery is free for open source - if you like our reviews please consider sharing them ✨
Help me be more useful! Please click 👍 or 👎 on each comment and I'll use the feedback to improve your reviews.
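One way the suggested flattening could look is sketched below. This is only an illustration of the early-return/helper-function idea from the review comment: the helper names (`_thinking_applies`, `_volcano_thinking`, `_generic_thinking`, `apply_thinking_params`) are invented here, and the real code in `api/app/core/models/base.py` may be structured differently.

```python
from types import SimpleNamespace


def _thinking_applies(config) -> bool:
    # One flat predicate replaces the nested support/streaming/omni ifs.
    return (config.support_thinking
            and config.extra_params.get("streaming", False)
            and not config.is_omni)


def _volcano_thinking(config) -> dict:
    thinking = {"type": "enabled" if config.deep_thinking else "disabled"}
    if config.deep_thinking and config.thinking_budget_tokens:
        thinking["budget_tokens"] = config.thinking_budget_tokens
    return thinking


def _generic_thinking(config) -> dict:
    kwargs = {"enable_thinking": config.deep_thinking}
    if config.deep_thinking and config.thinking_budget_tokens:
        kwargs["thinking_budget"] = config.thinking_budget_tokens
    return kwargs


def apply_thinking_params(params: dict, provider: str, config) -> dict:
    if not _thinking_applies(config):
        return params  # early return keeps the main path flat
    if provider == "volcano":
        params.setdefault("extra_body", {})["thinking"] = _volcano_thinking(config)
    else:
        params["model_kwargs"] = _generic_thinking(config)
    return params
```

Splitting the provider-specific dict construction into small helpers keeps each branch testable in isolation and makes the top-level gate a single readable condition.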

@TimeBomb2018 TimeBomb2018 merged commit 2e1470c into release/v0.3.0 Apr 15, 2026
1 check passed