
feat: add reasoning output types to OpenAI Responses API spec#5357

Merged
leseb merged 4 commits into llamastack:main from robinnarsinghranabhat:feat/reasoning-types-only
Mar 31, 2026

Conversation

@robinnarsinghranabhat
Contributor

Splitting #5206 into two stages. CI integration tests and recordings are causing issues when API Spec and core implementation changes are pushed together.

Stage 1 (this PR): Introduce API types required for reasoning support in the Responses API:

  • ReasoningItem, ReasoningContent, ReasoningSummary output types
  • summary field on OpenAIResponseReasoning
  • An internal AssistantMessageWithReasoning type

Stage 2: After this PR is merged, I will create a clean version of #5206 with the core implementation for reasoning propagation.
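For context, the Stage 1 types listed above could be sketched roughly as follows. This is an illustrative sketch only, using plain dataclasses rather than the project's actual pydantic models; the field names beyond those named in the PR description (text, id, the type discriminators) are assumptions, not the real spec.

```python
from dataclasses import dataclass, field

# Hypothetical shapes for the Stage 1 reasoning types; the authoritative
# definitions live in the llama-stack API spec and may differ.
@dataclass
class ReasoningSummary:
    text: str
    type: str = "summary_text"  # assumed discriminator value

@dataclass
class ReasoningContent:
    text: str
    type: str = "reasoning_text"  # assumed discriminator value

@dataclass
class ReasoningItem:
    id: str
    summary: list = field(default_factory=list)
    content: list = field(default_factory=list)
    type: str = "reasoning"

# Example: a reasoning output item carrying only a summary.
item = ReasoningItem(id="rs_1", summary=[ReasoningSummary(text="Planned the query.")])
```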

@meta-cla meta-cla bot added the CLA Signed This label is managed by the Meta Open Source bot. label Mar 29, 2026
@github-actions
Contributor

github-actions bot commented Mar 29, 2026

✱ Stainless preview builds

This PR will update the llama-stack-client SDKs with the following commit message.

feat: add reasoning output types to OpenAI Responses API spec
⚠️ llama-stack-client-openapi studio · code

Your SDK build had at least one warning diagnostic.
generate ⚠️

⚠️ llama-stack-client-python studio · conflict

Your SDK build had at least one warning diagnostic.

⚠️ llama-stack-client-node studio · conflict

Your SDK build had at least one warning diagnostic.

⚠️ llama-stack-client-go studio · conflict

Your SDK build had at least one error diagnostic.


This comment is auto-generated by GitHub Actions and is automatically kept up to date as you push.
If you push custom code to the preview branch, re-run this workflow to update the comment.
Last updated: 2026-03-31 09:17:49 UTC

@robinnarsinghranabhat
Contributor Author

@cdoern Just to let you know, the Stainless job is still failing.

@cdoern
Collaborator

cdoern commented Mar 30, 2026

@mattf PTAL since this splits out #5206

Collaborator

@cdoern cdoern left a comment


This will help an actual implementation PR like #5206 pass. The reason that other one is failing is that we don't have a way to both re-record AND test a new llama-stack-client in the same PR. Let's get this in to unblock other PRs.

@skamenan7
Contributor

Is ConversationItem in conversations/models.py kept in sync with OpenAIResponseOutput manually? It looks like it doesn't include ReasoningItem yet. Is that intentional for Stage 1, or should it be updated here too?
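The drift this comment describes can be caught mechanically. A hypothetical sketch, with stand-in item types and union shapes (the real unions are OpenAIResponseOutput and ConversationItem in llama-stack; everything below is illustrative):

```python
from typing import Union, get_args

# Stand-in item types; the real ones live in llama-stack's API models.
class MessageItem: ...
class ToolCallItem: ...
class ReasoningItem: ...

# Assumed shapes of the two unions being compared.
OpenAIResponseOutput = Union[MessageItem, ToolCallItem, ReasoningItem]
ConversationItem = Union[MessageItem, ToolCallItem]  # ReasoningItem not yet added

# A regression check like this would flag any member present in one
# union but missing from the other.
missing = set(get_args(OpenAIResponseOutput)) - set(get_args(ConversationItem))
```

Wired into a unit test, this turns a manual sync convention into an automated check.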

@skamenan7
Contributor

convert_response_input_to_chat_messages() in utils.py has no branch for ReasoningItem and no exhaustive fallback raise. Once Stage 2 emits reasoning items, feeding a previous_response_id into a new turn will silently drop them. Worth adding the branch now (even as a pass-through or explicit skip), plus an else: raise on unknown types so future union growth doesn't fail silently.
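The pattern being suggested could look roughly like this. The function and type names below are illustrative stand-ins, not the actual utils.py code:

```python
from dataclasses import dataclass

# Stand-ins for the real input item types.
@dataclass
class MessageItem:
    content: str

@dataclass
class ReasoningItem:
    text: str

def convert_input_item(item):
    """Convert one response input item into zero or more chat messages."""
    if isinstance(item, MessageItem):
        return [{"role": "assistant", "content": item.content}]
    elif isinstance(item, ReasoningItem):
        # Explicit skip: reasoning items are not replayed into chat messages yet.
        return []
    else:
        # Exhaustive fallback so future union growth fails loudly, not silently.
        raise ValueError(f"Unhandled input item type: {type(item).__name__}")
```

The explicit elif plus the else: raise is the whole point: an unknown item type becomes a loud error instead of a silently dropped turn.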

Contributor

@skamenan7 skamenan7 left a comment


LGTM apart from a few comments. Thanks.

@robinnarsinghranabhat
Contributor Author

convert_response_input_to_chat_messages() in utils.py has no branch for ReasoningItem and no exhaustive fallback raise. once Stage 2 emits reasoning items, feeding a previous_response_id into a new turn will silently drop them. worth adding the branch now (even as a pass-through or explicit skip), plus an else: raise on unknown types so future union growth doesn't fail silently.

@skamenan7 Yeah, addressed them. Makes sense to raise the exception.

@robinnarsinghranabhat
Contributor Author

robinnarsinghranabhat commented Mar 30, 2026

is ConversationItem in conversations/models.py kept in sync with OpenAIResponseOutput manually? it looks like it doesn't include ReasoningItem yet. is that intentional for Stage 1, or should it be updated here too?

For the scope of this PR, I have only considered the case where the user supplies conversation history manually.
This is because, to support using a conversation id in Responses, the responses layer needs to emit a currently missing event, response.output_item.done. Since that is not trivial, I want another PR to address issues like:

  • missing events
  • incorrect ordering of events

I opened PR #5308 along those same lines.

```python
            # skip as these have been extracted and inserted in order
            pass
        elif isinstance(input_item, OpenAIResponseOutputMessageReasoningItem):
            # skip for now — reasoning items will be handled in Stage 2
```
Collaborator


What's Stage 2? Is there an issue for this?

Collaborator


OK, it's #5206; rebased.

@leseb leseb added this pull request to the merge queue Mar 31, 2026
Merged via the queue into llamastack:main with commit 406f69e Mar 31, 2026
72 of 73 checks passed
