fix: emit response.output_text.done streaming event per OpenAI spec
The LlamaStack server was missing the `response.output_text.done`
streaming event, which the OpenAI Responses API spec requires between
`output_text.delta` and `content_part.done`. This event carries the
final accumulated text and logprobs.
Discovered by comparing streaming event sequences between OpenAI's
gpt-5.1 (ground truth) and LlamaStack server output using the OpenAI
Python client.
Changes:
- streaming.py: Import and emit OutputTextDone with final text and
logprobs before content_part.done
- openai_responses.py: Add logprobs field to OutputTextDone type
definition (required per OpenAI spec)
- test_openai_responses.py: Verify output_text.done is emitted with
correct fields and ordering (before content_part.done)
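
The required ordering can be sketched as follows. This is a minimal illustration, not the LlamaStack implementation: the class and function names here are hypothetical stand-ins, assuming OpenAI-style streaming event objects where `response.output_text.done` carries the accumulated text and logprobs.

```python
# Hypothetical sketch of the event ordering the spec requires:
# output_text.delta* -> output_text.done -> content_part.done
from dataclasses import dataclass, field

@dataclass
class OutputTextDelta:
    delta: str
    type: str = "response.output_text.delta"

@dataclass
class OutputTextDone:
    # Final accumulated text plus logprobs, per the OpenAI spec.
    text: str
    logprobs: list = field(default_factory=list)
    type: str = "response.output_text.done"

@dataclass
class ContentPartDone:
    type: str = "response.content_part.done"

def stream_text(chunks, logprobs):
    """Yield delta events, then output_text.done with the accumulated
    text and logprobs, then content_part.done."""
    accumulated = []
    for chunk in chunks:
        accumulated.append(chunk)
        yield OutputTextDelta(delta=chunk)
    # The previously missing event: emitted before content_part.done.
    yield OutputTextDone(text="".join(accumulated), logprobs=logprobs)
    yield ContentPartDone()

events = list(stream_text(["Hel", "lo"], logprobs=[]))
types = [e.type for e in events]
# types: two delta events, then output_text.done, then content_part.done
```

The test added in `test_openai_responses.py` asserts essentially this shape: that `output_text.done` appears in the stream, carries the full text and logprobs, and precedes `content_part.done`.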