
@codefromthecrypt (Contributor) commented on Nov 3, 2025:

This updates to the much simpler llama-stack setup, which always enables OTel and has a built-in OpenAI remote provider. That means it works like the other least-config examples. The biggest win is that you can use non-Llama models!
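
As a rough sketch of what the "least config" flow looks like: assuming llama-stack is running locally on its default port and exposing its OpenAI-compatible endpoint, a non-Llama model can be called with the plain OpenAI Python client routed through the remote OpenAI provider. The base URL, port, and model name below are assumptions for illustration, not taken from this PR.

```python
# Sketch only: base_url, port, and model are assumptions, not from this PR.
from openai import OpenAI

# llama-stack's OpenAI-compatible endpoint (default port assumed to be 8321).
client = OpenAI(
    base_url="http://localhost:8321/v1/openai/v1",
    api_key="unused",  # the remote openai provider reads OPENAI_API_KEY server-side
)

# Chat completions against a non-Llama model via the remote openai provider.
resp = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(resp.choices[0].message.content)
```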

Draft until 0.3.2 or 0.4.0 is out

Screenshots:
- chat completions API (Screenshot 2025-11-03 at 3 23 03 PM)
- responses API (Screenshot 2025-11-03 at 3 23 22 PM)
- responses API + MCP (Screenshot 2025-11-03 at 4 47 10 PM)
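
For the responses API case, a minimal sketch with the openai-agents-python Agents SDK pointed at the same llama-stack endpoint could look like the following; the base URL and model are again assumptions, and the MCP variant is omitted because it is blocked (see the comment below).

```python
# Sketch only: base_url and model are assumptions; MCP is omitted (blocked upstream).
from openai import AsyncOpenAI
from agents import Agent, Runner, set_default_openai_client, set_tracing_disabled

# Point the Agents SDK (responses API) at llama-stack's OpenAI-compatible endpoint.
set_default_openai_client(AsyncOpenAI(
    base_url="http://localhost:8321/v1/openai/v1",
    api_key="unused",
))
set_tracing_disabled(True)  # keep SDK traces off; OTel is already handled by llama-stack

agent = Agent(name="assistant", instructions="Answer briefly.", model="gpt-4o-mini")
result = Runner.run_sync(agent, "What does llama-stack do?")
print(result.final_output)
```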

@codefromthecrypt (Author) commented:

Blocked on openai/openai-agents-python#2034 before the responses API + MCP example works.
