"Stream not supported for function calling agent" #14653
Replies: 5 comments 1 reply
-
@wuchengyuan88 Hello there! I'm here to assist you with any bugs, questions, or becoming a contributor. Let me know how I can help! Streaming output is currently not supported for function-calling agents in LlamaIndex, as indicated by the …
-
fyi #15079
-
fyi #15653
-
Any updates?
-
@jacksonwen001 @mattf @wuchengyuan88 It probably won't be implemented, due to a lot of complexity around exposing a generator to the user. To do this, you need to detect when a tool is *not* being called, and that detection works differently for nearly every LLM. The updated way to do this is to use the new AgentWorkflow: https://docs.llamaindex.ai/en/stable/examples/agent/agent_workflow_basic/
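To illustrate why this is hard, here is a minimal, library-free sketch of the buffering problem described above: the agent must hold back the start of the model's token stream until it can tell whether the model is emitting a tool call or plain text, and only plain text may be forwarded to the user as it arrives. The marker `TOOL_CALL_PREFIX` and the function `route_stream` are hypothetical names for this sketch; real LLM providers each signal tool calls differently, which is exactly the complexity mentioned above.

```python
from typing import Iterator, Tuple

# Hypothetical marker for this sketch; real providers signal tool
# calls in provider-specific ways (special tokens, JSON fields, etc.).
TOOL_CALL_PREFIX = '{"tool":'


def route_stream(chunks: Iterator[str]) -> Tuple[str, Iterator[str]]:
    """Buffer chunks until the stream can be classified as a tool call
    or plain text, then return the kind plus an iterator that replays
    the buffered chunks followed by the rest of the stream."""
    buffer = []
    kind = "text"
    for chunk in chunks:
        buffer.append(chunk)
        joined = "".join(buffer)
        if joined.startswith(TOOL_CALL_PREFIX):
            kind = "tool"  # the model is calling a tool
            break
        if TOOL_CALL_PREFIX.startswith(joined):
            continue  # still ambiguous: keep buffering, nothing streamed yet
        break  # diverged from the marker: definitely plain text

    def replay() -> Iterator[str]:
        yield from buffer  # the chunks we had to hold back
        yield from chunks  # the remainder, streamed live

    return kind, replay()
```

Note that until the classification resolves, nothing can be shown to the user, so even a "streaming" function-calling agent has an unavoidable buffering delay at the start of each response.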
-
Just a quick question, will streaming output be eventually supported for function-calling agents in later versions of LlamaIndex?
Or is it permanently unsupported?
Currently, the error message when I try agent.stream_chat / agent.astream_chat is "Stream not supported for function calling agent".
Thank you.