**Bug Summary**

When streaming from a Command R or Command R+ endpoint deployed on Azure, httpx raises a `RemoteProtocolError` after all the `content-delta` chunks of the streaming response have been received.
**Buggy code**
```python
import os

import cohere

co = cohere.ClientV2(api_key=os.getenv('COMMAND_R_API_KEY'), base_url=os.getenv('COMMAND_R_URL'))
res = co.chat_stream(
    model="command-r-plus-08-2024",
    messages=[{"role": "user", "content": "What is an LLM?"}],
)
for event in res:
    if event:
        if event.type == "content-delta":
            print(event.delta.message.content.text, end="")
```
**Detailed description**
During execution, all content chunks are returned and printed correctly. Afterwards, httpx raises a `RemoteProtocolError`. I am using server deployments on Azure AI Foundry.
SDK Version: 5.13.11
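
As a possible interim workaround, here is a minimal sketch that wraps the iteration in a try/except for `httpx.RemoteProtocolError`. It assumes the SDK surfaces the underlying httpx exception unchanged and that the error only appears after the final `content-delta` chunk; neither assumption is confirmed SDK behaviour.

```python
import os

import cohere
import httpx

co = cohere.ClientV2(api_key=os.getenv('COMMAND_R_API_KEY'), base_url=os.getenv('COMMAND_R_URL'))
res = co.chat_stream(
    model="command-r-plus-08-2024",
    messages=[{"role": "user", "content": "What is an LLM?"}],
)

try:
    for event in res:
        if event and event.type == "content-delta":
            print(event.delta.message.content.text, end="")
except httpx.RemoteProtocolError:
    # Assumption: by the time this fires, every content-delta chunk has
    # already been received and printed, so the output is complete.
    pass
```

This only hides the symptom, of course, and should be removed once the underlying endpoint/SDK issue is fixed.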
**Full stack trace**