
Chatbot 'Stop' Button Causes ResponseAborted Error #5459

Closed
brandonhays3 opened this issue Mar 30, 2025 · 11 comments
Assignees
Labels
ai/ui bug Something isn't working

Comments

@brandonhays3

brandonhays3 commented Mar 30, 2025

Description

It looks like the chatbot should call onFinish with result.finishReason === "stop" when the user stops the chat, but instead, it's throwing a "ResponseAborted" error. This prevents chat messages from being saved correctly since onFinish is not called.

You can reproduce this issue on chat.vercel.ai by requesting a long response and then pressing the stop button. Instead of handling the stop gracefully, it throws an error.

The expected behavior should be:

When the user stops the chat, the streaming process should terminate cleanly.

onFinish should be called with { finishReason: "stop" }, ensuring messages are saved properly.

The app should not throw a "ResponseAborted" error.
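For illustration, the expected "stop" semantics can be sketched with web-standard APIs alone. Everything below, including the `streamWithGracefulStop` and `FinishResult` names, is a hypothetical stand-in for what the SDK might do internally, not actual AI SDK code:

```typescript
// Hypothetical stand-in for the SDK's streaming loop, using only
// web-standard APIs (AbortSignal). On abort, it finishes cleanly with
// finishReason "stop" instead of throwing, so the partial output can
// still be persisted by the onFinish callback.

type FinishResult = { finishReason: "stop"; text: string };

async function streamWithGracefulStop(
  chunks: string[],
  signal: AbortSignal,
  onFinish: (result: FinishResult) => void,
): Promise<string> {
  let text = "";
  for (const chunk of chunks) {
    if (signal.aborted) break; // treat abort as a normal stop, not an error
    text += chunk;
    await new Promise((resolve) => setTimeout(resolve, 0)); // yield between chunks
  }
  onFinish({ finishReason: "stop", text });
  return text;
}
```

The key point is that the abort path and the natural-completion path converge on the same `onFinish` call, so the saving logic runs either way.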

Code example

No response

AI provider

No response

Additional context

No response

@brandonhays3 brandonhays3 added the bug Something isn't working label Mar 30, 2025
@lgrammel
Collaborator

Can you tell me more about your environment and how to reproduce?

@lgrammel lgrammel added the ai/ui label Mar 31, 2025
@tunde-alao

Hi, I'm also experiencing the same issue on my end. I'm using Next.js, simply starting a chat and then stopping it by calling stop() from useChat.

@lgrammel
Collaborator

@tunde-alao which Next.js version are you using? Which @ai-sdk/react version?

@tunde-alao

@lgrammel I'm currently using @ai-sdk/[email protected]

@tunde-alao

I just updated to @ai-sdk/[email protected], and the same issue is still there. This only happens when result.consumeStream(); is present before return result.toDataStreamResponse(). But we need to add that so that onSave is called when the chat streaming is stopped.
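The role consumeStream() plays in this pattern can be mimicked with plain web streams: tee the source so one branch is drained on the server (guaranteeing a finish callback fires) while the other goes to the client. The `serveWithConsume` and `drain` names below are hypothetical stand-ins, not SDK APIs:

```typescript
// Hypothetical stand-in for the consumeStream() pattern. The source
// stream is teed: one branch is read to completion on the server so the
// finish callback always runs, while the other branch is returned to
// the client, which may cancel it early (e.g. on stop or disconnect).

async function drain(
  stream: ReadableStream<string>,
  onFinish: (fullText: string) => void,
): Promise<void> {
  const reader = stream.getReader();
  let text = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    text += value;
  }
  onFinish(text);
}

function serveWithConsume(
  source: ReadableStream<string>,
  onFinish: (fullText: string) => void,
): ReadableStream<string> {
  const [forClient, forServer] = source.tee();
  // Fire-and-forget, analogous to result.consumeStream(): the server
  // branch is consumed regardless of what the client does.
  void drain(forServer, onFinish);
  return forClient;
}
```

Cancelling one branch of a tee does not cancel the underlying source, which is why the server-side drain (and hence the save callback) still completes after the client stops reading.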

@brandonhays3
Author

You can reproduce the issue with Vercel's AI example chatbot: https://vercel.com/templates/next.js/nextjs-ai-chatbot
https://github.com/vercel/ai-chatbot/blob/main/app/(chat)/api/chat/route.ts

(My project is set up similarly, with the latest version installed.)

To reproduce:

  1. Ask the chatbot to "write an essay about nextjs" on https://chat.vercel.ai/
  2. Pause it before it completes
  3. Then ask it a follow up question
  4. You will notice it says "Oops, an error occurred!"
  5. The user cannot send any further messages, and the saving logic doesn't save the assistant's partial response

This is because an error is being thrown, but I assume the library is set up to call onFinish instead, given that onFinish returns a result whose finishReason can take the value "stop".

Ideally, when a user leaves the page or presses the stop button or the stream stops, it should call onFinish.

This worked properly when my project was on version 4.1.21, but it no longer works after I upgraded to the latest version.

@iteratetograceness iteratetograceness self-assigned this Mar 31, 2025
@iteratetograceness
Collaborator

Hi @brandonhays3 + @tunde-alao, would the configuration outlined in the example below reflect what you're aiming to do?

Upon stop, the useChat hook does reset the status to ready and should allow subsequent messages! I'm working towards a repro to better understand, but let me know if the following works for you:

Page: examples/next-openai/app/use-chat-resilient-persistence/[id]/chat.tsx
API Route: examples/next-openai/app/api/use-chat-persistence/route.ts

@brandonhays3
Author

Hey @iteratetograceness,

I looked at your example route. It seems that your route doesn't use an abort signal on the streamText method: https://sdk.vercel.ai/docs/advanced/stopping-streams

When I removed the abort signal from the streamText method, the stop is handled correctly (as a side effect, the LLM call still runs to completion because the abort signal is no longer forwarded, which is why I had the abort signal there in the first place). That suggests the issue only occurs when an abort signal is provided to streamText.

I would assume there is some issue in the stream's abort-signal handling, where it throws an error when the signal fires instead of handling it gracefully.
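The suspected failure mode, and the handling being asked for, can be demonstrated with a plain-TypeScript stand-in: forward the request's signal so the upstream call actually stops, then catch the resulting AbortError and resolve with finishReason "stop" instead of surfacing an error. `fakeProviderCall` and `handleRequest` are hypothetical names, not AI SDK or provider APIs:

```typescript
// Hypothetical stand-in for a provider call that honors a forwarded
// AbortSignal by throwing an AbortError, plus a request handler that
// converts that abort into a clean "stop" result instead of an error.

async function fakeProviderCall(
  chunks: string[],
  signal?: AbortSignal,
): Promise<string> {
  let text = "";
  for (const chunk of chunks) {
    // Without a forwarded signal, this loop runs to completion even
    // after the client aborts -- the side effect described above.
    if (signal?.aborted) throw new DOMException("aborted", "AbortError");
    text += chunk;
    await new Promise((resolve) => setTimeout(resolve, 0));
  }
  return text;
}

async function handleRequest(
  signal: AbortSignal,
): Promise<{ finishReason: "stop"; text: string }> {
  try {
    const text = await fakeProviderCall(["a", "b", "c"], signal);
    return { finishReason: "stop", text };
  } catch (err) {
    if (err instanceof DOMException && err.name === "AbortError") {
      // Desired handling: surface the abort as a normal stop rather
      // than letting the error propagate to the response.
      return { finishReason: "stop", text: "" };
    }
    throw err; // real errors still propagate
  }
}
```

In this sketch, the try/catch is what the reporter expects the library to do internally when the forwarded signal fires.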

@iteratetograceness
Collaborator

Ah good catch! I narrowed down the issue, should be resolved once this ships :)

@iteratetograceness
Collaborator

Following up, @brandonhays3 and @tunde-alao!

While the PR above will address the network error caused by the unhandled abort error when using abortSignal + consumeStream, it doesn't actually address what I believe you're aiming to achieve: aborting the API call to the provider and saving partial responses.

The intent of consumeStream is to allow the response to stream fully and trigger the onFinish (if present); it does not support invoking onFinish if the stream is cut short by an abort signal.

@iteratetograceness
Collaborator

Resolved by #5492
