-
Bug report

Describe the bug: When using Next's API routes, chunks that are written with res.write are never sent to the client.

To Reproduce: Steps to reproduce the behavior, please provide code snippets or a repository:

Expected behavior: The route sends a new event to the connection every second.

Actual behavior: The route doesn't send any data to the connection unless a call to res.end() is made.

System information

Additional context: When using other HTTP frameworks (Express, Koa, etc.), the same route works as expected. I'd hazard a guess that #5855 was caused by the same issue, but was considered unrelated because the underlying cause was obscured. There are also two Spectrum topics about this (here and here) that haven't garnered much attention yet. Supporting WebSockets and SSE in Next API routes may be related, but fixing support for SSE should be a lower barrier than adding support for WebSockets. All of the inner workings are there, we just need to get the plumbing repaired.
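(The original repro snippet wasn't captured in this copy of the thread; a minimal API route in the spirit of the report, with an illustrative file name and payload, would look like this:)

// pages/api/sse.ts (hypothetical reconstruction of the reported setup)
import type { NextApiRequest, NextApiResponse } from 'next';

export default function handler(req: NextApiRequest, res: NextApiResponse) {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  });
  let count = 0;
  const interval = setInterval(() => {
    // Expected: each write reaches the client immediately.
    // Observed: nothing arrives until the response is ended.
    res.write(`data: tick ${count++}\n\n`);
  }, 1000);
  req.on('close', () => clearInterval(interval));
}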
-
I forgot to mention that this works in Micro routes, as well. I'm trying to eliminate the need for my Micro API by moving everything into Next, but this is a blocker for me.
-
You can use a custom server.js to work around this for now:

require('dotenv').config();
const app = require('express')();
const server = require('http').Server(app);
const next = require('next');
const DSN = process.env.DSN || 'postgres://postgres:postgres@localhost/db';
const dev = process.env.NODE_ENV !== 'production';
const nextApp = next({ dev });
const nextHandler = nextApp.getRequestHandler();
nextApp.prepare().then(() => {
app.get('*', (req, res) => {
if (req.url === '/stream') {
res.writeHead(200, {
Connection: 'keep-alive',
'Cache-Control': 'no-cache',
'Content-Type': 'text/event-stream',
});
res.write('data: Processing...\n\n');
setTimeout(() => {
res.write('data: Processing2...\n\n');
}, 10000);
} else {
return nextHandler(req, res);
}
});
require('../websocket/initWebSocketServer')(server, DSN);
const port = 8080;
server.listen(port, err => {
if (err) throw err;
console.log('> Ready on http://localhost:' + port);
});
});

And on the client:

componentDidMount() {
this.source = new EventSource('/stream')
this.source.onmessage = function(e) {
console.log(e)
}
}
-
I would still recommend keeping any server-sent event and WebSocket handlers in separate processes in production. It's very likely that the frequency of updates to those parts of the business logic is quite different: your front-end most likely changes more often than the types of events you handle or need to push to clients from the servers. If you only make changes to one, you probably don't want to restart the processes responsible for the other(s). Better to keep the connections alive rather than cause a flood of reconnections and server restarts for changes which have no effect.
-
@msand The main reason I'm trying to avoid using a custom server is that I'm deploying to Now. Using a custom server would break all of the wonderful serverless functionality I get there. Your second point is fair. What I'm trying to do is create an SSE stream for data that would otherwise be handled with basic polling. The server is already dealing with constant reconnections in that case, so an SSE stream actually results in fewer reconnections. I suppose I could set up a small webserver in the same repo that just uses a separate Now builder. That would allow the processes to remain separate, though it'd still cause all of the SSE connections to abort and reconnect when there are any changes to the project. Even with those points, I can see plenty of scenarios in which it makes sense to be able to run an SSE endpoint from one of Next's API routes. Additionally, in the docs it's specifically stated that...
Since it's specifically stated that
-
@trezy It seems the issue is that the middleware adds gzip encoding, which the browser has negotiated via the Accept-Encoding request header. If you add 'Content-Encoding': 'none' to the response headers, then it seems to work:

res.writeHead(200, {
Connection: 'keep-alive',
'Content-Encoding': 'none',
'Cache-Control': 'no-cache',
'Content-Type': 'text/event-stream',
});
-
Alternatively, gzip your content.
-
Oh, that's super interesting! I'll give that a shot and report back. In the meantime, it'd still be nice for this quirk (and any similar ones) to be noted somewhere in the docs.
-
Yeah, it's more a consequence of having some helpers. It would be nice to have a mode which turns all of them off and leaves you with a plain req/res pair.
-
Actually, this seems to be documented here: https://github.com/expressjs/compression#server-sent-events. You have to call res.flush() when you think there's enough data for the compression to work efficiently.

export default (req, res) => {
res.writeHead(200, {
'Cache-Control': 'no-cache',
'Content-Type': 'text/event-stream',
});
res.write('data: Processing...\n\n');
/* https://github.com/expressjs/compression#server-sent-events
Because of the nature of compression this module does not work out of the box with
server-sent events. To compress content, a window of the output needs to be
buffered up in order to get good compression. Typically when using server-sent
events, there are certain block of data that need to reach the client.
You can achieve this by calling res.flush() when you need the data written to
actually make it to the client.
*/
res.flush();
setTimeout(() => {
res.write('data: Processing2...\n\n');
res.flush();
}, 1000);
};
-
It then applies gzip compression for you.
-
I have switched to using a custom Express server. That's the only way I could get it to work. I guess that's cool, since I can do more with Express. Before deciding to integrate Express, I had tried the things mentioned above; none worked.

1. Turned off gzip compression by setting the option in next.config.js. The behavior remained the same. I inspected the headers on the client (using Postman) and confirmed the gzip encoding was removed, but that didn't seem to fix the problem.
2. Calling res.flush had no effect either. Instead I get a warning in the console that flush is deprecated and to use flushHeaders instead. But that's not what I want.

This is a rather strange bug… 😔
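For reference, the next.config.js option referred to in point 1 is Next's documented compress flag:

// next.config.js
module.exports = {
  // Disable the gzip compression built into the Next.js server
  compress: false,
};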
-
I have been trying to get SSE to work in Next.js, but could not get it working. With a custom server and the native Node httpServer req/res it works, but with the Next.js res, no messages are sent to the client.
-
Hey @kavuri. It is possible to integrate a custom Node.js server (e.g. using Express) with your Next.js app. That way, you can still get server-side rendering without these Next.js limitations. See this page of the official documentation for details: https://nextjs.org/docs/advanced-features/custom-server Also, check out how I implemented this in my own app, which I mentioned in the comment above yours: https://github.com/uxFeranmi/react-woocommerce/blob/master/server.js
-
@uxFeranmi I could use the custom server method as mentioned here https://nextjs.org/docs/advanced-features/custom-server to write messages with res.write(...). But in the Next app, I do not see any messages on my page.

index.js

My custom server.js

I am not getting any message on the index page. But if I open the URL
-
I have created a small test to trigger the event. Basically, just create a file in the project root directory (say just
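(The rest of that comment was cut off. The idea appears to be a file-change trigger; a sketch of such a test, with the file name and event payload as illustrative assumptions:)

// Sketch: push an SSE event whenever ./trigger.txt in the project root changes
import fs from 'fs';
import type { ServerResponse } from 'http';

function streamFileChanges(res: ServerResponse) {
  res.writeHead(200, {
    'Content-Type': 'text/event-stream',
    'Cache-Control': 'no-cache',
    Connection: 'keep-alive',
  });
  const watcher = fs.watch('./trigger.txt', () => {
    res.write(`data: ${JSON.stringify({ changedAt: Date.now() })}\n\n`);
  });
  res.on('close', () => watcher.close());
}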
-
After many, many hours, I've finally gotten this to work on my setup. Hopefully this can help someone 🎉 I'm using the App Router along with an API route.ts that "proxies" SSE from my actual backend server through the Next.js server. The only issue I ended up having is that when the UI refreshes, the server does not know that it should close the current EventSource and move to a new one. The following solution solves this issue; the code belongs inside the route handler.
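(The snippet itself wasn't captured here. A minimal sketch of that kind of proxy route, assuming a backend SSE endpoint at an illustrative BACKEND_URL; forwarding the request's abort signal is what lets the server drop the upstream connection when the UI refreshes:)

// app/api/sse/route.ts (hypothetical proxy for an upstream SSE endpoint)
export const dynamic = 'force-dynamic';

export async function GET(request: Request) {
  // request.signal aborts the upstream fetch when the browser refreshes
  // or navigates away, closing the stale EventSource on the server side.
  const upstream = await fetch(`${process.env.BACKEND_URL}/events`, {
    headers: { Accept: 'text/event-stream' },
    signal: request.signal,
  });

  return new Response(upstream.body, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache, no-transform',
      Connection: 'keep-alive',
    },
  });
}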
-
This SSE works in local development, but when I publish to Vercel it delivers the whole response only after the OpenAI stream is completed. Can someone please help me? This code is developed in Next.js.

backend code:

front-end code:
-
Trying to implement something where I send the status of the server to the client as it works through logic. I'm using the App Router. My problem is that I have to use all these nested `.then()` statements to get it working. Using await anywhere either doesn't stream any response with the connection open, or it sends the full stream after everything is processed instead of in chunks. Does anyone know what I am doing wrong? Here is my page.tsx:

'use client'
import { Button } from '@/components/ui/button'
import EventSource from 'eventsource'
export default function Stream() {
function SSE() {
// Create a new EventSource instance
const eventSource = new EventSource('http://localhost:3000/api/stream')
// Handle an open event
eventSource.onopen = (e) => {
console.log('Connection to server opened')
}
// Handle a message event
eventSource.onmessage = (e) => {
const data = JSON.parse(e.data)
console.log('New message from server:', data)
}
// Handle an error event (or close)
eventSource.onerror = (e) => {
console.log('EventSource closed:', e)
eventSource.close() // Close the connection if an error occurs
}
// Cleanup function
return () => {
eventSource.close()
}
}
return (
<div>
<h1>Server-Sent Events (SSE) Demo</h1>
<Button onClick={SSE}>Initiate</Button>
</div>
)
}

Here is my route.ts:

import { NextRequest } from 'next/server'
export const runtime = 'nodejs'
// This is required to enable streaming
export const dynamic = 'force-dynamic'
export async function GET(request: NextRequest) {
let responseStream = new TransformStream()
const writer = responseStream.writable.getWriter()
const encoder = new TextEncoder()
// Close if client disconnects
request.signal.onabort = () => {
console.log('closing writer')
writer.close()
}
// Function to send data to the client
function sendData(data: any) {
const formattedData = `data: ${JSON.stringify(data)}\n\n`
writer.write(encoder.encode(formattedData))
}
// Initial Progress
sendData({ progress: '0%' })
// 50% done
const Note = getNote()
Note.then((Note) => {
sendData({ progress: '50%' })
// 100% done
const Todo = getTodo()
Todo.then((Todo) => {
sendData({ progress: '100%' })
// close writer
writer.close()
})
})
return new Response(responseStream.readable, {
headers: {
'Content-Type': 'text/event-stream',
Connection: 'keep-alive',
'Cache-Control': 'no-cache, no-transform'
}
})
}
async function getNote() {
await delay(3000)
return 'I am a Note'
}
async function getTodo() {
await delay(3000)
return 'I am a Todo'
}
function delay(ms: number) {
return new Promise((resolve) => setTimeout(resolve, ms))
}
-
You should return a response first to establish the EventSource connection; then you can send messages over that connection. It's a simple SSE controller below.
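(The controller snippet itself wasn't captured here; a sketch of that pattern applied to the progress example above, with illustrative names and delays:)

// app/api/stream/route.ts
export const dynamic = 'force-dynamic';

export async function GET() {
  const { readable, writable } = new TransformStream();
  const writer = writable.getWriter();
  const encoder = new TextEncoder();

  const send = (data: unknown) =>
    writer.write(encoder.encode(`data: ${JSON.stringify(data)}\n\n`));

  // Kick off the long-running work WITHOUT awaiting it, so the Response
  // below is returned first and the EventSource connection is established.
  (async () => {
    try {
      send({ progress: '0%' });
      await new Promise((r) => setTimeout(r, 3000)); // stand-in for getNote()
      send({ progress: '50%' });
      await new Promise((r) => setTimeout(r, 3000)); // stand-in for getTodo()
      send({ progress: '100%' });
    } finally {
      writer.close().catch(() => {});
    }
  })();

  return new Response(readable, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache, no-transform',
      Connection: 'keep-alive',
    },
  });
}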
-
Maybe someone finds it helpful and relevant. I wasn't able to get this thing to work based on the above. My case was that I had an endpoint sending event-stream-formatted data, which I had to proxy through a route handler to the client. (This way I could get rid of the NEXT_PUBLIC_ prefix, which was giving me a headache on the pipeline and in production.)

This is my "proxy" route:

Also, middleware.ts is needed in src/, or in the root if not using src/app but just app/.
-
Hi all! I know this is quite an old thread, but anyway: I got SSE working on Next 14 (App Router), but when I build and start the app, the server kinda hangs with a warning. Here is my route.ts, if anyone has any suggestions I would take them :)
-
OK, I made this work using Upstash Redis as an intermediary and it works great. A few things I missed initially:
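(The list itself was lost in this capture. For context, the general shape of the intermediary pattern is sketched below, using a plain ioredis client rather than Upstash's SDK and an illustrative channel name: the GET handler subscribes and streams messages out, while the webhook publishes, so no in-process state is shared between requests.)

// app/api/events/[id]/route.ts (sketch: Redis pub/sub as the intermediary)
import Redis from 'ioredis';

export const dynamic = 'force-dynamic';

export async function GET(
  request: Request,
  { params }: { params: { id: string } }
) {
  const subscriber = new Redis(process.env.REDIS_URL!);
  await subscriber.subscribe(`events:${params.id}`);

  const encoder = new TextEncoder();
  const stream = new ReadableStream({
    start(controller) {
      // Forward every published message to the SSE client.
      subscriber.on('message', (_channel, message) => {
        controller.enqueue(encoder.encode(`data: ${message}\n\n`));
      });
      // Tear down the Redis subscription when the client disconnects.
      request.signal.addEventListener('abort', () => {
        subscriber.quit();
        controller.close();
      });
    },
  });

  return new Response(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache, no-transform',
      Connection: 'keep-alive',
    },
  });
}

// In the webhook route, publish instead of writing to in-memory state:
// await redis.publish(`events:${id}`, JSON.stringify(payload));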
-
Hi everyone, here is some demo code; it works, I tried it. It seems much smoother using SSE compared with the Vercel AI package's StreamableValue based on RSC. Here is the code for app/api/chat/route.ts (for the client, see features/chat-bot/hooks/use-sse-message.tsx):

import { getOpenaiClient } from '@/features/chat-bot/utils/openai-client'
import { logger } from '@/lib/shared'
export const runtime = 'edge'
// Prevents this route's response from being cached
export const dynamic = 'force-dynamic'
type RequestData = {
currentModel: string
// message: { context: string; role: 'user' | 'assistant' }[]
message: any
}
export async function POST(request: Request) {
const { message } = (await request.json()) as RequestData
logger.trace({ message }, 'post message')
if (!message || !Array.isArray(message)) {
return new Response('No message in the request', { status: 400 })
}
try {
const openai = getOpenaiClient()
const completionStream = await openai.chat.completions.create({
model: 'gpt-3.5-turbo',
messages: [
{
role: 'system',
content: 'You are a smart AI bot and you can answer anything!',
},
...message,
],
max_tokens: 4096,
stream: true,
})
const responseStream = new ReadableStream({
async start(controller) {
const encoder = new TextEncoder()
for await (const part of completionStream) {
const text = part.choices[0]?.delta.content ?? ''
// todo add event
const chunk = encoder.encode(`data: ${text}\n\n`)
controller.enqueue(chunk)
}
controller.close()
},
})
return new Response(responseStream, {
headers: {
'Content-Type': 'text/event-stream; charset=utf-8',
Connection: 'keep-alive',
'Cache-Control': 'no-cache, no-transform',
},
})
} catch (error) {
console.error('An error occurred during OpenAI request', error)
return new Response('An error occurred during OpenAI request', {
status: 500,
})
}
}
-
I don't know if anybody is still struggling with this, but I had the same sort of issues streaming audio from OpenAI text-to-voice (and I assume it would be an issue with all streaming audio/video). What I ended up doing was setting up a sort of virtual stream; in my case, it made sense to do some of it on the client side because of routing issues. I used a regex to split a long article (maybe 50k characters) into chunks of roughly 1000 characters (more or less, because I don't want to break words or sentences up) on the client side, then sent those to the server continuously, which converts them to audio and sends them back to the client in chunks. The client starts to play the audio once the first chunk is received and compiles the rest of the chunks in the background. There is some lag, but that's just due to the lag in the OpenAI API.
-
I've been working a bit with this recently and wrote a blog post on how to stream JSON and text in a single request (e.g. for RAG applications where you want to include sources along with your generated answer). Hopefully it can be of help:
-
Hey guys, I'm trying to update a UI component with the result of a webhook. That is when sh.. hits the fan: every time I fire the webhook, the handler can't find the stream writer for the client. The only time it works is when the data is sent from within the original GET request. And I know for sure that no abort signal was triggered (due to the numerous console.logs I've added everywhere), and I can see in the network tab in developer tools that the connection is still open. UPDATE: Below are the 3 files involved in the process (thanks @ajayvignesh01 and @StephenGrider for getting me started):
import { useEffect, useState } from 'react';
import { useParams } from 'next/navigation';
const RealtimeUpdates = () => {
const { id } = useParams();
const [events, setEvents] = useState<any[]>([]);
useEffect(() => {
const eventSource = new EventSource(
`http://localhost:3000/api/events/${id}`
);
eventSource.onmessage = function (event) {
const newEvent = JSON.parse(event.data);
setEvents((prevEvents: any[]) => [...prevEvents, newEvent]);
};
// Handle an error event (or close)
eventSource.onerror = (e) => {
eventSource.close(); // Close the connection if an error occurs
};
return () => {
eventSource.close();
};
}, [id]);
return (
<div>
<h1>Events</h1>
<ul>
{events.map((event, index) => (
<li key={index}>{JSON.stringify(event)}</li>
))}
</ul>
</div>
);
};
export default RealtimeUpdates;
// app/api/events/[id]/route.ts (the SSE route handler)
import { NextRequest } from 'next/server';
export const runtime = 'edge';
export const dynamic = 'force-dynamic';
const writers: Record<string, WritableStreamDefaultWriter> = {};
export function sendData(cmpID: string, data: any) {
const encoder = new TextEncoder();
const writer = writers[cmpID];
if (!writer) {
console.error('Writer is not initialized for client:', cmpID);
return;
}
const formattedData = `data: ${JSON.stringify(data)}\n\n`;
writer.write(encoder.encode(formattedData));
}
export async function GET(request: NextRequest, context: any) {
const cmpID = context.params.id.toString();
let responseStream = new TransformStream();
writers[cmpID] = responseStream.writable.getWriter();
// Close if client disconnects
request.signal.onabort = () => {
writers[cmpID]?.close();
delete writers[cmpID];
};
// Send random Data
sendData(cmpID, { progress: '0%' }); // <-- this works
return new Response(responseStream.readable, {
headers: {
'Content-Type': 'text/event-stream',
Connection: 'keep-alive',
'Cache-Control': 'no-cache, no-transform',
},
});
}
// Webhook route (path not shown in the original)
import { sendData } from '../../events/[id]/route';
...
export async function POST(req: NextRequest) {
// the webhook hits this
sendData(cmpID, createdFiles); // <-- this doesn't work because it can't find the cmpID in the writers object
...
}

Maybe this is not at all the right approach for these types of needs, but I feel like this would be an amazing thing to solve and would be super helpful to a lot of people. Thanks, -- Latif
-
@latifs The thought process for resolving this issue is fundamentally sound, but it can be divided into two distinct scenarios:
In your research, everything except for the webhook part seems promising for implementing the first scenario. However, for the second scenario, we need to target a specific client and update their UI based on their unique context. Let's review the implementation to understand how to address the encountered issue.

Implementation of SSE based on a Webhook

Let's assume an external 3rd-party service triggers our POST request endpoint, and we don't have access to its headers. This is mentioned because SSE can also be implemented in a POST request if we control the headers; for more reference, check out this blog post. Now, coming to the implementation when using webhooks (externally triggered POST requests):

Step 1: the client component.

'use client'
import { useEffect, useState } from 'react'
const DeploymentStatus: React.FC = () => {
const [status, setStatus] = useState('loading')
useEffect(() => {
const eventSource = new EventSource(`/api/sse/${1}`) //configure this based on your user case, for demo purpose I'm using static value
eventSource.onmessage = event => {
const data = event.data && JSON.parse(event?.data)
if (data.success) {
setStatus('success')
eventSource.close()
}
}
eventSource.onerror = () => {
eventSource.close()
}
return () => {
eventSource.close()
}
}, [])
return (
<div>
{status === 'loading' ? (
<div>Loading...</div>
) : (
<div>Deployment Successful!</div>
)}
</div>
)
}
export default DeploymentStatus

Step 2: the SSE route handler (e.g. app/api/sse/[id]/route.ts).

import { NextRequest } from 'next/server'
import { addClient, removeClient } from '@/lib/clients'
export async function GET(req: NextRequest, context: any) {
const id = context.params.id.toString()
const { readable, writable } = new TransformStream()
const writer = writable.getWriter()
const encoder = new TextEncoder()
writer.write(encoder.encode('data: \n\n'))
const clientId = id
// for now, we are saving the details in RAM, here we can replace with REDIS if needed
// with this addClient function, now we have access to the original writer in our RAM
// For more info, you can look into @/lib/client.ts file
addClient(clientId, {
write: (message: string) => writer.write(encoder.encode(message)),
end: () => writer.close(),
})
req.signal.addEventListener('abort', () => {
removeClient(clientId)
writer.close()
})
return new Response(readable, {
headers: {
'Content-Type': 'text/event-stream',
'Cache-Control': 'no-cache',
Connection: 'keep-alive',
},
})
}

Step 3: the client registry (@/lib/clients).

type Client = {
id: string
res: any
}
const clients: Client[] = []
// the res object here is the writer, so now we can use it in our webhook
// look at the sendMessageToClient function below
export function addClient(id: string, res: any): void {
clients.push({ id, res })
}
export function removeClient(id: string): void {
const index = clients.findIndex(client => client.id === id)
if (index !== -1) {
clients.splice(index, 1)
}
}
export function sendMessageToClient(id: string, message: string): void {
const client = clients.find(client => client.id === id)
if (client) {
client.res.write(`data: ${message}\n\n`)
}
}
export function getClientsCount(): number {
return clients.length
}

Step 4: the webhook route.

import { NextRequest, NextResponse } from 'next/server'
import { sendMessageToClient } from '@/lib/clients'
export async function POST(req: NextRequest) {
const event = await req.json()
if (event.type === 'deployment.success') {
// Assuming event contains client ID
const clientId = event.clientId
sendMessageToClient(clientId, JSON.stringify({ success: true }))
}
return NextResponse.json({ received: true })
}

To check, trigger the webhook with a payload like:
{
"type": "deployment.success",
"clientId": "1"
}
Summary

The main component that facilitates broadcasting messages from the webhook is the shared clients registry (@/lib/clients), which keeps a reference to each open connection's writer so the webhook route can target a specific client.
-
SSE has to be dealt with entirely within the same request on Vercel, as far as I can tell. I can't access the writer of the original GET request through a singleton of client handlers! The singleton gets destroyed on subsequent requests... So I probably need to rethink the way I am approaching this!
-
Using the approach suggested by @leerob and others here, we're encountering issues with cache revalidation. It worked fine until Next 14.2.11, but the changes in that release have broken revalidation in callbacks. An abstract example of the pattern:

export const GET = async () => {
const handleStreamEnded = async () => {
await mutateData(); // here we mutate data which is cached via unstable_cache() here
revalidateTag(...); // <-- this seems to be ignored as of Next 14.2.11
};
... // here is the streaming implementation, omitted for brevity, which calls handleStreamEnded()
return new Response(responseStream, { headers: { ... } });
};

Cache revalidation still works in the main body of the GET handler. Can anyone speak to whether this is a bug and this usage of cache revalidation is supposed to be supported, or whether it was a coincidence that this worked until Next 14.2.11 and it shouldn't be used this way? And if the latter, how can cache revalidation be achieved when streaming in a route handler?
-
I found the easiest way is to use Pusher (or Ably, or other alternatives); I couldn't get anything else within Vercel to work consistently.
For those stumbling onto this through Google, this is working as of Next.js 13 + Route Handlers:
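(The snippet itself was cut off in this capture; a minimal Route Handler in that spirit, with an illustrative payload:)

// app/api/stream/route.ts
export const dynamic = 'force-dynamic';

export async function GET() {
  const encoder = new TextEncoder();

  const stream = new ReadableStream({
    async start(controller) {
      // Emit one SSE message per second, then end the stream.
      for (let i = 1; i <= 5; i++) {
        controller.enqueue(encoder.encode(`data: message ${i}\n\n`));
        await new Promise((resolve) => setTimeout(resolve, 1000));
      }
      controller.close();
    },
  });

  return new Response(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache, no-transform',
      Connection: 'keep-alive',
    },
  });
}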