Using the "pg" package 8.5.1
To replicate the issue, save some large data as a blob into the database. In my case 340 MB was enough to trigger the bug. Then try to read back the data.
It will throw this uncatchable error that crashes the application:
Error: Cannot create a string longer than 0x1fffffe8 characters
at Buffer.utf8Slice (<anonymous>)
at Object.slice (node:buffer:593:37)
at Buffer.toString (node:buffer:811:14)
at BufferReader.string (/home/joplin/packages/server/node_modules/pg-protocol/src/buffer-reader.ts:35:32)
at Parser.parseDataRowMessage (/home/joplin/packages/server/node_modules/pg-protocol/src/parser.ts:274:51)
at Parser.handlePacket (/home/joplin/packages/server/node_modules/pg-protocol/src/parser.ts:172:21)
at Parser.parse (/home/joplin/packages/server/node_modules/pg-protocol/src/parser.ts:101:30)
at Socket.<anonymous> (/home/joplin/packages/server/node_modules/pg-protocol/src/index.ts:7:48)
at Socket.emit (node:events:390:28)
at addChunk (node:internal/streams/readable:315:12)
As a test, I've tried to make it reject the promise when parser.parse(buffer, callback) throws an error in this function. At that point I can indeed catch the error, but rejecting the promise properly doesn't help for some reason and the error is still uncatchable:
export function parse(stream: NodeJS.ReadableStream, callback: MessageCallback): Promise<void> {
Any idea what might be the issue and how to fix it?
laurent22 commented on Nov 14, 2021
I think it may be related to this TODO here, as it tries to read a binary blob as UTF-8:
node-postgres/packages/pg-protocol/src/buffer-reader.ts
Line 6 in 947ccee
And somewhat related: why does it try to read all fields as strings, even when the field is a binary blob?
node-postgres/packages/pg-protocol/src/parser.ts
Line 288 in 947ccee
sehrope commented on Nov 14, 2021
The PostgreSQL wire protocol has two modes for transferring fields: text and binary.
The text mode is a string representation of each data type, such as "1234" to represent the number 1234 or "t" to represent the value true. The binary mode is more compact, but it is not documented anywhere outside of the server source.
This driver only supports reading the text mode and that class only handles text format responses:
node-postgres/packages/pg-protocol/src/parser.ts
Line 87 in 947ccee
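As a rough illustration of the text format described above (the converter names here are invented, not pg's actual internals): every field arrives as a string, and type-specific parsers turn it into a JS value. Note that bytea values arrive hex-encoded in text mode, so a large blob becomes an even larger string before decoding:

```javascript
// Illustrative only: simplified versions of the kind of text-format
// converters a driver applies. Names are hypothetical, not pg's API.
const parseInt4 = (text) => parseInt(text, 10);
const parseBool = (text) => text === 't'; // server sends 't' / 'f'
const parseBytea = (text) => Buffer.from(text.slice(2), 'hex'); // strips leading '\x'

console.log(parseInt4('1234'));                      // 1234
console.log(parseBool('t'));                         // true
console.log(parseBytea('\\x68656c6c6f').toString()); // 'hello'
```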
This is still a bug though, as throwing uncatchable internal errors is never acceptable. If those long values cannot be deserialized as a string, then there should be a length check and a proper error message bubbled up rather than a crash.
Even when this is fixed, you likely don't want to ship that much data back and forth in one piece. There are many other options, including reading slices of the data as smaller bytea chunks or using the large object API (https://www.postgresql.org/docs/current/lo-funcs.html).
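A sketch of the slicing approach (the table and column names are invented; substring() on bytea is standard PostgreSQL):

```javascript
// Hypothetical example: fetch a large bytea column in bounded chunks so no
// single row value approaches V8's string limit. 'blobs'/'content' are made up.
const CHUNK_BYTES = 16 * 1024 * 1024; // 16 MiB per round trip

async function readBlobInChunks(client, id) {
  const parts = [];
  // PostgreSQL substring() positions are 1-based.
  for (let offset = 1; ; offset += CHUNK_BYTES) {
    const { rows } = await client.query(
      'SELECT substring(content FROM $2 FOR $3) AS part FROM blobs WHERE id = $1',
      [id, offset, CHUNK_BYTES]
    );
    const part = rows[0] && rows[0].part;
    if (!part || part.length === 0) break; // past the end of the value
    parts.push(part);
    if (part.length < CHUNK_BYTES) break;  // final, short chunk
  }
  return Buffer.concat(parts);
}
```

Each chunk still travels in text mode, but 16 MiB per round trip stays far below the string cap that triggers the crash.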
brianc commented on Nov 14, 2021
Server: Prevent large data blobs from crashing the application
Joyjk commented on Jan 31, 2022
laurent22 commented on Jan 31, 2022
xqin1 commented on Oct 11, 2023
Any updates on this bug fix? With the latest version there is still an uncatchable error with large query results. Thanks.
alxndrsn commented on Sep 24, 2024
Just ran into this with v8.8.0. Hopefully there's a way to avoid converting big (> 250 MB) binary fields into JavaScript strings.
fix(parser): handle exceptions within handlePacket
haalogen commented on Jul 4, 2025
Hello! Is there a fix or any workaround for this error?
laurent22 commented on Jul 4, 2025
It looks like someone fixed it on their fork: https://github.com/supabase/node-postgres/pull/10/files
Is it not possible to apply the same fix here?
charmander commented on Jul 4, 2025
That’s #3409.