
pg-protocol can throw an uncatchable error "Cannot create a string longer than 0x1fffffe8 characters" when reading large data #2653


Description


Using the "pg" package 8.5.1

To replicate the issue, save some large data as a blob into the database. In my case 340 MB was enough to trigger the bug. Then try to read back the data.

It will throw this uncatchable error that crashes the application:

Error: Cannot create a string longer than 0x1fffffe8 characters
    at Buffer.utf8Slice (<anonymous>)
    at Object.slice (node:buffer:593:37)
    at Buffer.toString (node:buffer:811:14)
    at BufferReader.string (/home/joplin/packages/server/node_modules/pg-protocol/src/buffer-reader.ts:35:32)
    at Parser.parseDataRowMessage (/home/joplin/packages/server/node_modules/pg-protocol/src/parser.ts:274:51)
    at Parser.handlePacket (/home/joplin/packages/server/node_modules/pg-protocol/src/parser.ts:172:21)
    at Parser.parse (/home/joplin/packages/server/node_modules/pg-protocol/src/parser.ts:101:30)
    at Socket.<anonymous> (/home/joplin/packages/server/node_modules/pg-protocol/src/index.ts:7:48)
    at Socket.emit (node:events:390:28)
    at addChunk (node:internal/streams/readable:315:12)

As a test I've tried to make it reject the promise when parser.parse(buffer, callback) throws an error in this function. I can indeed catch the error at that point, but rejecting the promise properly doesn't help for some reason and the error is still uncatchable:

export function parse(stream: NodeJS.ReadableStream, callback: MessageCallback): Promise<void> {

Any idea what might be the issue and how to fix it?

Activity

laurent22 (Author) commented on Nov 14, 2021

I think it may be related to this TODO here, as it tries to read a binary blob as UTF-8:

// TODO(bmc): support non-utf8 encoding?

And somewhat related: why does it try to read all fields as strings, even when the value is a binary blob?

fields[i] = len === -1 ? null : this.reader.string(len)
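For context, here is a minimal sketch (an assumed shape for illustration, not pg-protocol's actual source) of how a text-format DataRow field is read: a 4-byte big-endian length prefix followed by that many bytes, decoded as a UTF-8 string regardless of the column's type. With a ~340 MB bytea value, the toString() call is what exceeds V8's maximum string length and throws.

```javascript
// Sketch of reading one length-prefixed text-format field.
// readTextField is a hypothetical helper for illustration.
function readTextField(buf, offset) {
  const len = buf.readInt32BE(offset); // -1 signals SQL NULL
  if (len === -1) {
    return { value: null, next: offset + 4 };
  }
  // For very large fields, this UTF-8 decode is the call that throws
  // "Cannot create a string longer than 0x1fffffe8 characters".
  const value = buf.toString('utf8', offset + 4, offset + 4 + len);
  return { value, next: offset + 4 + len };
}

// Build a fake field: 4-byte length prefix followed by the bytes of "hello".
const payload = Buffer.from('hello', 'utf8');
const field = Buffer.alloc(4 + payload.length);
field.writeInt32BE(payload.length, 0);
payload.copy(field, 4);

console.log(readTextField(field, 0).value); // → hello
```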

sehrope (Contributor) commented on Nov 14, 2021

And somewhat related: why does it try to read all fields as strings, even when the value is a binary blob?

The PostgreSQL wire protocol has two modes for transferring fields: text and binary.

The text mode is a string representation of each data type, such as "1234" to represent the number 1234 or "t" to represent the value true. The binary mode is more compact but is not documented anywhere outside of the server source.
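Illustratively (a sketch for this explanation, not driver code), the same int4 value in the two formats looks like this:

```javascript
// The value 1234 as it would travel in each wire format.
const textForm = Buffer.from('1234', 'utf8'); // text mode: 4 ASCII digit bytes
const binaryForm = Buffer.alloc(4);
binaryForm.writeInt32BE(1234, 0);             // binary mode: raw big-endian int4

console.log(parseInt(textForm.toString('utf8'), 10)); // → 1234
console.log(binaryForm.readInt32BE(0));               // → 1234
```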

This driver only supports reading the text mode and that class only handles text format responses:

throw new Error('Binary mode not supported yet')

Any idea what might be the issue and how to fix it?

This is still a bug though, as throwing uncatchable internal errors is never acceptable. If those long values cannot be deserialized as a string, there should be a length check and a proper error message bubbled up rather than a crash.

Even when this is fixed, you likely don't want to ship that much data back and forth in one piece. There are many other options, including reading slices of the data by splitting it into smaller bytea chunks, or using the large object API (https://www.postgresql.org/docs/current/lo-funcs.html).
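The chunked approach can be sketched as follows; the helper planChunks and the table/column names in the SQL comment are hypothetical, but substring(... FROM ... FOR ...) over a bytea is standard PostgreSQL:

```javascript
// Plan 1-based (FROM, FOR) slices so a large bytea can be fetched with
// repeated queries like:
//   SELECT substring(data FROM $1 FOR $2) FROM blobs WHERE id = $3
// instead of pulling the whole value back in one oversized row.
function planChunks(totalBytes, chunkBytes) {
  const chunks = [];
  for (let from = 1; from <= totalBytes; from += chunkBytes) {
    chunks.push({ from, for: Math.min(chunkBytes, totalBytes - from + 1) });
  }
  return chunks;
}

console.log(planChunks(10, 4));
// → [ { from: 1, for: 4 }, { from: 5, for: 4 }, { from: 9, for: 2 } ]
```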

brianc (Owner) commented on Nov 14, 2021
Joyjk commented on Jan 31, 2022
laurent22 (Author) commented on Jan 31, 2022
xqin1 commented on Oct 11, 2023

Any updates on this bug fix? With the latest version there is still an uncatchable error with large query results. Thanks.

alxndrsn (Contributor) commented on Sep 24, 2024

Just ran into this with v8.8.0. Hopefully there's a way to avoid converting big (> 250 MB) binary fields into JavaScript strings.

2 commits reference this issue (added Mar 27, 2025): 971ce68, c2a168f
haalogen commented on Jul 4, 2025

Hello! Is there a fix or any workaround for this error?

laurent22 (Author) commented on Jul 4, 2025

It looks like someone fixed it on their fork: https://github.com/supabase/node-postgres/pull/10/files

Is it not possible to apply the same fix here?

charmander (Collaborator) commented on Jul 4, 2025

It looks like someone fixed it on their fork: https://github.com/supabase/node-postgres/pull/10/files

Is it not possible to apply the same fix here?

That’s #3409.
