
AWS non-retryable streaming request error #6770

Open
3 of 4 tasks
gauthampkrishnan opened this issue Jan 2, 2025 · 8 comments
Assignees
Labels
bug This issue is a bug. p3 This is a minor priority issue

Comments

@gauthampkrishnan

Checkboxes for prior research

Describe the bug

I am trying to upload a file using putObject with a createReadStream as the Body and retryMode set to standard, but on retry it throws the error "non-retryable streaming request error".

Regression Issue

  • Select this option if this issue appears to be a regression.

SDK version number

@aws-sdk/package-name@version, ...

Which JavaScript Runtime is this issue in?

Node.js

Details of the browser/Node.js/ReactNative version

18.20.4

Reproduction Steps

Try to upload a file using a stream as the Body, and replicate a failure while the file is uploading.

Observed Behavior

non-retryable streaming request error

Expected Behavior

I think it shouldn't throw an error; it should retry.

Possible Solution

No response

Additional Information/Context

No response

@gauthampkrishnan gauthampkrishnan added bug This issue is a bug. needs-triage This issue or PR still needs to be triaged. labels Jan 2, 2025
@kuhe
Contributor

kuhe commented Jan 8, 2025

That is normal. The request reads the stream but cannot rewind it if the attempt fails.

@gauthampkrishnan
Author

gauthampkrishnan commented Jan 8, 2025

So what is the solution for retrying when using a stream as the Body? Is there a solution for this scenario?

@gauthampkrishnan
Author

gauthampkrishnan commented Jan 8, 2025

Would using the Upload() method from lib-storage help in this case? https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/Package/-aws-sdk-lib-storage/ I am guessing it would buffer the parts, and the buffered parts would be retried according to the client retry config?

@kuhe
Contributor

kuhe commented Jan 9, 2025

You can buffer the stream, or:

let attempts = 3;
let stream = ... ;
while (--attempts) {
  try {
    await s3.putObject({ Body: stream });
    break;
  } catch (e) {
    stream = ... ; // rewind/reacquire
  }
}

@aBurmeseDev aBurmeseDev self-assigned this Jan 9, 2025
@aBurmeseDev aBurmeseDev added p3 This is a minor priority issue and removed needs-triage This issue or PR still needs to be triaged. labels Jan 9, 2025
@aBurmeseDev
Member

@gauthampkrishnan - Can you share the code you are working on so that we can better assist you? Here are the docs on SDK retry strategies and behavior for your reference: https://github.com/aws/aws-sdk-js-v3/blob/main/supplemental-docs/CLIENTS.md#retry-strategy-retrystrategy-retrymode-maxattempts

@aBurmeseDev aBurmeseDev added the response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. label Jan 9, 2025
@gauthampkrishnan
Author

gauthampkrishnan commented Jan 9, 2025

@aBurmeseDev the code is simple, example

const stream = createReadStream(filePath);
try {
  await s3.putObject({ Body: stream });
} catch (e) {
  console.log(e);
}

The s3 client uses NodeHttpHandler with retry settings.

I am using NodeHttpHandler with maxRetries, and it does retry when the Body is a buffer, but with a stream it throws the error. Based on the conversation above, that is expected, and I would need to implement a custom retry wrapper with an error handler when the Body is a stream. I think the Upload() function from lib-storage does the same thing under the hood: it reads the stream and creates buffered parts, and I believe the buffered parts are retried on failure. So I will use that, and this issue can probably be closed.

@github-actions github-actions bot removed the response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. label Jan 10, 2025
@aBurmeseDev
Member

By definition, Upload() allows efficient uploading of buffers, blobs, or streams, using a configurable amount of concurrency to perform multipart uploads where possible. This enables uploading large files or streams of unknown size thanks to the multipart uploads used under the hood. Here are some code examples for Upload(), if you find them helpful: https://github.com/aws/aws-sdk-js-v3/tree/main/lib/lib-storage/example-code

Let us know if you need further support.

@aBurmeseDev aBurmeseDev added the response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. label Jan 13, 2025
@gauthampkrishnan
Author

My remaining doubt: if I pass a retry config with maxRetries to the S3 client and use Upload() (https://github.com/aws/aws-sdk-js-v3/tree/main/lib/lib-storage/example-code), which uses multipart upload under the hood, will it retry individual parts if they fail mid-upload? For example, if I use a stream with Upload(), it will divide the data into parts and upload them to the bucket; if one of those parts fails, will it retry uploading that part?

@github-actions github-actions bot removed the response-requested Waiting on additional info and feedback. Will move to "closing-soon" in 7 days. label Jan 15, 2025