S3 download file payload limit of 100MB

I have just built an app where an internal team can download files from S3 (one at a time). Some work, but some come back with the 100MB limit error. This has just gone live for a campaign and I need this functionality urgently. Is there a way to increase the limit to over 100MB?

The files themselves are about 72 MB, so some other metadata must be pushing them over 100MB.

I have also tried generating temporary URLs for the GET, but I get access denied. I think it must be permissions, but I don't know where to look. I don't want the bucket accessible to the public; can this be achieved?

In the image, my pet_media_url is the S3 object key (file name), not an actual URL, as an FYI.

Hi @Guy_N thanks for reaching out!

Retool Cloud has a query return size limit of 100MB, which isn't configurable, but I'm not sure that's the issue here :thinking: :disappointed: For the access denied error on the generate signed URL query, this Stackoverflow post has some ideas that may be helpful for a private bucket. Otherwise, it would be helpful to take a look at your specific configuration if you can share screenshots.

We also have an internal request in our queue to improve our S3 integration for handling large files. I don't know the timeline yet, but I'll let the team know that you're also looking for improvements here. I'll reply to this ticket if I get any updates.
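Since the 100MB limit applies to data flowing back through the query itself, one workaround in the meantime is to have the browser fetch the object directly via a presigned URL, optionally in HTTP Range chunks for very large objects. The range arithmetic is simple (the 16 MB chunk size here is an arbitrary example, not a recommended value):

```python
def byte_ranges(total_size: int, chunk_size: int):
    """Yield inclusive (start, end) byte offsets suitable for HTTP Range headers."""
    for start in range(0, total_size, chunk_size):
        end = min(start + chunk_size, total_size) - 1
        yield start, end

# e.g. a 72 MB object fetched in 16 MB pieces:
MB = 1024 * 1024
ranges = list(byte_ranges(72 * MB, 16 * MB))
# each pair becomes a request header like:  Range: bytes=0-16777215
for start, end in ranges:
    print(f"Range: bytes={start}-{end}")
```

Each chunked request stays well under the limit, and the pieces are concatenated client-side in key order.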

Has this been addressed? I'm running into the same issue. As the OP stated, some of the files aren't even 100MB.

For the larger uploads, I'm having to download through DO Spaces directly.

Hi @macstrat, apologies for the delay here! Could you share a screenshot of your query and the exact error message?

Any chance you can also share one of the files that is hitting this error?
