S3 download file payload limit of 100MB

I have just built an app where an internal team can download files from S3 (one at a time). Some work, but others come back with the 100MB limit error. This has just gone live for a campaign and I need this functionality urgently. Is there a way to increase the limit to over 100MB?

The files themselves are about 72 MB, so the additional metadata must be pushing the response over 100MB.

I have also tried generating temporary URLs for the GET, but I get an Access Denied error. I think it must be a permissions issue, but I don't know where to look. I don't want the bucket to be publicly accessible; can this be achieved?

As an FYI, in the image my pet_media_url is the S3 tag name (the file name), not an actual URL.

Hi @Guy_N thanks for reaching out!

Unfortunately, the 100MB data limit on Retool Cloud isn't configurable :disappointed:

For the Access Denied error on the generate signed URL query, this Stack Overflow post has some ideas that may be helpful for a private bucket. Otherwise, it would be helpful to take a look at your specific configuration if you can share screenshots.
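In case it helps while you debug, here's a minimal sketch (boto3, outside of Retool) of how a presigned GET URL is typically generated for an object in a private bucket. The bucket name, key, and region below are placeholders, not taken from your setup. The most common cause of Access Denied on an otherwise valid presigned URL is that the IAM user or role whose credentials signed it doesn't itself have `s3:GetObject` on that object.

```python
# Sketch only: placeholder bucket/key/region, assuming the signing
# credentials have s3:GetObject on the object in question.
import boto3
from botocore.config import Config

s3 = boto3.client(
    "s3",
    region_name="us-east-1",                  # assumption: your bucket's region
    config=Config(signature_version="s3v4"),  # SigV4 is required in newer regions
)

url = s3.generate_presigned_url(
    ClientMethod="get_object",
    Params={
        "Bucket": "my-private-bucket",        # placeholder bucket name
        "Key": "pet-media/photo-123.jpg",     # placeholder object key
    },
    ExpiresIn=3600,                           # URL is valid for 1 hour
)
print(url)  # anyone holding this URL can GET the object until it expires
```

The bucket itself can stay fully private: the URL is authorized by the signature of the credentials that generated it and stops working after `ExpiresIn`, so nothing needs to be opened up to public access.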

We also have an internal request in our queue to improve our S3 integration for handling large files. I don't know what the timeline is yet, but I'll let the team know that you're also looking for improvements here. I'll reply to this ticket if I get any updates.