Sorry about the late reply here. Does the query work if you select fewer rows?
This might give a good way to estimate the response size for all 15k+ rows: check the estimatedResponseSizeBytes of the smaller query and extrapolate.
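As a rough illustration of that extrapolation (the numbers here are made up, and `estimateFullSizeBytes` is just a hypothetical helper, not anything built in): scale the sample's reported size linearly up to the full row count.

```javascript
// Hypothetical linear extrapolation: if a 500-row sample reports an
// estimatedResponseSizeBytes, scale by the ratio of total rows to
// sampled rows to estimate the full response size.
function estimateFullSizeBytes(sampleBytes, sampleRows, totalRows) {
  return Math.round((sampleBytes / sampleRows) * totalRows);
}

// e.g. a 500-row sample at ~3.5MB suggests ~105MB for 15,000 rows,
// which would be over the 100MB limit.
console.log(estimateFullSizeBytes(3_500_000, 500, 15_000));
```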
If the response really is over 100MB, you might also try server-side pagination, or a JS query that batches your requests and then returns them all together (with a pattern like this).
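Here's a minimal sketch of that batching pattern. Everything here is illustrative: `fetchAllInBatches` and `fetchBatch` are hypothetical names, and `fetchBatch` stands in for whatever query or API call returns one batch of rows.

```javascript
// Split a large set of keys into batches, fetch each batch separately,
// and concatenate the results so no single response exceeds the limit.
async function fetchAllInBatches(keys, batchSize, fetchBatch) {
  const results = [];
  for (let i = 0; i < keys.length; i += batchSize) {
    const batch = keys.slice(i, i + batchSize);
    // Each request stays small because it only asks for one batch.
    const rows = await fetchBatch(batch);
    results.push(...rows);
  }
  return results;
}

// Example with a stubbed fetcher that just echoes the batch it was given:
const fakeFetch = async (batch) => batch.map((k) => ({ key: k }));
fetchAllInBatches([1, 2, 3, 4, 5], 2, fakeFetch).then((all) =>
  console.log(all.length)
);
```

Fetching batches sequentially (rather than firing them all at once) also keeps memory and concurrency bounded, which matters at 15k+ rows.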
Let me know what you find and we can go from there!
Did this get a resolution? I have just built an app where an internal team can download files from S3 (one at a time). Some work, some come back with the 100MB limit error. This has just gone live for a campaign and I need this functionality urgently. Is there a way to increase the size to over 100MB?
Right now the 100MB limit is intentional. There was an issue specific to GCS where files were being estimated as larger than they actually were, but that should be fixed now.
What are the sizes of the files you're retrieving from S3? If they exceed 100MB, you might try downloading them directly from your browser with a fetch request in a JavaScript query.
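A hedged sketch of that browser-side download, assuming you can get a presigned S3 URL for the object: fetch the file as a blob and hand it to a save step. `downloadViaFetch` is a hypothetical helper, and the `fetchImpl`/`save` parameters are only there so the network and DOM parts can be swapped out.

```javascript
// Download a (possibly >100MB) S3 object straight from the browser with
// fetch, so the file never passes through the backend's response limit.
// `url` is assumed to be a presigned S3 URL.
async function downloadViaFetch(url, filename, { fetchImpl = fetch, save } = {}) {
  const res = await fetchImpl(url);
  if (!res.ok) throw new Error(`Download failed: ${res.status}`);
  const blob = await res.blob();
  const saveBlob =
    save ??
    ((b, name) => {
      // Default browser behavior: create a temporary object URL and click
      // an invisible <a download> link to trigger the save dialog.
      const a = document.createElement("a");
      a.href = URL.createObjectURL(b);
      a.download = name;
      a.click();
      URL.revokeObjectURL(a.href);
    });
  saveBlob(blob, filename);
  return blob.size;
}
```

Note the presigned URL's bucket will need CORS configured to allow GET requests from your app's origin, or the browser will block the fetch.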
Hmm... if you use a "Read a file from S3" Action type with that same key, do you also get an Access Denied error?
It may also help to see how you've configured authentication on your S3 resource, in case the correct permissions aren't getting passed.