REST API File Upload: "Result exceeded maximum size of 100 MB" when file is only 30MB

I am using a REST API resource to upload a video to Cloudflare. I've been able to upload files up to 25MB in size, but 30MB files fail with the error "Result exceeded maximum size of 100 MB". The video does upload successfully to Cloudflare, but I'm not able to get the response object with the associated metadata because of this error.

Screenshot below. Any help is appreciated, thanks in advance!

Hey @kevinpearl!

Which Cloudflare endpoint are you hitting? I'm curious what exactly the response looks like. Can you share an example from one of the smaller uploads that run successfully? Is the response size for those multiple times greater than the request?

Hi @Kabirdas, thanks for the response!

I'm hitting the basic uploads endpoint documented here: `https://api.cloudflare.com/client/v4/accounts/<ACCOUNT_ID>/stream`
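For reference, a direct call to that endpoint (outside of Retool, from Node 18+) might look roughly like the sketch below. `<ACCOUNT_ID>` and `<API_TOKEN>` are placeholders, and the exact request shape is an assumption based on Cloudflare's Stream docs rather than something confirmed in this thread:

```javascript
// Hypothetical sketch of the basic Cloudflare Stream upload.
const streamUploadUrl = (accountId) =>
  `https://api.cloudflare.com/client/v4/accounts/${accountId}/stream`;

async function uploadToStream(accountId, apiToken, fileBlob) {
  // Multipart form upload with the video under the "file" field.
  const form = new FormData();
  form.append("file", fileBlob, "video.mp4");
  const resp = await fetch(streamUploadUrl(accountId), {
    method: "POST",
    headers: { Authorization: `Bearer ${apiToken}` },
    body: form,
  });
  // The response body is metadata only; its size doesn't scale with the upload.
  return resp.json();
}
```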

The response is just metadata and doesn't depend on the size of the upload. Here's the response from a 23MB file I just uploaded -

  "result": {
    "uid": "78436e482351466cbfe58bb2648389b8",
    "creator": null,
    "thumbnail": "",
    "thumbnailTimestampPct": 0,
    "readyToStream": false,
    "status": {
      "state": "queued",
      "errorReasonCode": "",
      "errorReasonText": ""
    "meta": {
      "name": "2022-10-07_2943966159781615469.mp4"
    "created": "2022-10-14T20:46:51.958943Z",
    "modified": "2022-10-14T20:46:51.958943Z",
    "size": 23359811,
    "preview": "",
    "allowedOrigins": [],
    "requireSignedURLs": false,
    "uploaded": "2022-10-14T20:46:51.958935Z",
    "uploadExpiry": null,
    "maxSizeBytes": null,
    "maxDurationSeconds": null,
    "duration": -1,
    "input": {
      "width": -1,
      "height": -1
    "playback": {
      "hls": "",
      "dash": ""
    "watermark": null,
    "clippedFrom": null,
    "publicDetails": null
  "success": true,
  "errors": [],
  "messages": []

And a screenshot -

Hey @kevinpearl!

I'm in conversation with our engineers about this and will let you know here when there's a fix or any further updates! Apologies for the blocker here :pensive:

Hey @kevinpearl!

We've pushed a potential fix for this issue. Can you let me know if it's something you're still running into?

Hi @Kabirdas, I'm having the same issue but with an S3 file :frowning: Is there any way I can fix this? Thank you! (The file is 87.4 MB.)

Good to know! I'll bring this back up with engineering to take another look and continue to pass along updates here.

Thanks for the update @mfroca_vita!


Hey @mfroca_vita!

After doing some digging it looks like this may be a separate issue, could you try finding the request in the network tab of your browser console and post a screenshot of what you're seeing in the "Preview" of the request with all the fields expanded?

I'm also curious to know if you're seeing the file get uploaded to S3, or if it's not going through at all.

Hi there,

I'm experiencing the same error message. In my situation I'm trying to retrieve a file from Google Cloud Storage (the file is ~50MB). This request used to work ~10 days ago, and I only noticed today that it's not working anymore. I haven't changed or accessed Retool in that time.



    "error": true,
    "message": "Result exceeded maximum size of 100 MB",
    "queryExecutionMetadata": {
        "estimatedResponseSizeBytes": 65,
        "resourceTimeTakenMs": 1710,
        "isPreview": true,
        "resourceType": "gcs",
        "lastReceivedFromResourceAt": 1671263328232
    "source": "resource"

Here's some more context for the PM/devs:
I used to load the same amount of data from BigQuery, which worked, although it was slower overall than the file retrieval. My app does filtering, and re-running the query for each filter would cost round-trip time and data charges, so I do the filtering on the client side by iterating over the file in JavaScript. This is fast and works well for simple filtering. In an ideal world I could compress the data on the cloud storage side, download it in the browser, and decompress it there; that should further reduce the 50MB to about 3MB, which speeds up the download even more. The only issue was that I couldn't find a JavaScript decompression library for gzip or zip that I could use.
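As a side note on the decompression step: modern browsers (and Node 18+) ship a built-in gzip decompressor, `DecompressionStream`, so no external library is strictly needed. A minimal sketch, where the signed download URL in the usage comment is hypothetical:

```javascript
// Decompress a gzip payload entirely on the client using the built-in
// DecompressionStream API (no external library required).
async function gunzipToText(gzippedBytes) {
  const stream = new Blob([gzippedBytes])
    .stream()
    .pipeThrough(new DecompressionStream("gzip"));
  return new Response(stream).text();
}

// Hypothetical usage with a signed download URL:
// const buf = await (await fetch(signedUrl)).arrayBuffer();
// const rows = JSON.parse(await gunzipToText(buf));
```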

@Kabirdas did something change in the backend that this stopped working (see above post)?

Hey @alien, sorry about the late reply but thanks for flagging this!

Did some digging with the team: as of 2.106.0 (which was released late Tuesday of last week) we started enforcing the 100MB response limit for queries more strictly, in order to help improve the stability of our backend resource connectors.

The encoding of the file can in some cases increase the response size, but the fact that the file is only 50MB and you're seeing an estimatedResponseSizeBytes of only 65 does seem a bit odd.
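To illustrate the encoding overhead mentioned above: if file bytes get base64-encoded into a JSON response, the payload grows by roughly a third, which is one way a file can overshoot a response limit it nominally fits under. A quick Node sketch (the 30 MB figure is just an example):

```javascript
// Base64 encodes every 3 input bytes as 4 ASCII characters, so a binary
// payload grows by ~33% when embedded in a JSON response.
const raw = Buffer.alloc(30 * 1024 * 1024); // pretend this is a 30 MB file
const encoded = raw.toString("base64");
console.log((encoded.length / raw.length).toFixed(2)); // ~1.33
```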

Are you using the "Read a file from GCS" action or the "Generate a signed url" action to get your file?

Facing the same issue. I'm trying to pull user data from a MongoDB resource.

message:"Result exceeded maximum size of 100 MB"

Thanks for the added context @naveenrajv!

We were able to reproduce the behavior with GCS and the team is looking into a fix to be published after the holiday which we can update you all on here. Do you know the approximate size of the return value on your MongoDB query? Was it similarly large (e.g. ~50MB)?


Yes, I'm using the "Read a file from GCS" option.

Hi @Kabirdas, We are looking for an update on this issue.

Hi @naveenrajv!

This should be fixed as of version 2.107.1, can you let me know if you're still seeing it on your end?

Hi @Kabirdas, I'm having the same issue on 2.108.0 as well!

Thanks for letting me know @naveenrajv! Do you know the size of the data you're trying to fetch?

I'm trying to fetch 15k+ user profiles from MongoDB. I'm not sure about the exact size of the data.