Unable to upload large files without running out of memory

My goal: using a File Input component and a REST API query resource, I select and upload large files

Hi Retool Support,

I’m trying to upload large files (~500 MB) from a Retool app UI to my backend REST API. However, all of my current approaches lead to browser out-of-memory errors.

Here’s what I’ve tried:

File Input component + Retool query → Retool base64-encodes the file, which explodes memory usage.

HTML component with fetch() → Works for small files, but fails with large ones due to CORS and memory buffering.

Goal: I need a supported Retool-native or recommended method to stream large file uploads directly from the user’s browser to my backend or cloud storage (Azure Blob), without loading the entire file into memory.

Issue: when I select a large file, I run into an out-of-memory error.

Additional info (Cloud or Self-hosted, Screenshots): Self-hosted

Hey @Adel_LAJIL

So the issue you’re running into is that there’s a 40 MB max upload size for files that pass through Retool. You could use S3 as your resource, because it uses signed URLs, so the upload happens directly from your browser.
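In case it’s useful, here is a minimal sketch of the backend side of that flow, assuming a Node.js/Express server with the AWS SDK v3 installed; the route, bucket name, and region are placeholders, not anything Retool-specific:

```js
// Minimal sketch (assumptions: Node.js + Express, AWS SDK v3 installed).
// Hands the browser a short-lived presigned PUT URL; the file itself
// never passes through this server or through Retool.
const express = require("express");
const { S3Client, PutObjectCommand } = require("@aws-sdk/client-s3");
const { getSignedUrl } = require("@aws-sdk/s3-request-presigner");

const app = express();
const s3 = new S3Client({ region: "us-east-1" }); // placeholder region

app.get("/upload-url", async (req, res) => {
  // In real code, validate/sanitize the filename before using it as a key.
  const command = new PutObjectCommand({
    Bucket: "my-upload-bucket",      // placeholder bucket name
    Key: String(req.query.filename),
  });
  const url = await getSignedUrl(s3, command, { expiresIn: 900 }); // 15 min
  res.json({ url });
});

app.listen(3000);
```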

Hope that helps!

Do not use the File Input for large files. Instead:

  1. Let users get a temporary upload URL (presigned URL) from your backend or cloud storage (like Azure Blob).

  2. Use JavaScript’s fetch() to upload the file directly from the browser to the presigned URL; the browser streams the File object from disk, so the whole file is never held in memory (see the sketch after this list).

  3. This avoids loading the whole file into Retool or converting it to base64.
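For reference, here is a minimal sketch of steps 1–2 as they might run inside an HTML or custom component. The /upload-url endpoint and the element id are assumptions; for Azure Blob, the URL your backend returns would be a write-enabled SAS URL:

```html
<!-- Minimal sketch: native file input + fetch() PUT to a presigned/SAS URL. -->
<input type="file" id="bigFile" />
<script>
  document.getElementById("bigFile").addEventListener("change", async (e) => {
    const file = e.target.files[0];
    if (!file) return;

    // 1. Ask the backend for a write-enabled upload URL (placeholder endpoint).
    const resp = await fetch("/upload-url?filename=" + encodeURIComponent(file.name));
    const { url } = await resp.json();

    // 2. PUT the File object directly. The browser streams it from disk,
    //    so it is never base64-encoded or fully buffered in JS memory.
    const upload = await fetch(url, {
      method: "PUT",
      headers: { "x-ms-blob-type": "BlockBlob" }, // required by Azure Blob; omit for S3
      body: file,
    });
    console.log(upload.ok ? "Upload complete" : "Upload failed: " + upload.status);
  });
</script>
```

One thing to check: the CORS failures you saw with the HTML component usually mean the storage account or backend hasn’t been configured to allow your Retool origin. Browser-to-storage uploads need a CORS rule on the receiving side; that’s a server-side setting fetch() can’t work around.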
