Uploading large files to S3

Uploading large files to S3 using this approach (retool doc link) has a file size limit of 40MB. Is there a way to get around that?

It also seems like the approach loads the data into memory, which causes a major lag when there are a large number of files. Is there a way to avoid doing that as well?

Hey @ryo!

The 40MB upload limit is fixed for Cloud orgs, but you can get around it by sending your request directly from the browser, bypassing the Retool backend. This is slightly less secure since your requests will come from a sandboxed iframe and therefore have a null CORS origin.

That being said, if you're looking to upload to S3, you can generate a signed URL for uploads using the S3 integration (the getSignedURL query referenced below).
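For context, the signed URL is a standard S3 presigned PUT URL. Purely as an illustration (the bucket, key, region, and expiry below are placeholders, not anything from your setup), it's the same kind of URL the AWS SDK would generate:

// Illustration only: roughly what a presigned PUT URL amounts to, generated with
// the AWS SDK v3 in Node. Bucket, key, region, and expiry are placeholder values.
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

const client = new S3Client({ region: "us-east-1" });
const command = new PutObjectCommand({ Bucket: "my-bucket", Key: "uploads/example.pdf" });

// The resulting URL accepts a single PUT of the file body until it expires (1 hour here)
const signedUrl = await getSignedUrl(client, command, { expiresIn: 3600 });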

From there, you can use any one of the file upload components and the following JS query to upload your file:

// Decode the Base64 contents exposed by the file input into raw bytes
const fileBuffer = Uint8Array.from(atob(fileButton1.value[0]), c => c.charCodeAt(0));

// PUT the bytes directly to the presigned S3 URL returned by the getSignedURL query
const request = new Request(getSignedURL.data.signedUrl, {
  method: "PUT",
  headers: {
    "Content-Type": fileButton1.files[0].type,
  },
  body: fileBuffer
});
return fetch(request);

Alternatively, if you'd also like to avoid going through the Retool frontend, you can use a custom component with a file input element to handle the upload. Here's a super bare-bones example of what that could look like:

<input type="file" id="file-input"/>
<button id="upload-button">Upload</button>
<div id="log"></div>

<script type="text/babel">
const fileInput = document.getElementById("file-input");
const uploadButton = document.getElementById("upload-button");
const log = document.getElementById("log");
const reader = new FileReader();
let uploader;

// The component model passes in the signed URL; bind it so the click handler
// only needs to supply the file data and content type.
window.Retool.subscribe((model) => uploader = uploadData.bind(null, model.signedUrl));

// Read the selected file into an ArrayBuffer as soon as it's chosen
fileInput.addEventListener("change", () => reader.readAsArrayBuffer(fileInput.files[0]), false);
uploadButton.addEventListener("click", () => {
  uploader(reader.result, fileInput.files[0].type);
  log.innerHTML = "uploading...";
});

// PUT the raw bytes to the signed URL
async function uploadData(url, data, type) {
  const request = new Request(url, {
    method: "PUT",
    headers: {
      "Content-Type": type,
    },
    body: data,
  });
  await fetch(request);
  log.innerHTML = "done";
}
</script>

With the following as the model for the custom component:

{
  signedUrl: {{getSignedURL.data.signedUrl}},
}

Let me know if that works or if it raises any questions!


Thanks @Kabirdas! I will give that a try.

I tried the first method and I am getting the following error:

I am getting this error in the Debug Console; no error shows in DevTools.

Getting the signed URL is working and all of the inputs to the request object seem just fine.

Here is my code:

const fileBuffer = Uint8Array.from(atob(fileDropzone1.value[0]), c => c.charCodeAt(0));

const request = new Request(getSignedURL.data.signedUrl, {
  method: 'PUT',
  headers: {
    'Content-Type': fileDropzone1.files[0].type,
  },
  body: fileBuffer
});
return fetch(request);

The custom component technique worked, but it does not fit in the workflow I need.

Ah thanks for pointing that out!

Retool doesn't have a built-in handler for the Response object returned by fetch, so return fetch(request); is functionally the same as await fetch(request); here. Since we're making the request from a null origin, it'll likely return an opaque response, meaning we can't easily grab the body with something like fetch(request).then(response => response.json()); either...

You should still be able to access other properties on the response though, e.g.

const response = await fetch(request);
return response.status;

should work!

Is there something in particular you're trying to return?

@Kabirdas,

When uploading larger files (115MB in this test case), this technique fails.

I split out that line of code like so to assist in debugging:

    const b = atob(fileDropzone1.value[0])
    let counter = 0
    const fileBuffer = Uint8Array.from(b, function(c, i) {
      if (i % 1048576 === 0) { // Send a heartbeat to the console every megabyte
        counter += 1048576
        console.log("Counter: " + counter)
      }
      return c.charCodeAt(0)
    });

Uint8Array.from() is failing when b is too large; it just hangs here. If you step through it in DevTools and hit this line, DevTools crashes (specifically, you get a message about losing connection with the browser).

My current line of experimentation is using set() rather than from() to build the array gradually. I will report back.
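Roughly, the chunked version I'm experimenting with looks like this (an untested sketch of the idea, not final code):

// Sketch: decode the Base64 once, then fill the typed array one 1MB slice at a
// time with set() instead of a single Uint8Array.from() pass over the whole string.
const b = atob(fileDropzone1.value[0]);
const fileBuffer = new Uint8Array(b.length);
const CHUNK = 1048576; // 1MB per slice

for (let offset = 0; offset < b.length; offset += CHUNK) {
  const end = Math.min(offset + CHUNK, b.length);
  const slice = new Uint8Array(end - offset);
  for (let i = offset; i < end; i++) {
    slice[i - offset] = b.charCodeAt(i);
  }
  fileBuffer.set(slice, offset);
}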

I have it narrowed down a bit more. The problem is in converting the Base64 string in the .value property to the Blob that S3 wants. It has to go through a few steps to get there:

  1. Convert the Base64 to a string
  2. Convert the string to byte values
  3. Convert the byte values to a Blob

There are a few techniques for doing this, and I tried them all, but they are all just too much for the browser to handle on large files.
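For reference, the most direct version of those three steps is something like the snippet below (the same pattern as earlier in this thread, with a Blob wrapper added at the end purely as an illustration):

// Straightforward version of the three steps above, shown for reference only;
// as noted, it is too heavy for the browser once the file gets large.
const decoded = atob(fileDropzone1.value[0]);                           // 1. Base64 -> string
const bytes = Uint8Array.from(decoded, c => c.charCodeAt(0));           // 2. string -> byte values
const blob = new Blob([bytes], { type: fileDropzone1.files[0].type });  // 3. byte values -> Blob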

If there were a native way to go from Base64 to a Blob in one step, maybe using a Reader? However, I cannot figure out how to get the file into a reader without using a form, which puts me into custom component land and leads to UX compromises.

Just want to make sure this is posted here as well! Awesome job making such a robust workaround @bradlymathews!

@ryo If you use the S3 Uploader component in our component list and select a connected S3 resource, you should be able to upload a big file with it. I've tested it with a 200MB file and it works.