Requests in Retool aren't made directly from your browser; instead, they're made from the backend of your application. Here's a high-level overview of what a typical Retool deployment might look like:
When you make a request, it first travels to the DB Connector container, which makes the actual query to your resource, and then the query result travels back to your browser (the data itself isn't stored in the backend unless you choose to cache your query). The primary difference between Cloud and Self-Hosted is that Self-Hosted customers host everything within that larger "Retool" box themselves and can choose how they want to handle load balancing, etc. Cloud customers, on the other hand, have it hosted by us, so multiple different orgs might be sending their requests through the same DB Connector container.
That all being said, when you first use a file picker component to select a file, the file's data is loaded into the browser's memory. There's no strict limit on what can be uploaded at that stage, but there is a soft limit: if you actually try to manipulate a large chunk of data within your browser, it can drastically slow things down. The strict limit kicks in the moment you try to send that data to your resource through Retool's backend.
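One way to keep the browser responsive with a large selected file is to avoid touching it as a single buffer. As a rough sketch (the `sliceIntoChunks` helper and 5 MiB chunk size are my own illustrative choices, not a Retool API), you can slice the file lazily:

```javascript
// Illustrative helper: slice a selected file (a Blob/File from a file picker)
// into smaller chunks so no single large buffer is manipulated at once.
const CHUNK_SIZE = 5 * 1024 * 1024; // 5 MiB per chunk (arbitrary choice)

function sliceIntoChunks(file, chunkSize = CHUNK_SIZE) {
  const chunks = [];
  for (let offset = 0; offset < file.size; offset += chunkSize) {
    // Blob.slice creates a lightweight view, not a copy, so this stays cheap
    // even for very large files; the bytes are only read when each chunk is used.
    chunks.push(file.slice(offset, offset + chunkSize));
  }
  return chunks;
}
```

Each chunk can then be processed or sent one at a time instead of holding the whole payload in memory.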
This workaround was made to send requests directly from the browser, bypassing Retool's backend and therefore its hard limits on file uploads; it also uses custom components to avoid overloading Retool's frontend with data. The workarounds for uploading large files are rather complex, though, and don't offer much of the support that Retool provides through its integrations, so it's definitely useful for the devs to know that more people are trying to do this.
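The core idea of that workaround can be sketched like this (a rough illustration, assuming an S3-style presigned URL; `buildDirectUpload` and `uploadDirect` are hypothetical names, not part of Retool):

```javascript
// Hypothetical sketch: upload a file straight from the browser to the
// storage provider, so the bytes never pass through Retool's backend.
function buildDirectUpload(presignedUrl, file) {
  return {
    url: presignedUrl,
    method: "PUT",
    headers: { "Content-Type": file.type || "application/octet-stream" },
    body: file, // the Blob itself; streamed by the browser, not buffered in Retool
  };
}

async function uploadDirect(presignedUrl, file) {
  const req = buildDirectUpload(presignedUrl, file);
  // Because this fetch goes browser -> storage provider directly,
  // Retool's backend request-size limits don't apply to it.
  const res = await fetch(req.url, {
    method: req.method,
    headers: req.headers,
    body: req.body,
  });
  if (!res.ok) throw new Error(`upload failed: ${res.status}`);
  return res;
}
```

The trade-off, as noted above, is that you're now responsible for auth, error handling, and retries yourself, rather than getting them through Retool's integrations.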