Increase file upload size on self-hosted, internal server error

Hey all,

Ok, so I'm trying to upload videos to Vimeo, like a few other posts I've seen, but I didn't see a solid solution in any of them. I'm on a self-hosted install, created with the AWS auto installer.

What is odd is that I get a straight-up "Internal server error" in the REST query when I upload a file that is too big (~74MB is enough to fail). It works great with small enough files, though. In the docker logs I get the following:

```
level: error, timestamp: 2022-09-25T02:21:11.950Z, type: UNCAUGHT_ERROR
PayloadTooLargeError: request entity too large
    at readStream (/snapshot/retool_development/node_modules/raw-body/index.js:155:17)
    at getRawBody (/snapshot/retool_development/node_modules/raw-body/index.js:108:12)
    at read (/snapshot/retool_development/node_modules/body-parser/lib/read.js:77:3)
    at jsonParser (/snapshot/retool_development/node_modules/body-parser/lib/types/json.js:134:5)
    at bodyParser (/snapshot/retool_development/node_modules/body-parser/index.js:109:5)
    at /snapshot/retool_development/node_modules/dd-trace/packages/datadog-instrumentations/src/router.js
    at bodyParser (/snapshot/retool_development/node_modules/dd-trace/packages/datadog-shimmer/src/shimmer.js)
    at Layer.handle [as handle_request] (/snapshot/retool_development/node_modules/express/lib/router/layer.js:95:5)
    at trim_prefix (/snapshot/retool_development/node_modules/express/lib/router/index.js:317:13)
    at /snapshot/retool_development/node_modules/express/lib/router/index.js:284:7
    at Function.process_params (/snapshot/retool_development/node_modules/express/lib/router/index.js:335:12)
    at next (/snapshot/retool_development/node_modules/express/lib/router/index.js:275:10)
    at compression (/snapshot/retool_development/node_modules/compression/index.js:220:5)
    at /snapshot/retool_development/node_modules/dd-trace/packages/datadog-instrumentations/src/router.js
    at compression (/snapshot/retool_development/node_modules/dd-trace/packages/datadog-shimmer/src/shimmer.js)
    at Layer.handle [as handle_request] (/snapshot/retool_development/node_modules/express/lib/router/layer.js:95:5)
    at trim_prefix (/snapshot/retool_development/node_modules/express/lib/router/index.js:317:13)
    at /snapshot/retool_development/node_modules/express/lib/router/index.js:284:7
    at Function.process_params (/snapshot/retool_development/node_modules/express/lib/router/index.js:335:12)
    at next (/snapshot/retool_development/node_modules/express/lib/router/index.js:275:10)
    at /snapshot/retool_development/backend/transpiled/server/modules/loggerMiddleware.js
    at /snapshot/retool_development/node_modules/dd-trace/packages/datadog-instrumentations/src/router.js
```

I've added CLIENT_MAX_BODY_SIZE: 5000M (I know that's huge; I'm hoping to handle some big video files...) to the https-portal section in the docker-compose.yml. I've brought the compose stack down and back up, but I still get the internal server error. I've also configured the https-portal section and the env file to use my subdomain; that isn't working either, and I wonder if it's somehow related?
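For reference, the relevant part of my docker-compose.yml now looks roughly like this (your image/tag may differ; as I understand it, https-portal uses this variable to set nginx's client_max_body_size):

```yaml
https-portal:
  image: tryretool/https-portal:latest   # your image/tag may differ
  environment:
    # ...other existing environment variables unchanged
    CLIENT_MAX_BODY_SIZE: 5000M   # sets nginx's client_max_body_size
```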

Also, I used to get a complete crash when uploading a file that was too big, until I added a swap disk; I'd have to reboot the server.

Any help would be awesome! I assume I'm missing something?

Hi @Drew!

At the moment there's a hard-coded 50MB limit on requests sent through the backend. Our dev team is looking into it, and I can let you know here when we have an update. The only workaround I'm aware of at the moment is to make the request directly from your browser so that it avoids the backend entirely.

@Kabirdas Ah, gotcha. Thought I was going crazy. Yeah, I guess I'll take a look at a custom iframe solution in the meantime. Thanks!


@Kabirdas Is there a way to work around this hard-coded limit now?

Hey @Angio, so far there hasn't been an update on the issue but I've bumped it so that it can be revisited. What is your particular use case?

@Kabirdas I'm using an API call to upload files via Retool to Azure blob storage. It works well, but when the file was too large I was getting a 413 error, so I changed CLIENT_MAX_BODY_SIZE to fix it. The 413 error is gone, but now I'm getting an internal server error message when uploading large files (~50 MB). From the logs I get the "PayloadTooLargeError: request entity too large" message. My guess is that it's coming from the Node.js side or nginx.conf, since we can change CLIENT_MAX_BODY_SIZE for both the location and server blocks but not http. I guess that's the hard-coded limit from the backend you were talking about?

+1, looking to change the upload size of self-hosted Retool DB. Is that possible? @Tess @PeteTheHeat @Kabirdas

Hey folks! Apologies for such a late reply.

@Angio, if you're looking to upload larger files to Azure blob storage, then I'd be curious if something like this could work for you. It resembles the signed URLs that are used for S3 and GCS, which allow uploading to happen directly from the browser.
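To sketch what that could look like for Azure specifically (all of the names and the SAS token below are placeholders and this is untested on my end, but the Put Blob REST call does require the x-ms-blob-type header):

```javascript
// Sketch of a direct-from-browser upload to Azure Blob Storage using a SAS
// URL, so the request never passes through the Retool backend. All names
// and the SAS token are placeholders.
function buildBlobPutRequest(sasUrl, file) {
  return {
    url: sasUrl,
    options: {
      method: 'PUT',
      headers: {
        'x-ms-blob-type': 'BlockBlob', // required by the Put Blob API
        'Content-Type': file.type || 'application/octet-stream',
      },
      body: file,
    },
  };
}

// In a JavaScript query you might use it like this (placeholder values):
// const { url, options } = buildBlobPutRequest(
//   'https://myaccount.blob.core.windows.net/videos/clip.mp4?sv=...&sig=...',
//   fileInput1.files[0]
// );
// await fetch(url, options);
```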

Also curious to know what your use case is @agaitan026. You should be able to access your Retool DB from outside of Retool as well, especially if you've externalized it, meaning you can run larger, infrequent operations directly on the database itself. That wouldn't be practical if you're expecting users to regularly upload large files into your Postgres DB, though :thinking:


Yeah, I just wanted to upload my tables' CSVs directly without using an external GUI, but that's ok, I just used one of those tools and that's it. I do think 3MB is very small though.

@Kabirdas I'm already using a SAS resource to upload the files to blob storage, and it's working great. The problem is that when uploading files over HTTP or HTTPS there's a size limit on the payload. It's defined with the CLIENT_MAX_BODY_SIZE variable in nginx.conf for the server and location blocks, but I think the http block isn't changeable, therefore limiting the size to something around 50 MB. Am I right?
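For what it's worth, the nginx docs list client_max_body_size as valid in the http, server, and location contexts, so a config like the sketch below would be legal nginx; the question is whether https-portal regenerates the http block from a template we can't override:

```nginx
http {
    client_max_body_size 5000M;            # global default

    server {
        client_max_body_size 5000M;        # overrides http for this server

        location / {
            client_max_body_size 5000M;    # overrides server for this path
        }
    }
}
```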

Are you making the request via a REST query or with something like fetch in a JavaScript query? Typically I'd recommend using a REST query, but that causes the request to pass through the Retool backend, which makes it hit the 50MB restriction. Using JS lets the request happen directly from a sandboxed environment in the browser, which can be a workaround; Bradly has a detailed post that implements this strategy here. It's no longer needed for S3 and GCS, but I imagine it can still work for Azure.
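As a rough sketch of the JS-query approach (the component and endpoint names are placeholders, and I'm assuming your file component exposes base64 data; double-check your component's state to see exactly where it lives):

```javascript
// Sketch: upload from a JavaScript query with fetch so the request goes
// straight from the browser sandbox and bypasses the backend's ~50MB limit.
// Converts a base64 string (as exposed by a file input) back into a Blob.
function base64ToBlob(base64, contentType = 'application/octet-stream') {
  const binary = atob(base64); // decode base64 into a byte string
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);
  }
  return new Blob([bytes], { type: contentType });
}

// Usage inside a JS query (placeholder component/endpoint names):
// const blob = base64ToBlob(fileInput1.value[0], fileInput1.files[0].type);
// await fetch('https://example.com/upload', { method: 'PUT', body: blob });
```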