I am experimenting with Retool Workflows to handle some ETL tasks.
When using the Python Code block, I encounter the error "Ran out of memory while returning results. Only 256mb total memory allowed, including data returned" (screenshot attached).
This task is responsible for downloading a 50 MB file, which is far less than 256 MB. Moreover, your online documentation says that workflows have 1 GB of memory (screenshot also attached).
Could you help me figure out how to resolve this error?
Thanks for flagging this inconsistency. I checked with the team and there is currently an undocumented, block-level limit of 256 MB enforced specifically when executing Python code. We'll be updating the public docs accordingly!
Even then, though, it's not immediately clear why you're hitting this limit, since there is little to no visibility into a block's memory usage. You can see the entire workflow's network usage in the run history, which provides some insight, but there is also an ambiguous baseline memory cost associated with spinning up the Python environment itself.
The sum of those two things is what's exceeding the 256 MB limit in this case. Any information you can share about the contents of the code block, including the data you're fetching and any imported libraries, would be helpful in figuring out whether this behavior is expected or not!
For the actual code: we haven't imported any extra libraries (we only use requests, which is already available). The block runs an HTTP request and returns its content. (We cannot use a Retool Resource for this REST API because of an unusual authentication mechanism that even "Custom Auth" cannot handle.)
The response is fairly heavy, between 1 MB and 5 MB of JSON. Still, a 5 MB response is far less than 256 MB, so I expected it to work.
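For reference, the block is roughly the following shape. The endpoint, key names, and HMAC signing scheme below are placeholders standing in for our real (anonymized) auth mechanism:

```python
import hashlib
import hmac
import time

API_URL = "https://api.example.com/v1/report"  # placeholder endpoint
API_KEY = "my-key"                             # placeholder credential
API_SECRET = b"my-secret"                      # placeholder credential


def signed_headers(key: str, secret: bytes, timestamp: int) -> dict:
    # Hypothetical HMAC-over-timestamp scheme, standing in for the
    # "weird" auth that Custom Auth could not express.
    message = f"{key}:{timestamp}".encode()
    signature = hmac.new(secret, message, hashlib.sha256).hexdigest()
    return {
        "X-Api-Key": key,
        "X-Timestamp": str(timestamp),
        "X-Signature": signature,
    }


def fetch_report() -> dict:
    import requests  # already available in Retool's Python blocks

    resp = requests.get(
        API_URL,
        headers=signed_headers(API_KEY, API_SECRET, int(time.time())),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()  # typically 1-5 MB of JSON


# In the workflow, the block simply ends with:
# return fetch_report()
```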
When I do the same thing in a JavaScript block (using the node-fetch library), it works; it handles the 5 MB response without issue.
Could you clarify the response size limits for the Python and JavaScript blocks? It would help us determine whether we can use Retool for some basic ETL processing.
Yes, I can confirm that individual JS code blocks share the same 1 GB memory limit as the entire workflow. I'm not sure why a stricter limit is enforced for Python blocks.
In this case, I'm pretty sure you're not seeing the total network cost because the workflow failed to run to completion. Looking at the contents of your block, though, I'm even more mystified as to how you're hitting the memory limit. If it makes sense for your use case, I'd recommend using JS blocks for manual API requests.
To your last point, there isn't a separate limit on the size of the payload returned by a given block. It just needs to respect the block-level limits (256 MB for Python, 1024 MB for JS) and the 1 GB memory ceiling for the entire workflow run. The other thing to keep in mind is that the body of an individual resource request is generally limited to ~100 MB. I'm not 100% sure whether this applies to requests made from within a code block, but I'm fairly sure it does.
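One practical workaround, if you do stay on the Python block, is to parse the response and return only the fields your downstream blocks actually need, since the block's return payload is what counts against the limit. A minimal sketch, with made-up field names:

```python
import json


def trim_records(raw: str, fields=("id", "status", "amount")) -> list:
    # Parse the full response once, then keep only the columns the
    # downstream ETL steps actually use; everything else is dropped
    # before the block returns.
    records = json.loads(raw)
    return [{k: r.get(k) for k in fields} for r in records]


# Example: project a larger record set down to three fields.
sample = json.dumps([
    {"id": 1, "status": "ok", "amount": 10.5, "debug_blob": "x" * 1000},
    {"id": 2, "status": "err", "amount": 0.0, "debug_blob": "y" * 1000},
])
slim = trim_records(sample)
```

In the workflow, the block would end with `return trim_records(resp.text)` instead of returning the full parsed body.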
It's worth noting that all the above limits can be adjusted or removed entirely if you are running a self-hosted instance!