Workflow running out of memory - garbage collection

Hey Retool team,

I’m running into an issue with a workflow that retrieves historical orders for multiple e-commerce merchants. The challenge isn’t the 139 error from a single block but rather the workflow’s overall memory usage accumulating over execution, eventually hitting the 1GB limit. Some merchants have only a few orders, while others have thousands, leading to inconsistent failures.

Is there any way to clear memory or handle garbage collection in Workflows? In Retool apps we can use function.reset() to free up memory (or at least I assume it does), and something similar here would be incredibly useful: ideally, a way to discard JSON data that's no longer needed once it has been processed or written to the database.

Are there any best practices, workarounds, or solutions I should consider? Would love any insights!

Thanks in advance! :rocket:

Hello @iqm0!

Great question! I'd definitely like to hear from other members of the community about their strategies and best practices for managing memory when working with large amounts of data in workflows.

Our most common suggestion is to batch your queries so that not all of the data is loaded at once. By breaking the data into smaller chunks with loops, query optimization, and query pagination, you can cut down the workflow's memory usage. Our docs page on workflow best practices covers these techniques in greater detail, and there's a rough sketch of the pagination pattern right below.
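For example, a code block that pages through one merchant's orders and writes each page out immediately only ever holds a single page in memory. This is just a sketch under assumed names: both URLs, the page size, and the way the merchant id is passed in are placeholders for your own resources.

```javascript
// Rough sketch only: both URLs, the page size, and how the merchant id
// is supplied are placeholders for your own resources.
const merchantId = value; // e.g. the current item if this runs inside a loop block
const PAGE_SIZE = 500;

let page = 1;
let totalWritten = 0;

while (true) {
  // Pull one page of orders instead of the merchant's full history.
  const res = await fetch(
    `https://api.example.com/orders?merchantId=${merchantId}&page=${page}&limit=${PAGE_SIZE}`
  );
  const orders = await res.json();
  if (!orders.length) break;

  // Write the page out right away so nothing accumulates across pages.
  await fetch('https://api.example.com/orders/archive', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(orders),
  });

  totalWritten += orders.length;
  page += 1;
}

// Return only a small summary, not the order data itself, so downstream
// blocks don't hold onto the full payload.
return { merchantId, totalWritten };
```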

Another option is to call sub-workflows for tasks inside of larger workflows, since each sub-workflow executes in a separate sandboxed environment with additional memory available; see the sketch after this paragraph.
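As a rough illustration, the parent workflow can fan out one merchant id at a time to a child workflow via its webhook trigger, so the heavy order processing happens in the child's own sandbox. The trigger URL, API key header, and merchant list below are placeholders; copy the real values from the child workflow's webhook trigger settings.

```javascript
// Rough sketch only: the trigger URL, API key, and merchant list are
// placeholders taken from the child workflow's webhook trigger settings.
const CHILD_TRIGGER_URL =
  'https://api.retool.com/v1/workflows/<child-workflow-id>/startTrigger';
const API_KEY = '<workflow-api-key>';

const merchantIds = ['merchant-1', 'merchant-2']; // e.g. from an earlier block

const results = [];
for (const merchantId of merchantIds) {
  // Each merchant is processed inside the child workflow's own sandbox,
  // so the order data never accumulates in this parent workflow.
  const res = await fetch(CHILD_TRIGGER_URL, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'X-Workflow-Api-Key': API_KEY, // auth header shown in the trigger settings
    },
    body: JSON.stringify({ merchantId }),
  });
  results.push({ merchantId, status: res.status });
}

// Only lightweight status info comes back to the parent.
return results;
```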

To answer your specific question about freeing up memory inside a workflow while it runs, once the data is no longer needed: I can definitely make a feature request for that!

Also, for self-hosted users there are a lot more options for controlling query/block run times and data limits, which can be set higher than what we allow for cloud users!

Check out the docs here for those limits and the differences between self-hosted and cloud.