I’m running into an issue with a workflow that retrieves historical orders for multiple e-commerce merchants. The challenge isn’t the 139 error from a single block but rather the workflow’s overall memory usage accumulating over execution, eventually hitting the 1GB limit. Some merchants have only a few orders, while others have thousands, leading to inconsistent failures.
Is there any way to clear memory or handle garbage collection in Workflows? In Retool apps, we can use function.reset() to free up memory (or at least I assume it does), and something similar here would be incredibly useful: ideally, a way to discard JSON data that's no longer needed once it has been processed or written to the database.
Are there any best practices, workarounds, or solutions I should consider? Would love any insights!
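For illustration, here's roughly the kind of thing I mean (block names like `getOrders` are just placeholders, not my actual setup): after the expensive fetch, the rest of the workflow only needs a small summary, but the full JSON for every merchant seems to stick around for the rest of the run.

```javascript
// JS block that runs after a hypothetical `getOrders` resource block.
// `getOrders.data` is assumed to be the full array of orders for one merchant.
const orders = getOrders.data;

// Aggregate down to the handful of fields the downstream blocks actually need.
const summary = {
  merchantId: orders[0]?.merchant_id ?? null,
  orderCount: orders.length,
  totalRevenue: orders.reduce((sum, o) => sum + Number(o.total || 0), 0),
};

// Only the summary becomes this block's output, but (as far as I can tell)
// the original `getOrders.data` payload stays in memory for the remainder of
// the run -- that's the part I'd like to be able to discard.
return summary;
```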
Great question! I'd definitely like to hear from other community members about their strategies and best practices for managing memory when handling large amounts of data in workflows.
One approach is to call sub-workflows for memory-heavy tasks inside of a larger workflow, since these execute in a separate sandboxed environment with more memory available.
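As a rough sketch of that pattern (the block names, batch size, and the exact way you trigger the sub-workflow are placeholders, not a prescribed setup): the parent workflow only decides which merchants to process, and each sub-workflow fetches, writes, and summarizes its own slice of orders, so the large payload never lives in the parent run.

```javascript
// Parent workflow JS block: split merchants into small batches so each
// sub-workflow invocation only ever holds one batch of order data in its
// own sandbox. `getMerchants` is a hypothetical block returning merchant rows.
const BATCH_SIZE = 25; // tune so each sub-workflow run stays well under the limit

const merchantIds = getMerchants.data.map((m) => m.id);

const batches = [];
for (let i = 0; i < merchantIds.length; i += BATCH_SIZE) {
  batches.push(merchantIds.slice(i, i + BATCH_SIZE));
}

// Hand each batch to the sub-workflow (for example via a loop block that
// triggers it), and have the sub-workflow write to the database itself and
// return only counts/IDs, so the parent never accumulates the full order JSON.
return batches;
```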
As for your specific question about freeing up memory inside of a workflow while it runs, once the data is no longer needed: I can definitely make a feature request for that!
Also, for self-hosted users there are many more options for controlling query/block run times and data limits, which can be set higher than what we allow for cloud users!