Workflow running out of memory - garbage collection

Hey Retool team,

I’m running into an issue with a workflow that retrieves historical orders for multiple e-commerce merchants. The challenge isn’t a 139 error from any single block; rather, the workflow’s overall memory usage accumulates over the run and eventually hits the 1 GB limit. Some merchants have only a few orders while others have thousands, which leads to inconsistent failures.

Is there any way to clear memory or handle garbage collection in Workflows? In Retool apps, we can call function.reset() to free up memory (at least I assume it does), and something similar here would be incredibly useful: ideally, a way to discard JSON data that’s no longer needed after it has been processed or written to the database.
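One pattern I’ve been considering (not sure if it’s the intended approach) is to keep each large payload scoped inside a single loop iteration and return only a small summary, so the big order arrays are never all held at once and can be garbage-collected between iterations. A minimal Node-style sketch of the idea, where `fetchOrders` and `writeOrdersToDb` are hypothetical placeholders for the real REST and database query blocks:

```javascript
// Hypothetical stand-in for the REST query that pulls a merchant's orders.
async function fetchOrders(merchantId) {
  // Pretend each merchant returns a large order payload.
  return Array.from({ length: 1000 }, (_, i) => ({ merchantId, orderId: i }));
}

// Hypothetical stand-in for the bulk-insert database query.
async function writeOrdersToDb(orders) {
  return orders.length; // number of rows written
}

// Fetch, write, then drop the large payload; return only a tiny summary.
async function processMerchant(merchantId) {
  let orders = await fetchOrders(merchantId);
  const written = await writeOrdersToDb(orders);
  orders = null; // release the reference so the engine can reclaim the memory
  return { merchantId, written };
}

async function main() {
  const summaries = [];
  for (const id of ["merchant-a", "merchant-b", "merchant-c"]) {
    // Only small summary objects accumulate across iterations.
    summaries.push(await processMerchant(id));
  }
  return summaries;
}
```

The key point is that the loop’s accumulated state is a few small objects rather than every merchant’s full order history; I don’t know whether Workflows retains each block’s return value regardless, which is really what my question is about.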

Are there any best practices, workarounds, or solutions I should consider? Would love any insights!

Thanks in advance! :rocket: