I have a workflow with a loop block that iterates over 10,000 records; inside the block it calls a multi-step function that makes an API call and saves to a database. When I run it for 100 records I get no issues, but for the full number of records it fails on the loop. If I run just one record on its own, e.g. one it failed on, the workflow runs correctly.
This suggests the loop runs into issues once there are too many records. Is there a guide on how many it can cope with?
Yes, you're probably running up against a Workflow limit. According to the official docs, a synchronous workflow run (or the synchronous portion up to the first webhook-response block) has a 15-minute timeout.
If you want to share a few screenshots of your workflow configuration and recent run history, I can help pinpoint which limit you hit (timeout, memory, concurrency, etc.).
There are a number of common techniques for staying within those limits while processing large sets of records: batching/chunking the data, splitting the job across multiple workflows, or scheduling the work in smaller increments (e.g. process a subset now, then resume later).
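As one rough sketch of the "smaller increments" idea (assuming a plain TypeScript/JavaScript code block; every helper name below is a placeholder I made up for illustration, not a Retool API):

```typescript
// Hypothetical sketch: each scheduled run processes one slice of the dataset
// and stores where it stopped, so the next run can resume from that point.
// All helper names below are placeholders, not Retool APIs.

type Rec = { id: string };

declare function loadOffset(): Promise<number>;                        // e.g. read from a key-value table
declare function saveOffset(offset: number): Promise<void>;            // persist the resume point
declare function fetchRecords(offset: number, limit: number): Promise<Rec[]>;
declare function processRecord(record: Rec): Promise<void>;            // the API call + database save

const CHUNK_SIZE = 500;

async function runIncrement(): Promise<void> {
  const offset = await loadOffset();
  const records = await fetchRecords(offset, CHUNK_SIZE);
  for (const record of records) {
    await processRecord(record);
  }
  // Record how far we got so the next scheduled run picks up here.
  await saveOffset(offset + records.length);
}
```

Scheduling something like this every few minutes keeps each individual run short while still working through the whole dataset over time.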
Hi @klautier ,
Retool can struggle when you run a large number of data points in one go. Instead of loading all of them at once, you can try batch processing (a rough sketch follows after these steps):
Split the dataset into small batches (e.g. 200 records each)
Loop over each batch
Inside the loop, process only those 200 items
The workflow finishes safely without timeout/memory issues
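A minimal sketch of that pattern (again assuming a plain TypeScript/JavaScript code block; `fetchAndSave` is a placeholder standing in for your existing multi-step function, not a Retool API):

```typescript
// Hypothetical sketch of batch processing: walk the 10,000 records in
// batches of 200 so memory use and per-run duration stay bounded.

type Rec = { id: string };

// Placeholder for the existing multi-step function (API call + database save).
declare function fetchAndSave(record: Rec): Promise<void>;

const BATCH_SIZE = 200;

async function processInBatches(records: Rec[]): Promise<void> {
  for (let start = 0; start < records.length; start += BATCH_SIZE) {
    const batch = records.slice(start, start + BATCH_SIZE);
    // Process one batch concurrently, then wait for it to finish
    // before moving on to the next one.
    await Promise.all(batch.map((record) => fetchAndSave(record)));
  }
}
```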
Thanks for reaching out on the Retool community with your question. I believe both Shawn and Vishal have some good suggestions about what could be the problem and some ways to mitigate it.
If you would like to explain a bit more about the Workflow's overall goal, what these 10k records are, some details on the error you are getting in the UI, and how this is all configured, we can try to troubleshoot further what else could be wrong.