Workflow loop with large data sets

Still getting to know Retool but loving it so far.

As I've gotten into Workflows for ETL management, I've run into an issue: when updating databases, the loop block's cached results grow too large for memory.

Ideally, I would like to get a list of values (variables) from my Retool Database, then fetch, transform, and save data related to a single value. In other words, one full workflow operation for one item from a static list, then continue to the next, and the next, etc. The goal is to avoid creating a gigantic cache in the loop step.

Is there a way to construct a workflow that loops through a list and makes the resulting REST calls one at a time, so it doesn't build a massive cache in the middle of the workflow?

Thank you for your advice!

Hey @allenfuller!

It's possible to add a Workflow block so that a separate Workflow is triggered for each item.
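As a rough sketch of that approach (the webhook URL, payload shape, and function name here are placeholders, not your actual Workflow's details), the parent Workflow's code block could call the child Workflow's webhook once per item, waiting for each run before starting the next:

```javascript
// Hypothetical sketch: trigger a child Workflow via its webhook once per item.
// Awaiting each POST keeps only one run's data in flight at a time.
// `webhookUrl` and the payload shape are placeholders -- substitute your own.
// `fetchFn` is injectable so the loop logic can be exercised without a network.
async function runChildWorkflowPerItem(items, webhookUrl, fetchFn = fetch) {
  const results = [];
  for (const item of items) {
    const response = await fetchFn(webhookUrl, {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ item }),
    });
    results.push(await response.json());
  }
  return results;
}
```

Because each item is handed off to a separate Workflow run, the parent loop never accumulates the fetched/transformed data itself.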

There's also an internal feature request for adding more options to loop blocks, but for now you can await each request before firing the next one by switching the code in the loop block to:

const results = [];
// Await each trigger so only one request is in flight at a time,
// rather than firing them all at once.
for (const [index, value] of iterableSource.data.entries()) {
  const result = await yourQuery_lambda.trigger();
  results.push(result);
}
return results;
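For contrast, here's a generic JavaScript sketch (not Retool-specific; `fetchItem` stands in for whatever per-item query you trigger) of the all-at-once pattern versus the sequential one above. With a large list, the parallel version holds every request and its intermediate data in memory at the same time:

```javascript
// Generic sketch: launch every request at once and wait for all of them.
// All requests are in flight simultaneously, which drives memory use up.
async function runAllAtOnce(items, fetchItem) {
  return Promise.all(items.map((value, index) => fetchItem(value, index)));
}

// The sequential alternative keeps only one request in flight at a time.
async function runOneAtATime(items, fetchItem) {
  const results = [];
  for (const [index, value] of items.entries()) {
    results.push(await fetchItem(value, index)); // wait before starting the next
  }
  return results;
}
```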


Does that sound like it could work for your use case?