Feature request: workflow "queues"

Many of my workflows process lists of things - there are often a few loop steps to accomplish this. Oftentimes, a user is awaiting feedback from the workflow while it runs.

Having prepared my list, I'll then go through multiple loop steps, in which I will:

  • transform data
  • enrich
  • call an external API
  • save the results
  • prepare a response

Some of these steps involve a bit of latency, however - notably API calls, storage, document processing, etc. When these sit inside loops, the whole loop step can take some time to complete before the workflow moves on to the next step, even though some loop items are already processed and ready for that next step.

Let's say I am halfway through a list of 50 API calls in a loop step, so 25 are complete. The subsequent step is to trigger some kind of event in user space - some kind of feedback to a user. That feedback cannot occur until all 50 of the requests have completed, even though it would make sense to provide feedback after just one has completed.

Here's a concrete example: printing 50 asset tags.
The request comes into the workflow: "generate 50 tags and send them to a remote printer"

  • insert 50 tag records into the database
  • extract those 50 tags
  • loop through and enrich them
  • loop through and convert to the printer's format
  • loop through and send each tag to the printer
  • loop through and mark all 50 as printed
  • return a "completed" response
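Concretely, a rough sketch of that sequential flow in plain JavaScript might look like the following (insertTagRecords, enrich, convertForPrinter, sendToPrinter and markPrinted are all hypothetical helpers standing in for the real steps):

// Hypothetical sketch of the sequential flow above, as it might sit in a
// single workflow code block. Every loop must finish for all 50 tags
// before the next step starts, so the user hears nothing until the end.
const tags = await insertTagRecords(50);       // insert + extract the 50 tags
const enriched = [];
for (const tag of tags) enriched.push(await enrich(tag));
const printable = [];
for (const tag of enriched) printable.push(await convertForPrinter(tag));
for (const tag of printable) await sendToPrinter(tag);
for (const tag of printable) await markPrinted(tag);
return { status: "completed" };                // feedback only after all 50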

A FIFO queue comes to mind as a way to accomplish what I'm after.

Rather than waiting for all 50 tags to be prepared, it would be fine to send each tag to the printer as soon as it becomes available.

Using a queue this might look like:

  • insert 50 tag records into the database
  • extract those 50 tags
  • enqueue 50 tags and process each separately
    • enrich tag
    • convert to the printer's format
    • send tag to the printer
    • mark tag as printed
    • publish on websocket
  • return a "completed" response
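As a rough sketch (again with hypothetical helper names, and using per-tag concurrency rather than a literal FIFO queue), the per-tag pipeline might look like this:

// Hypothetical sketch of the queue-style flow: each tag runs through the
// whole pipeline independently, so per-tag feedback can go out as soon as
// the first tag finishes rather than after the whole batch.
const tags = await insertTagRecords(50);
await Promise.all(tags.map(async (tag) => {
  const enriched = await enrich(tag);
  const printable = await convertForPrinter(enriched);
  await sendToPrinter(printable);
  await markPrinted(tag);
  await publishOnWebsocket(tag);               // feedback per tag, not per batch
}));
return { status: "completed" };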

Probably the most Retool-ey way of doing this would be to run other workflows in a loop step, with some parallelisation. That would achieve much the same thing, but it means a single workflow run (in this use case) becomes 51 workflow runs.
If workflows run from other workflows didn't count toward your workflow run limit, this would be an excellent solution.

Hey @hansontools! This is an interesting idea, and I think there's definitely a variety of use cases for enqueuing certain operations. I can think of some workarounds - namely, asynchronicity - but the big question mark is error handling. I'll bring it up with the team!

After some tinkering, I think I've come up with something that kind of works. It's a little convoluted and very code-heavy, but it takes advantage of asynchronicity and, in particular, the "Functions" feature. Functions are inherently asynchronous, which makes them well-suited for this application.

I've defined the following function just to simulate latency, but this can be any resource query. As written, it will take 5s to run to completion.
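Roughly, that latency-simulating function might look something like this (a sketch - the name slowQuery and the setTimeout are just stand-ins for a real resource query):

// Hypothetical stand-in for a slow resource query: resolves after 5 seconds
function slowQuery(input) {
  return new Promise((resolve) => {
    setTimeout(() => resolve(input), 5000);
  });
}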

I can then execute this function from within a code block and, as long as I leave out the await keyword, it will execute asynchronously.
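For example (a sketch; slowQuery and currentTag are the same kind of hypothetical stand-ins as above):

// No await: slowQuery is kicked off, but the code block returns immediately
slowQuery(currentTag);
return { status: "queued" };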

This means that the workflow moves on to the response block while allowing those longer processes to finish. :+1: It's not perfect but it seems to work!

How would you go about using the data provided by the async function if it were to return some (like an API call)?
Throwing an array of function calls in an await Promise.all() should work, no?
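Something along these lines (a hypothetical sketch):

// Kick off every call, then collect all of the results in one place
const results = await Promise.all(tags.map((tag) => slowQuery(tag)));
// results[i] lines up with tags[i]

One caveat: awaiting Promise.all means the code block is back to waiting for the whole batch before it continues, and a single rejection fails the lot.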

You could make a safe wrapper for the calls? Something like this?

async function safeFetch(url) {
  // Always resolve with a predictable shape, even when the request fails
  const retVal = { error: null, data: undefined };
  try {
    const res = await fetch(url);
    retVal.data = res;
  } catch (e) {
    retVal.error = e;
  }
  return retVal;
}

This way you will always know the shape of the return and not be surprised by any errors getting thrown.
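Used together with Promise.all, a sketch might look like this (urls here is a hypothetical array of endpoints):

// safeFetch never throws, so one failed request can't reject the whole batch
const settled = await Promise.all(urls.map((url) => safeFetch(url)));
const failures = settled.filter((r) => r.error !== null);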


If you want to handle the response of an asynchronous function call, your best bet is to use .then syntax for chaining on subsequent functions!
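For instance, a sketch of that pattern (slowQuery and publishOnWebsocket are hypothetical helpers):

// Handle the result (or error) whenever it arrives, without blocking the code block
slowQuery(currentTag)
  .then((result) => publishOnWebsocket(result))
  .catch((err) => console.error("tag failed", err));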