Is there some way that within a workflow, we could modify its own triggers?
Context: data arrives in bursts into my database. When it does, I may need to process multiple entries quickly, which means running the workflow multiple times (loops are a pain I'm choosing not to investigate yet), so I want to trigger it every minute.
However, when things are calm, checking every minute is a waste of resources.
Is there some way to set something up where:
1. If there is data to process (I know how to check), run the workflow every minute until nothing is pending, then go to option 2.
2. If there is no data to process, run the workflow every hour to check for data, then go to option 1.
The reason is to limit the number of workflow runs when they aren't needed, to save on the bandwidth/runs quota of our plan - and to save some trees while we're at it by reducing our energy footprint.
Basically, is there some workflow block that I can use to alter the trigger of the workflow?
I have something similar where webhooks call my workflow. Most of the time it sends a single item, but sometimes it's an array of data. To handle it, I first check whether it's an array (or whether array.length > 1), and if so, I use a loop to call the workflow endpoint for each item in the array, one at a time.
In the webhook itself, that initial check on array.length is always === 1 for the looped calls, so that branch processes each item as normal.
If your data is just going into a DB, I don't think there is any way to have the DB trigger your workflow based on a row being added, but maybe your DB provider has a feature like that.
Now, if you don't care about how quickly your workflow runs after the data arrives, then you could take the same approach mentioned above and run it every hour with a flow like:
```
trigger -> hourly
check db for rows
  -> yes -> grab the first row -> process -> clear row -> call workflow again
  -> no  -> do nothing (don't call the workflow again)
```
Then your workflow would be triggered by the hourly CRON job, and just keep calling itself until it runs out of rows to process.
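The self-calling flow above can be sketched in a few lines of Python. This is only an illustration: sqlite3 stands in for your real database, and the recursive workflow call is simulated as a plain function call, where a real setup would make an HTTP request to the workflow's own endpoint.

```python
import sqlite3

# In-memory table standing in for the real DB; `processed = 0` means pending.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE records (id INTEGER PRIMARY KEY, payload TEXT, processed INTEGER DEFAULT 0)"
)
conn.executemany("INSERT INTO records (payload) VALUES (?)", [("a",), ("b",), ("c",)])

processed = []

def run_workflow():
    """One workflow run: grab the first pending row, process it, clear it,
    then 'call the workflow again' if anything is still pending."""
    row = conn.execute(
        "SELECT id, payload FROM records WHERE processed = 0 LIMIT 1"
    ).fetchone()
    if row is None:
        return  # no rows -> don't call the workflow again
    row_id, payload = row
    processed.append(payload)  # the actual "process" step would go here
    conn.execute("UPDATE records SET processed = 1 WHERE id = ?", (row_id,))
    run_workflow()  # recursive call = "call workflow again"

run_workflow()
print(processed)  # ['a', 'b', 'c']
```

The hourly CRON trigger would kick off the first call; each run then hands off to the next until the table is drained, so quiet hours cost exactly one run.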
Hi @DocShades, there is no built-in way to achieve this, but if you have a way to check whether there is data to process, what about running the workflow every hour and using a recursive workflow to achieve this?
Just what I was looking for. Using a workflow to run a workflow could work. But it would be nice to have something in the trigger menu. A cronjob+.
I would check it every 5 minutes or so. It would be nice to be able to run a code block / query in the trigger to check the database, something like:
```
SELECT COUNT(*) AS counter FROM records
WHERE processed = false
```
And then use {{ trigger_block.value.counter > 0 }} to activate the workflow, to keep workflow runs low - since we only get 5k, and 75 EUR extra once you hit run 5001 is silly.
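For what it's worth, the gating check itself is trivial to express. A minimal sketch, assuming a `records` table with a `processed` column (sqlite3 stands in for the real database, and `should_run` plays the role of the `{{ trigger_block.value.counter > 0 }}` expression):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, processed INTEGER)")
# Two pending rows, one already handled.
conn.executemany("INSERT INTO records (processed) VALUES (?)", [(0,), (0,), (1,)])

counter = conn.execute(
    "SELECT COUNT(*) AS counter FROM records WHERE processed = 0"
).fetchone()[0]

# The condition the hypothetical trigger would evaluate before firing.
should_run = counter > 0
print(counter, should_run)  # 2 True
```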
I select the first record from a database table with jobs
Multiple workflow blocks follow.
Then I mark the job as processed = true with an sql update query.
After that I retrieve the next job in the next block in the same workflow run. Something like select * from jobs where processed = false LIMIT 1;
I use a branch to check whether the result of that block actually contains a record. If not, the workflow will simply run again in an hour.
If it does have a record, I first compare the primary id of this record with the record from step 2. I don't want an endless loop in case the job can't be processed for some reason. If they are not the same, it can go ahead.
Then run a workflow block and select the current workflow.
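The steps above can be sketched as follows. Again this is only an illustration: sqlite3 stands in for the jobs table, and the final "run a workflow block and select the current workflow" step is simulated as a recursive function call. The id comparison is the guard against an endless loop on a job that refuses to clear.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE jobs (id INTEGER PRIMARY KEY, name TEXT, processed INTEGER DEFAULT 0)"
)
conn.executemany("INSERT INTO jobs (name) VALUES (?)", [("job1",), ("job2",)])

log = []

def run_workflow():
    # Step 1: select the first unprocessed job.
    job = conn.execute(
        "SELECT id, name FROM jobs WHERE processed = 0 LIMIT 1"
    ).fetchone()
    if job is None:
        return
    log.append(job[1])  # the "multiple workflow blocks" would run here
    # Step 2: mark the job as processed.
    conn.execute("UPDATE jobs SET processed = 1 WHERE id = ?", (job[0],))
    # Step 3: retrieve the next job in the same run.
    nxt = conn.execute(
        "SELECT id FROM jobs WHERE processed = 0 LIMIT 1"
    ).fetchone()
    if nxt is None:
        return  # branch: no record -> let the hourly trigger take over
    # Step 4: guard - if the "next" job is the one we just handled,
    # the update didn't stick; bail out instead of looping forever.
    if nxt[0] == job[0]:
        return
    run_workflow()  # "run a workflow block and select the current workflow"

run_workflow()
print(log)  # ['job1', 'job2']
```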
No, it's not. Sorry for stating a hypothetical situation, but since there's a 5k cap, I'm hesitant to use it. It does cause me some anxiety. I'd rather be on the safe side.