I did some research but could not find an example close to what I need. I'm using the workflow feature to run a query, and based on its results, insert rows into a table. The issue is that the query can return up to 7,000 results, which causes the insert to time out.
To work around this manually, I've been limiting the first query to 1000 rows, running the insert, and repeating until the first query is empty.
I'd like to automate this process so it runs in one go, but I could not figure out how to make the loop block work.
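For what it's worth, the manual process you describe (cap the query at 1,000 rows, insert, repeat until empty) is just a chunked-insert loop. Here's a minimal Python sketch of that idea; `bulk_insert` is a hypothetical stand-in for whatever your workflow's insert step does, and the in-memory `table` is only there to make the example self-contained:

```python
BATCH_SIZE = 1000  # same cap you've been applying by hand

def chunked_insert(rows, bulk_insert, batch_size=BATCH_SIZE):
    """Insert `rows` in batches so no single insert runs long enough to time out."""
    inserted = 0
    for start in range(0, len(rows), batch_size):
        batch = rows[start:start + batch_size]
        bulk_insert(batch)        # one smaller insert per iteration
        inserted += len(batch)
    return inserted

# Fake data source and "table" just for illustration:
table = []
rows = [{"id": i} for i in range(7000)]   # ~7,000 query results
count = chunked_insert(rows, table.extend)
```

In a workflow loop block, the equivalent would be: run the limited query, insert the batch, and loop while the query still returns rows.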
I've been passing around this loop setup recently to help with error handling:
Is there a reason you need the 7,000 entries to be inserted one at a time? You should be able to pass the array of entries using the Bulk Insert or Bulk Update/Upsert via Primary Key using the GUI setup for your DB's table connection.
Hey @Vitor_Dal_Pra! I'll second @pyrrho's question and add that it doesn't feel like a bulk insert should time out with "only" 7000 records, unless you've left the timeout at 10s.
If you haven't already done so, try upping the timeout to 30s and let me know if it is still timing out. I'm happy to help you figure out the looping but it feels like you may not even need it!