Hi Aaron,
The workaround we ended up using was to create the upsert operation as a function (in the left navigation bar) instead of a workflow step.
Then, in a JavaScript workflow step, we did the following (using 10,000 rows as an example):
- Split the 10,000 rows into 20 batches of 500 rows each
- Fired 5 batches in parallel using the upsert function we created, waiting for the slowest one to finish
- Kept firing groups of 5 batches until all of them were processed.
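The batching loop above can be sketched roughly like this (names such as `upsertBatch`, `toBatches`, and `runUpserts` are illustrative placeholders, not Retool APIs; in the actual workflow step you'd call the upsert function you created in the left navigation bar instead of the stand-in passed in here):

```javascript
const BATCH_SIZE = 500;  // rows per upsert call (our "sweet spot")
const CONCURRENCY = 5;   // batches fired in parallel per group

// Split the full row set into batches of BATCH_SIZE rows
function toBatches(rows, size) {
  const batches = [];
  for (let i = 0; i < rows.length; i += size) {
    batches.push(rows.slice(i, i + size));
  }
  return batches;
}

// Fire CONCURRENCY batches at a time; Promise.all resolves only when
// the slowest batch in the group finishes, then the next group starts.
async function runUpserts(rows, upsertBatch) {
  const batches = toBatches(rows, BATCH_SIZE);
  for (let i = 0; i < batches.length; i += CONCURRENCY) {
    const group = batches.slice(i, i + CONCURRENCY);
    await Promise.all(group.map((batch) => upsertBatch(batch)));
  }
}
```

With 10,000 rows this yields 20 batches processed in 4 groups of 5, so the database only ever sees 5 concurrent upserts of 500 rows each.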
We found 500 rows per batch to be the "sweet spot" for us when considering database load, performance, and Retool workflow efficiency, but you can tune it as you wish.
This basically performs 20 smaller upsert operations instead of 1 huge one, and for us, this resolved multiple issues we'd had with the bulk upsert.
Hope this helps