Through some trial and error, I've learned there is a parameter limit on bulk insert.
My example uses a fileButton to upload and parse a .CSV with 48 columns and 161 rows. The following SQL bulk insert returns the error "Unprocessable Entity".
The only way I can get the insert to work with this data is to reduce the total number of fields (columns × rows).
Hoping some of you are importing CSVs and can point me in the right direction.
Full disclosure: I'm a non-developer lured in by the promise of low-code! It's going well so far - this is the first situation I couldn't Google my way out of.
Using a little JavaScript can be really helpful for batching your data so that you can insert it in smaller chunks. Of particular usefulness is the ability to trigger a query with additional scope (docs). Using that along with the _.chunk function, you can do something like:
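A minimal sketch of that pattern (component and query names like `fileButton1` and `bulkInsertQuery` are placeholders for your own; in Retool, `_.chunk` is provided by lodash, and this snippet includes a stand-in so it runs on its own):

```javascript
// Stand-in for lodash's _.chunk so this snippet runs standalone.
function chunk(arr, size) {
  const out = [];
  for (let i = 0; i < arr.length; i += size) out.push(arr.slice(i, i + size));
  return out;
}

// In Retool this would be something like:
//   const rows = fileButton1.parsedValue[0];
// Here we fabricate 161 rows to mirror the CSV in the question.
const rows = Array.from({ length: 161 }, (_, i) => ({ id: i }));

// 161 rows in chunks of 50 -> batches of 50, 50, 50, 11.
const batches = chunk(rows, 50);

batches.forEach(batch => {
  // In Retool you would trigger the insert query with the batch in scope:
  //   bulkInsertQuery.trigger({ additionalScope: { batch } });
  // and the insert query would reference {{ batch }} as its data.
  console.log(batch.length);
});
```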
This triggers your insert query once for each batch created by the _.chunk function, passing that batch along with it. That means your insert query can reference {{ batch }}. It'll show up as undefined in the editor, but that's only because it isn't defined until the moment the query is triggered from the script!
Nice @Kabirdas - I was just wondering where the batch size parameter was. Maybe something to include/support natively in the future. Is there a built-in hard limit, though?
My use case: I have a 22K×17 table in BigQuery that I want to copy into Retool DB for performance reasons, with filters pushed down to the database, since my analysis requires a lot of filtering (drill-downs). That table needs refreshing every day or so, hence a bulk upsert.
However, without custom batching, would the engine insert all 22K rows in one request/commit? I'd guess that would fail.
Also, I was experimenting with Workflows for the same task but am not sure how to insert in chunks there.
Note: don't run too many concurrent batches, as each one consumes a connection to Postgres (for retool_db). I got errors when querying from the DB dashboard saying all connections were in use.
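One way to avoid exhausting the connection pool is to cap how many batches are in flight at once rather than firing them all concurrently. A sketch (here `fn` stands in for something like `yourInsertQuery.trigger({ additionalScope: { batch } })`, which returns a promise in Retool; the limiter itself is plain JavaScript):

```javascript
// Run fn over items with at most `limit` calls in flight at a time.
// Results are returned in the same order as the input items.
async function runWithLimit(items, limit, fn) {
  const results = [];
  let next = 0;
  async function worker() {
    while (next < items.length) {
      const idx = next++; // safe: no await between read and increment
      results[idx] = await fn(items[idx]);
    }
  }
  const workers = Array.from(
    { length: Math.min(limit, items.length) },
    worker
  );
  await Promise.all(workers);
  return results;
}

// Demo with a fake async "insert" in place of a real query trigger.
const batches = Array.from({ length: 10 }, (_, i) => i);
runWithLimit(batches, 3, async batch => {
  await new Promise(resolve => setTimeout(resolve, 10)); // simulate I/O
  return batch;
}).then(done => console.log(done.length)); // 10 batches completed
```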
Adding more built-in support for loops and batching is definitely an idea that has come up before. And thanks for surfacing the issue of hitting the connection limit between Workflows and RetoolDB here as well. It's also something a number of other users have run into recently.
I'll report back here if there's an update pushed related to some of the internal tickets we have for tracking these issues!