Recursive bulk insert into Postgres database fails

I'm trying to bulk insert some records into a table. Table has ~15 columns — most are text, one is jsonb.

But the bulk insert is failing every time. There are about 350 records, which doesn't seem like that many to me, but maybe it is?

Just in case it is too many, I tried using a JS query to chunk the array of records into smaller groups and then recursively call the bulk insert query to insert one chunk at a time. This works somewhat, but partway through I always get an error, usually something like: "ROLLBACK - Client has encountered a connection error and is not queryable".
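For reference, the JS query is roughly the pattern below. This is a minimal sketch with placeholder names, not my exact code: `bulkInsertQuery` stands for the bulk-insert resource query (which reads `{{ records }}` from additionalScope), and `sourceQuery` stands for whatever query produces the ~350 record objects.

```javascript
// Sketch of the recursive, chunked insert inside a Retool JS query.
// Placeholder names: `bulkInsertQuery` is the bulk-insert resource
// query, `sourceQuery` is wherever the records come from.
const CHUNK_SIZE = 25;
const allRecords = formatDataAsArray(sourceQuery.data);

function insertChunk(startIndex) {
  // Stop once every record has been handed off.
  if (startIndex >= allRecords.length) {
    return;
  }
  const chunk = allRecords.slice(startIndex, startIndex + CHUNK_SIZE);
  bulkInsertQuery.trigger({
    // The bulk-insert query reads {{ records }} from additionalScope.
    additionalScope: { records: chunk },
    // Recurse only after the previous chunk succeeds, so chunks run
    // one at a time instead of all at once.
    onSuccess: () => insertChunk(startIndex + CHUNK_SIZE),
    onFailure: (err) => console.error(`Chunk at index ${startIndex} failed:`, err),
  });
}

insertChunk(0);
```

Chaining the next call in `onSuccess` is what keeps the chunks sequential; triggering them all in a loop would fire every insert at once.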

I tried setting various chunk sizes, with the following results:

  • 5 per chunk: bulkInsert query takes ~20 seconds and then fails after the 5th or so iteration (25ish records)
  • 10 per chunk: bulkInsert query takes ~40 seconds and then fails during the 3rd or so iteration (25ish records)
  • 25 per chunk: bulkInsert query fails before finishing

Other information that might be relevant:

  • The jsonb column in some of the records holds a lot of data
  • When querying the database directly (outside Retool), I can select the rows with smaller jsonb columns (407 records) in 4 seconds; pulling rows that contain larger jsonb columns takes much longer (110 rows in 60 seconds).
  • When Retool throws the connection error for the bulkInsert query, I also lose the connection to my database outside Retool.

Any ideas what might be happening here?

Welp, I may have just answered my own question.

It seemed like my database itself was crashing, so I tried resizing it (doubling the RAM) and now it works!

@kschuler can you share how you implemented the chunks? Trying to do the same but no luck yet.