-
Goal: Inserting multiple rows into an SQL database in batches
-
Steps: The initial plan was to use the built-in 'bulk insert records' action to upload multiple rows at once; the records have at most 20 attributes each. This frequently failed with an error that carried no further message or indication of where it occurred (I forgot which error that was exactly). After a long search I tried batching the inserts, following Parameter Limit on Bulk Insert. That seemed to help sometimes, but now, no matter what I change, I get "Transaction has been aborted". I have tried changing the batch size and running the triggers sequentially with async/await, but nothing seems to help.
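For reference, the sequential async/await variant looked roughly like this (a sketch from memory; it assumes insertRows.trigger returns a promise that resolves when the query finishes):
const batchSize = 50;
const batchedData = _.chunk(table2.data.filter(d => d.added && !d.error), batchSize);
// Wait for each batch insert to finish before triggering the next one.
for (const batch of batchedData) {
  await insertRows.trigger({ additionalScope: { batch } });
}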
-
Details: The batcher code is structured like this:
const batchSize = 50;
// Only batch rows that are marked as added and have no error flag.
const batchedData = _.chunk(table2.data.filter(d => d.added && !d.error), batchSize);
// Trigger the insert query once per batch, passing the batch via additionalScope.
batchedData.forEach(batch => {
  insertRows.trigger({ additionalScope: { batch } });
});
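Note that the forEach does not wait for each trigger to resolve, so the batches run more or less concurrently; the sequential variant sketched above under Steps was my attempt to avoid exactly that, but it made no difference.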
The actual insertRows query is structured like this:
{{
  batch.map(d => {
    console.log(d); // never shows up in the console
    // Strip the bookkeeping fields before inserting.
    delete d.added;
    delete d.error;
    // Replace every falsy value (null, '', 0, false) with undefined...
    Object.keys(d).forEach(k => { d[k] = d[k] ? d[k] : undefined; });
    // ...except the amount columns, which default to 0.
    d.AMOUNT = d.AMOUNT || 0;
    d.AMOUNT_VAT = d.AMOUNT_VAT || 0;
    return d;
  })
}}
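One detail that may or may not matter: the delete calls and assignments above mutate the original table2.data rows in place. A non-mutating version of the same cleanup would be (sketch only; it yields the same values for the insert):
{{
  batch.map(d => {
    // Copy the row without the bookkeeping fields instead of deleting them in place.
    const { added, error, ...row } = d;
    Object.keys(row).forEach(k => { row[k] = row[k] ? row[k] : undefined; });
    row.AMOUNT = row.AMOUNT || 0;
    row.AMOUNT_VAT = row.AMOUNT_VAT || 0;
    return row;
  })
}}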
The console.log inside the map function does not log anything. Even a log statement placed before the batch.map call produces no output. I can, however, log the contents of the individual batches from the batcher code.
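Concretely, a log added to the batcher loop does print every batch:
batchedData.forEach(batch => {
  console.log(batch); // prints the batch contents fine
  insertRows.trigger({ additionalScope: { batch } });
});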
The error I get is: insertRows failed (3.437s): Transaction has been aborted.
Any insight into what might cause this behavior would be greatly appreciated.