Hi there,
I'm trying to implement the following workflow:
User uploads a CSV of new products to a file dropzone. The data is parsed.
A JS function takes that data and pads all of the 'upc' values with leading 0's to ensure they meet our 14 digit standard.
That new array is used in a SQL query to insert the new product records to our database table.
Up to this point I've had no issues; this process has been running successfully for some time using a simple bulk insert from {{jsquery.data}} built in the GUI editor.
Our database has a uniqueness constraint on the 'upc' field, and we're running into issues where users upload a 1,000-product catalog with, say, 5 duplicates accidentally included. In those cases I don't want the upload to fail; I want it to simply skip those 5 (they're already in there, after all) and insert the other 995.
I thought I could simply use an ON CONFLICT DO NOTHING clause, but that requires the raw SQL editor, and I'm having all kinds of trouble passing the values from the JS function into the SQL query. I suspect it's because I have prepared statements enabled on my database resource.
I'm curious how folks here would approach this; hopefully I'm missing a feature or a workflow. In a perfect world, the simple GUI editor would include options for things like conflict handling, but that doesn't seem to be the case.
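For reference, here's the general shape of the raw query I was attempting. This is a sketch only: it assumes a Postgres resource, a hypothetical `products` table with `(upc, name, price)` columns, and a query named `transformQuery` that returns the transformed array. Serializing the array to a single JSON string parameter is one way to keep it compatible with prepared statements:

```sql
-- Sketch: pass the whole transformed array as one JSON text parameter,
-- expand it into rows server-side, and skip rows whose upc already exists.
-- Table name, columns, and "transformQuery" are placeholders for my setup.
INSERT INTO products (upc, name, price)
SELECT upc, name, price
FROM json_populate_recordset(NULL::products, {{ JSON.stringify(transformQuery.data) }})
ON CONFLICT (upc) DO NOTHING;
```

Because the array arrives as one string value rather than interpolated SQL, the statement stays parameterized; `json_populate_recordset` maps each JSON object's keys onto the table's column names.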
Here's the JS function:
```js
// Transform results: pad every 'upc' to our 14-digit standard
const transformedData = fileDropzone13.parsedValue[0].map(row => {
  // Create a shallow copy of the row
  const newRow = { ...row };

  // Check if upc exists
  if (newRow.upc !== undefined && newRow.upc !== null) {
    // Ensure it's a string and pad with leading zeros if needed
    const upcString = String(newRow.upc).trim();
    if (upcString.length < 14) {
      newRow.upc = upcString.padStart(14, '0');
    }
  }
  return newRow;
});

return transformedData;
```
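As a belt-and-suspenders step, I've also considered dropping duplicate UPCs within a single upload before the insert runs (that wouldn't catch rows that already exist in the database, only repeats inside the file). A minimal sketch, with `dedupeByUpc` being a name I made up:

```javascript
// Sketch: remove rows whose (already padded) upc repeats within the same
// upload. Keeps the first occurrence of each upc; rows with no upc pass
// through untouched.
function dedupeByUpc(rows) {
  const seen = new Set();
  return rows.filter(row => {
    if (row.upc === undefined || row.upc === null) return true;
    if (seen.has(row.upc)) return false;
    seen.add(row.upc);
    return true;
  });
}
```

This could run on `transformedData` right before returning it, so the insert query only ever sees each UPC once per batch.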