Issue with passing JSON results to a SQL query with prepared statements enabled

Hi there,

I'm trying to implement the following workflow:

User uploads a CSV of new products to a file dropzone. The data is parsed.

A JS function takes that data and pads all of the 'upc' values with leading zeros to ensure they meet our 14-digit standard.

That new array is used in a SQL query to insert the new product records to our database table.

Up until this point I've had no issues. This process has been running successfully for some time using a simple bulk insert from {{jsquery.data}} built in the GUI editor.

Our database has a uniqueness constraint on the 'upc' field, and we're running into issues where users upload 1,000-product catalogs that accidentally include 5 duplicates. In those instances, I don't want the upload to fail; I want it to simply skip those 5 (they're already in there, after all) and get the other 995 inserted.

I thought I could simply use an ON CONFLICT DO NOTHING statement, but this requires me to use the raw SQL editor, and I'm having all kinds of issues passing the values from the JS function into the SQL query. I think it's because I have the prepared statement functionality enabled on my database resource.

I'm curious how folks here might approach this. Hopefully I'm missing a feature or a workflow. In a perfect world, the simple GUI editor would include certain options like how to handle conflicts, but that doesn't seem to be the case.

Here's the JS function:

// To transform results function
const transformedData = fileDropzone13.parsedValue[0].map(row => {
  // Create a shallow copy of the row
  const newRow = { ...row };

  // Check if upc exists
  if (newRow.upc !== undefined && newRow.upc !== null) {
    // Ensure it's a string and pad if needed
    const upcString = String(newRow.upc).trim();
    if (upcString.length < 14) {
      newRow.upc = upcString.padStart(14, '0');
    }
  }

  return newRow;
});

return transformedData;

Hi @pod2,

You could remove the duplicates in JS before inserting, as long as you're aware this only de-duplicates within the upload itself; it won't prevent conflicts with rows already in the database.

Something like this should work:

const seenUPCs = new Set();

const transformedData = fileDropzone13.parsedValue[0]
  .map(row => {
    const newRow = { ...row };

    if (newRow.upc !== undefined && newRow.upc !== null) {
      const upcString = String(newRow.upc).trim();
      newRow.upc = upcString.padStart(14, '0');
    }

    return newRow;
  })
  .filter(row => {
    // Filter out duplicate UPCs
    if (!row.upc) return false; // skip rows without a valid UPC

    if (seenUPCs.has(row.upc)) {
      return false; // duplicate, skip it
    } else {
      seenUPCs.add(row.upc);
      return true; // unique, keep it
    }
  });

return transformedData;
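To make the behavior concrete, here's the same pad-and-dedupe logic run against a small made-up sample (the `rows` array below is hypothetical, just for illustration):

```javascript
// Hypothetical sample rows; the last entry duplicates the first once padded
const rows = [
  { sku: 'A1', upc: '123456789012' },
  { sku: 'B2', upc: '00123456789013' },
  { sku: 'C3', upc: 123456789012 }, // numeric UPC, same as A1 after padding
];

const seen = new Set();
const deduped = rows
  .map(row => ({ ...row, upc: String(row.upc).trim().padStart(14, '0') }))
  .filter(row => {
    if (seen.has(row.upc)) return false; // duplicate within this upload, skip
    seen.add(row.upc);
    return true;
  });

// deduped keeps A1 (padded to '00123456789012') and B2; C3 is dropped
```

Note that `padStart(14, '0')` is a no-op when the string is already 14 characters, so the explicit length check in the original isn't strictly needed.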

Otherwise, in raw SQL, something like this should work:

INSERT INTO products (name, upc, description, price)
SELECT 
  x.name,
  x.upc,
  x.description,
  x.price
FROM jsonb_to_recordset({{ transformUPCs.data }}::jsonb) as x(
  name text,
  upc text,
  description text,
  price numeric
)
ON CONFLICT (upc) DO NOTHING;

So, I used this:

INSERT INTO products (brand, sku, style_code, product_nm, upc, custom_field_1, custom_field_2, custom_field_3, custom_field_4, custom_field_5, unit_count, pack_quantity)
SELECT
  x.brand,
  x.sku,
  x.style_code,
  x.product_nm,
  x.upc,
  x.custom_field_1,
  x.custom_field_2,
  x.custom_field_3,
  x.custom_field_4,
  x.custom_field_5,
  x.unit_count,
  x.pack_quantity
FROM jsonb_to_recordset({{ fourteenDigitUpcConversion.data }}::jsonb) AS x(
  brand text,
  sku text,
  style_code text,
  product_nm text,
  upc text,
  custom_field_1 text,
  custom_field_2 text,
  custom_field_3 text,
  custom_field_4 text,
  custom_field_5 text,
  unit_count numeric,
  pack_quantity numeric
)
ON CONFLICT (upc) DO NOTHING;

...and got the following error:

"invalid input syntax for type json"

If you write:

FROM jsonb_to_recordset({{ fourteenDigitUpcConversion.data }}::jsonb) AS x(...)

but fourteenDigitUpcConversion.data is a JavaScript array or object, Retool substitutes it directly rather than as a JSON string, so your query ends up with something like:

jsonb_to_recordset([object Object],[object Object]::jsonb)

That's not valid JSON, hence the error.
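You can see this coercion in plain JavaScript: when an array of objects is interpolated as text, each object becomes "[object Object]", while JSON.stringify produces valid JSON that ::jsonb can parse (the `data` array here is made up for illustration):

```javascript
const data = [{ upc: '00123456789012' }, { upc: '00123456789013' }];

// Implicit string coercion, roughly what happens when the array is
// substituted into the query as text:
const coerced = `${data}`; // "[object Object],[object Object]"

// Explicit serialization produces valid JSON:
const serialized = JSON.stringify(data);
```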

Fix: Use JSON.stringify(...)

Ensure that the JS block returns properly stringified JSON before passing it into the query:

In your JS transformer/query:

const padded = fourteenDigitUpcConversion.data.map(row => {
  // ... pad logic
  return row;
});

// Return a stringified JSON
return JSON.stringify(padded);

Then in your SQL query:

INSERT INTO products (...)
SELECT
  x.brand, x.sku, x.style_code, x.product_nm, x.upc,
  x.custom_field_1, x.custom_field_2, x.custom_field_3,
  x.custom_field_4, x.custom_field_5, x.unit_count, x.pack_quantity
FROM jsonb_to_recordset({{ transformStringified.data }}::jsonb) AS x(
  brand text, sku text, style_code text, product_nm text, upc text,
  custom_field_1 text, custom_field_2 text, custom_field_3 text,
  custom_field_4 text, custom_field_5 text, unit_count numeric, pack_quantity numeric
)
ON CONFLICT (upc) DO NOTHING;

This appears to have worked. As always, very appreciative of the community's help!


Glad you got it working! Feel free to reach out anytime.
