+1 here.
+1 here
I know this is an older thread, but I recently ran into a similar problem while writing a workflow: I needed to bulk upsert a large number of records into a Retool DB table with a composite primary key, and looping over them one by one risked hitting the Retool DB query limit.
What worked for me was to first convert the data I wanted to upsert into a single JSON string, then unpack that JSON in a SQL code block and upsert it there.
STEP 1 - create the JSON string:
This is a regular JavaScript code block in the workflow:
// 1. Get your source data
const records = {{ source.data }};
// 2. Define the exact column order. This must match your SQL table. In this example, column_name_1 and column_name_2 make up the composite key.
const columnOrder = [
  'column_name_1',
  'column_name_2',
  'column_name_3',
  'column_name_4',
];
// 3. Map the records into a 2D array, ensuring all values are strings.
const arrayOfArrays = records.map(record => {
  return columnOrder.map(colName => {
    const value = record[colName];
    return (value === null || value === undefined) ? null : String(value);
  });
});
// 4. Return the entire dataset as a single JSON string.
return JSON.stringify(arrayOfArrays);
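If it helps to see the Step 1 transformation outside of Retool, here is a standalone sketch with made-up sample records in place of the `{{ source.data }}` binding (the column names are the same placeholders as above):

```javascript
// Sample data standing in for Retool's {{ source.data }} binding.
const records = [
  { column_name_1: 'a', column_name_2: 1, column_name_3: 9.5, column_name_4: null },
  { column_name_1: 'b', column_name_2: 2, column_name_3: null, column_name_4: '2024-01-01T00:00:00Z' },
];

// Column order must match the SQL table / the ->> indexes in Step 2.
const columnOrder = ['column_name_1', 'column_name_2', 'column_name_3', 'column_name_4'];

// Map each record to an array of stringified values, preserving nulls.
const arrayOfArrays = records.map(record =>
  columnOrder.map(colName => {
    const value = record[colName];
    return (value === null || value === undefined) ? null : String(value);
  })
);

const payload = JSON.stringify(arrayOfArrays);
console.log(payload);
// [["a","1","9.5",null],["b","2",null,"2024-01-01T00:00:00Z"]]
```

Note that every non-null value becomes a string here; the casts in the SQL step (`::integer`, `::numeric`, `::timestamptz`) are what restore the real types.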
STEP 2 - unpack the JSON and upsert
This is a SQL code block:
WITH data_to_upsert AS (
  SELECT
    -- Use the ->> operator to get JSON array elements as text.
    -- NOTE: JSON arrays are 0-indexed!
    -- Then, cast the text to the correct column type.
    (d.row_data ->> 0)::varchar     AS column_name_1,
    (d.row_data ->> 1)::integer     AS column_name_2,
    (d.row_data ->> 2)::numeric     AS column_name_3,
    (d.row_data ->> 3)::timestamptz AS column_name_4
  FROM
    -- This expands the JSON array string into a set of processable rows.
    jsonb_array_elements({{ step1result.data }}::jsonb) AS d(row_data)
)
INSERT INTO your_table_name (
  column_name_1,
  column_name_2,
  column_name_3,
  column_name_4
)
SELECT * FROM data_to_upsert
-- Use your composite key here for conflict resolution. This assumes that
-- columns 1 and 2 make up the composite key.
ON CONFLICT (column_name_1, column_name_2)
DO UPDATE SET
  column_name_3 = EXCLUDED.column_name_3,
  column_name_4 = EXCLUDED.column_name_4;
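For intuition on what `ON CONFLICT ... DO UPDATE` does with a composite key, here is a plain-JS sketch of the same semantics on in-memory data (illustration only with made-up rows; the real merge happens inside Postgres):

```javascript
// Mimic of the composite-key upsert: the "table" is a Map keyed on
// the first two columns; a conflicting row overwrites columns 3 and 4,
// a non-conflicting row is inserted.
function upsert(table, rows) {
  for (const row of rows) {
    const key = `${row[0]}|${row[1]}`; // composite key (column_name_1, column_name_2)
    const existing = table.get(key);
    if (existing) {
      // ON CONFLICT ... DO UPDATE SET column_name_3/4 = EXCLUDED.column_name_3/4
      existing[2] = row[2];
      existing[3] = row[3];
    } else {
      // No conflict: plain INSERT.
      table.set(key, [...row]);
    }
  }
  return table;
}

const table = new Map([['a|1', ['a', '1', 'old3', 'old4']]]);
upsert(table, [
  ['a', '1', 'new3', 'new4'], // conflicts on (a, 1) -> columns 3 and 4 updated
  ['b', '2', 'x', 'y'],       // no conflict -> inserted
]);
console.log([...table.values()]);
```

After the call, the existing `(a, 1)` row carries the new values for columns 3 and 4, and the `(b, 2)` row is newly inserted.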
Hi @paul_mcardle, thanks for coming by with a workaround! This worked for me as a bulk upsert, although I had to slightly modify it, changing the jsonb_array_elements line to this:
jsonb_array_elements('{{ step1result.data }}'::jsonb) AS d(row_data)
to avoid this SQL error: syntax error at or near "["
As for this being implemented in the query GUI mode, there is still no update on our end. Until then, I would suggest using the above solution in SQL mode, or making sure your data sources have a unique primary key ID.
Hi all, just marking a solution so that anyone who comes to the thread can easily find the workaround. The thread will remain open, since there is still a feature request for this to be handled by GUI mode.