Hey @jignesh - bulk update queries work on a table-by-table basis. Do your individual tables not have primary keys at all without joining to other tables and creating composite primary keys?
Thanks @jignesh! Yep, unfortunately our bulk update option only works with a primary key that’s one of the columns of the table that you’re trying to update. So unless you have a column in table_xy_join that represents the primary key, it’s not going to work. Do you think you’d be able to add in a placeholder column for that compound key or something like that?
Good news @jignesh - @alex had an idea that might work. Our GUI for SQL bulk updates is limited to a single primary key, but you can also just execute arbitrary SQL against your database as well via our editor. If you connect your resource as a “read” resource (you’ll see the two options when you connect a data source), you should be able to run UPDATE statements manually in the editor. Would that work?
@justin @alex
Can you please help me with how to do bulk updates from the table’s ‘recordUpdates’ array - updating only the changed data - using this manual update option?
Hey @jignesh! So the .recordUpdates property contains an entry for every column in your table - if any values have been updated (i.e. are editable in the table), those new values will appear in the property - if values haven't been updated, then those non-updated values will appear. For a quick example, let's say I have a table with 3 columns, and I update two of them:
I've only updated the first two columns (I changed the customer ID to 645, and the store ID to 21) - the email column has remained untouched. Here's what table1.recordUpdates will evaluate to:
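Assuming hypothetical column names `customer_id`, `store_id`, and `email` for the three columns in this example, the evaluated value would look roughly like this:

```javascript
// Rough shape of table1.recordUpdates after editing two cells in one row.
// Column names and values here are assumptions for illustration only.
const recordUpdates = [
  {
    customer_id: 645,          // edited: new value
    store_id: 21,              // edited: new value
    email: "jane@example.com", // untouched, but still included
  },
];
```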
You'll notice that even though we haven't updated the email column, it still appears - we do this to make it easier to issue UPDATE statements to your database / API. So if you wanted to issue an update query with this new data without using our SQL GUI, you could write something like this (depends on your SQL dialect, of course):
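A sketch of the idea in JavaScript, assuming the hypothetical columns from the example above plus an `id` primary-key column and a table named `customers` (a real query should use parameter bindings, not string interpolation):

```javascript
// Build one UPDATE that writes every column from recordUpdates[0],
// changed or not. Table and column names are assumptions for illustration.
// NOTE: string interpolation is for readability only - use bindings in practice.
const update = { id: 7, customer_id: 645, store_id: 21, email: "jane@example.com" };

const sql = `UPDATE customers
  SET customer_id = ${update.customer_id},
      store_id = ${update.store_id},
      email = '${update.email}'
  WHERE id = ${update.id};`;
```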
You'll notice that the email hasn't changed, but we're still including it here because it makes it easy to just have one update query that can handle any changes to your table (this query would still work if we only updated one column, or two, or three).
Note that .recordUpdates is a zero-indexed array, so if you update multiple rows you can reference each row’s updates by indexing .recordUpdates at [0], [1], [2], etc.
Thanks, @justin, for the quick response.
Yes, I understand everything so far - index-based record updates using a single-row update query.
“update multiple rows then you can reference those updates via indexing” - I didn’t get this part yet. How can I do this in Retool? Do I need another top-level JS query to loop through the updates? Any reference?
I’ve worked around this by creating a unique column which is a concatenation of the other columns, and you can use that as the primary key for the update
- Updating multiple rows with a bulk update query (using any data source: SQL, MongoDB, etc.) - to do this, you’ll need to create a query to update one row, and then use a JS query to apply that query to every updated row. I made a video for how to do that in Firebase here, but the logic applies to SQL as well: loom.com/share/91d5c6c566394731b5bd7b17fbf9b626
- Dealing with composite primary keys in SQL - discussed above in the thread
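The loop-one-row-at-a-time pattern from the video can be sketched generically like this; in Retool, `triggerFn` would be something like `(row) => updateRow.trigger({ additionalScope: { row } })`, where `updateRow` is a hypothetical single-row update query:

```javascript
// Generic pattern for a Retool JS query that applies a single-row update
// query to every entry in recordUpdates. triggerFn is whatever fires one
// update for one row and returns a promise.
async function applyUpdates(recordUpdates, triggerFn) {
  const results = [];
  for (const row of recordUpdates) {
    // Await each update so the JS query doesn't finish
    // before the database writes do.
    results.push(await triggerFn(row));
  }
  return results;
}
```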
I'd really like to use composite primary keys for bulk updates - any chance of support coming for this? I would add a virtual generated column, but one of my primary key columns is auto-incremented, and MySQL doesn't support generated columns based on auto-incremented columns.
@justin Follow-up question: I just followed your video to write my own JS query that iterates through recordUpdates and triggers a single-row update query. Just like in your video, my table doesn't update after running the JS query, even though I've got my select query set to run on completion. Manually refreshing works. Did you figure out what was going on there?
Hi @andrew2, I just followed the same video and hit the same issue: the table view doesn't update properly upon saving, and I need to press refresh again after saving.
What's happening is that an update to the main view happens immediately after the JS query completes, but the JS query completes virtually instantly because it doesn't wait for all of the other database queries it triggered to complete. In my case, the JS function completed in under 0.1 seconds and triggered an update to the view, but the actual SQL update queries only completed about 1.5 seconds later.
This can be fixed by forcing the JS function to complete only after all of the database update queries complete. The query.trigger(...) function returns a promise which can be awaited. In my case I did something like
```js
// Fire off one update per edited row, then wait for all of them to finish.
let promises = rows.map((data) => query.trigger(...));
await Promise.all(promises);
```
which still fires off the bulk updates in parallel, but will force the JS function to wait for them all to complete successfully before finishing.
Would it be possible to avoid all the row-by-row Promise approaches? Could this be a feature request for a composite primary key option in the bulk update/insert module, please?
Update - @justin I just discovered that your solution immediately saturates the 100 database connections GCP allows, so even the Promise approach doesn't work.
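One way to avoid saturating the connection pool is to cap how many updates run at once. A sketch, assuming a per-row `triggerFn` such as `(row) => updateRow.trigger({ additionalScope: { row } })` with a hypothetical `updateRow` query (the batch size of 10 is an arbitrary choice, not a recommendation):

```javascript
// Run at most `limit` updates concurrently instead of firing every row
// at once, so the database's connection cap isn't exhausted.
async function runInBatches(rows, triggerFn, limit = 10) {
  const results = [];
  for (let i = 0; i < rows.length; i += limit) {
    const batch = rows.slice(i, i + limit);
    // Each batch runs in parallel; the next batch starts only
    // after the current one fully completes.
    results.push(...(await Promise.all(batch.map(triggerFn))));
  }
  return results;
}
```

This trades some speed for safety: within a batch the updates still run in parallel, but the total number of in-flight connections never exceeds the batch size.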