Goal: I am trying to run a Postgres query to extract data and ultimately publish it to a Google Sheet, where I just clear and replace all of the data in the sheet tab every time it runs. My columns are persistent; my rows (including the primary key, id) are not — they can change dynamically.
Here's what I have done:
- set up an app
- created a Postgres query, "IceVaultCommerceProductsSoldToday", with my SQL
- set up a table in my app that references my query called: icevault_commerce_table1
- set up a second query, this time using Google Sheets (which I've authorized)
- I selected "bulk update a spreadsheet by primary key"
- I selected my spreadsheet and entered my sheet name
- I added my primary key as my first column "id"
- For my "array of rows to update", I simply did: {{icevault_commerce_table1.data}}
What's working:
When the number of rows stays the same and the ids in the id column don't change, it correctly updates the values in the rest of the columns.
What's not working:
When new rows of data come in (new ids), they don't get added to the Google Sheet.
Question:
Is there a better way to do this? It feels like a very simple, basic thing (which is about all I'm capable of), but maybe I'm missing something super obvious. My columns are persistent, but my rows are dynamic, including id.
Thanks!