I have a Run JS Code query that gets data from my REST API. I need this JS query to fetch all the data, since the API has a limit per request. In my app I have a table that is filled with the response from the JS query. This all works OK, but the table only has data once the JS query has run, which means I need to keep the "Run this query on page load" option enabled, otherwise I have no data.
Is there any other option to always start with cached data, so the query doesn't need to run every time the page is loaded?
I saw there is a known bug where JS queries are not cached at the moment. I also tried a Query JSON with SQL query on top of the JS query, which works and has caching enabled, but it still needs the JS query to run first.
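For reference, the "fetch all pages" loop described above can be sketched roughly like this. The names (`fetchPage`, `PAGE_SIZE`) and the offset-based paging scheme are assumptions for illustration; in Retool the `fetchPage` call would wrap something like `apiQuery.trigger({ additionalScope: { offset } })` for a hypothetical REST API resource query:

```javascript
// Sketch of a JS query that pages through a rate-limited REST API,
// assuming the API accepts an `offset` parameter and returns at most
// PAGE_SIZE rows per call.
const PAGE_SIZE = 100; // hypothetical per-request limit

async function fetchAllPages(fetchPage) {
  const allRows = [];
  let offset = 0;
  while (true) {
    const page = await fetchPage(offset); // one API request per page
    allRows.push(...page);
    if (page.length < PAGE_SIZE) break; // a short page means we're done
    offset += PAGE_SIZE;
  }
  return allRows;
}
```

The JS query would `return fetchAllPages(...)` so the table can bind to the query's `.data`.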
On your side, you could cache the data in your own Postgres instance, or store it in Retool Database. Be aware that you then need to manage the data updates yourself.
Thanks, I am learning a lot these last few days. Good stuff.
So I have set up a PostgreSQL database, added the table, etc. All good.
I tried to use the Bulk upsert action type, but the query times out after 60 seconds.
I saw some topics about chunking and will explore that, but it is most likely way beyond my technical capabilities.
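The chunking idea itself is smaller than it sounds: it just means splitting the big array of records into batches so that each bulk upsert finishes well under the 60-second timeout. A minimal sketch of such a helper (the name `chunkArray` is made up for illustration):

```javascript
// Split a large array of records into batches of at most `size` rows,
// so each bulk upsert call stays small enough to finish before timeout.
function chunkArray(rows, size) {
  const chunks = [];
  for (let i = 0; i < rows.length; i += size) {
    chunks.push(rows.slice(i, i + size));
  }
  return chunks;
}
```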
Then I thought I'd just try a small dataset through Update/Insert, but that brings me back to my biggest challenge at the moment: understanding how to use a dataset with multiple records.
You'll want to pass in an array of records to update. This can be {{ table1.data }}, {{ yourApiQuery.path.to.your.data }}, or wherever the dataset you want to send comes from!
Most of the chunking examples rely on using additionalScope to send the data. If you go that route, you can use something like {{ chunk }}, which doesn't have to be defined when you write it into your query, as long as it's defined when you trigger the query!
In case you haven't bumped into this one in particular, here's an example script you can use in a JS query to trigger your bulk upsert query:
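A sketch of such a script, assuming a hypothetical bulk upsert query named `bulkUpsertRows` that references {{ chunk }} in its "Array of records to update" field, and an illustrative batch size of 500 (tune it to your data):

```javascript
// JS query that slices the dataset into batches and triggers the
// bulk upsert query once per batch, sequentially, so no single call
// hits the 60 s timeout. `chunk` is passed via additionalScope and
// picked up inside the upsert query as {{ chunk }}.
const CHUNK_SIZE = 500; // assumed batch size; adjust for your rows

async function upsertInChunks(rows, triggerUpsert) {
  for (let i = 0; i < rows.length; i += CHUNK_SIZE) {
    const chunk = rows.slice(i, i + CHUNK_SIZE);
    // In Retool this would be:
    // await bulkUpsertRows.trigger({ additionalScope: { chunk } });
    await triggerUpsert(chunk); // run batches one at a time
  }
}
```

Calling `return upsertInChunks(yourApiQuery.data, ...)` from the JS query makes Retool wait for all batches before marking the query finished.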