(BigQuery) endless bulk insert

Running Retool Cloud, I gave the GUI bulk insert to BigQuery another try, and again it turned into an endless loop. None of Retool's safeguards kicked in: no timeout (supposedly 10s), no error message, no console message.

Facts:

  1. BigQuery is never actually called: there is nothing in the GCP logs, where a job insertion should appear
  2. the formatChangesToInsert() function (which massages the table changes) is not called either: it writes to the console, and nothing shows up there
  3. AlaSQL is not the culprit, as the function above never gets far enough to pass along its result
  4. Closing and reopening the project and the browser does not help
  5. most annoying of all: sometimes it works, though seldom

I am out of ideas on how to debug this further... Any help is appreciated. In the meantime, I will fall back to individual SQL inserts, which are much slower.

Note: a hand-written SQL bulk insert is not an option, as it is complex to implement; the GUI mode nicely parses an array of objects and fills in DEFAULT for any missing fields.
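To illustrate why the GUI mode is worth keeping, here is a rough sketch of what a hand-rolled bulk INSERT would have to do (function and column names are made up; real code would also need parameterization and proper type handling, which is exactly the complexity the GUI avoids):

```javascript
// Hypothetical helper: build a multi-row INSERT, emitting DEFAULT for
// any field a row is missing, like the GUI bulk-insert mode does.
function buildBulkInsert(table, columns, rows) {
  // Naive quoting for illustration only - real code should use
  // parameterized queries, not string interpolation.
  const quote = (v) =>
    typeof v === "number" ? String(v) : `'${String(v).replace(/'/g, "''")}'`;

  const values = rows
    .map(
      (row) =>
        `(${columns
          .map((col) => (row[col] === undefined ? "DEFAULT" : quote(row[col])))
          .join(", ")})`
    )
    .join(", ");

  return `INSERT INTO ${table} (${columns.join(", ")}) VALUES ${values}`;
}

// Example:
// buildBulkInsert("t", ["id", "name"], [{ id: 1 }, { id: 2, name: "x" }])
// → INSERT INTO t (id, name) VALUES (1, DEFAULT), (2, 'x')
```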

Hey @yiga2! Just tried a BigQuery bulk insert from the return value of a JS query, and the rows were inserted properly.

Can you share what type of query formatChangesToInsert is (i.e. JS, Query JSON with SQL, etc.)? It won't be triggered automatically in this case, so you need to ensure that it has been run and its data exists before the insertNewActions query runs.

Can you share the entire flow that you have (table changes => format changes => insert changes) so that we can take a look at what might be going on for you?
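That ordering requirement can be sketched as follows, with the two Retool queries from the thread stubbed out as plain objects for illustration (the stub bodies are invented; only the names come from the thread):

```javascript
// Stub of the formatting query: its .data only exists after it runs.
const formatChangesToInsert = {
  data: undefined,
  async trigger() {
    this.data = [{ id: 1 }]; // pretend this formats the table changes
    return this.data;
  },
};

// Stub of the insert query: it must see real data, not undefined.
const insertNewActions = {
  async trigger() {
    if (!Array.isArray(formatChangesToInsert.data)) {
      throw new Error("formatChangesToInsert has not run yet");
    }
    return `inserted ${formatChangesToInsert.data.length} row(s)`;
  },
};

// Correct flow: await the formatter first, then trigger the insert.
async function runFlow() {
  await formatChangesToInsert.trigger();
  return insertNewActions.trigger();
}
```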

Thanks @joeBumbaca - it turns out that the JS query I used was never run, so its data (the array) was empty, or more likely undefined. This situation is not caught by the bulk insert and leads to an endless run.
I have read that this was reported earlier, so I guess this is a simple yet useful feature request: validate the array (and check that it is not null/undefined) before calling the resource.
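The requested safeguard amounts to a one-line check. A minimal sketch (function name and usage are illustrative, not Retool's API):

```javascript
// Guard for a bulk-insert payload: it must be a real, non-empty array.
// Catches undefined, null, non-arrays, and empty arrays alike.
function isValidInsertPayload(rows) {
  return Array.isArray(rows) && rows.length > 0;
}

// Illustrative usage inside a JS query, before triggering the resource:
// if (!isValidInsertPayload(rows)) {
//   throw new Error("Bulk insert aborted: payload is empty or undefined");
// }
// insertNewActions.trigger();
```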

I have fixed this by switching to a JS transformer (i.e. a function) and using its return value - a transformer is always executed when invoked.
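The reason this is robust: a transformer boils down to a plain function that recomputes its result on every invocation, so the caller can never see stale or undefined data. A sketch (the change-set shape is assumed for illustration):

```javascript
// Transformer-style formatter: always returns a fresh array, even when
// the input is missing, instead of leaving data undefined.
function formatChangesToInsert(changeSet) {
  // Convert the table's change-set object into an array of row objects.
  return Object.values(changeSet ?? {});
}

// formatChangesToInsert({ 0: { id: 1 }, 1: { id: 2 } }) → [{ id: 1 }, { id: 2 }]
// formatChangesToInsert(undefined) → []
```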
