Adding a delay/wait to an insert/update query in a loop to avoid BigQuery "exceeded rate limits" errors

Hi,

In a workflow I'm facing a problem with BigQuery rate limits when trying to insert/update data records.

The query for the upserts looks like this:

IF NOT EXISTS (
  SELECT ID
  FROM BigQuery.tablename
  WHERE AdrNr = CAST({{value.AdrNr}} AS INTEGER)
    AND AnsNr = CAST({{value.AnsNr}} AS INTEGER)
) THEN
  INSERT INTO BigQuery.tablename (
    ID,
    AdrNr,
    AnsNr,
    Namen
  )
  VALUES (
    CAST({{value.ID}} AS INTEGER),
    CAST({{value.AdrNr}} AS INTEGER),
    CAST({{value.AnsNr}} AS INTEGER),
    {{value.Namen}}
  );
ELSE
  UPDATE BigQuery.tablename SET
    ID = CAST({{value.ID}} AS INTEGER),
    AdrNr = CAST({{value.AdrNr}} AS INTEGER),
    AnsNr = CAST({{value.AnsNr}} AS INTEGER),
    Namen = {{value.Namen}}
  WHERE AdrNr = CAST({{value.AdrNr}} AS INTEGER)
    AND AnsNr = CAST({{value.AnsNr}} AS INTEGER);
END IF;

Sometimes the number of records to be upserted is more than 20, so I need to add some wait or delay to the loop to avoid the "exceeded rate limits" error from BigQuery.

Any idea how to achieve this?

Thanks

Hi there, you can add the delay on your loop query. See the screenshot below:

[screenshot: loop block settings with a delay configured]
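In case the screenshot is hard to read, the idea in the loop's Code mode looks roughly like this (a minimal sketch in plain JS — "records" and "runUpsert" are placeholder names, not your actual blocks):

for (const [index, value] of records.entries()) {
  await runUpsert(value); // your insert/update query for this record
  await new Promise((resolve) => setTimeout(resolve, 500)); // wait 500 ms before the next record
}

Increase the 500 ms if you still hit the limit.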

EDIT: see this earlier post for reference.

In the original loop, the workflow iterates through a JSON object that contains the data records for the insert/update and looks like this:

[screenshot of the JSON object — full contents posted below]

I'm not sure how to implement this in your example loop using {{value}} and {{index}}.

As it is, the loop doesn't get the values from the JSON objects.

In your first screenshot the iterable is called "code", so I assume the data you want to loop over sits in that piece of your workflow.

In your second screenshot you have this:

"for (const [index, value] of query6.data.data.entries()) {"

So you changed the iterable from "code" to "query6".

Also, is the .data.data correct? I'm not sure what the data you're trying to loop over looks like, but maybe there is one .data too many?

Otherwise the code looks good. It's also important that you added "await" ahead of the lambda function.
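Putting that together, the corrected loop could look like this (just a sketch — assuming the iterable block is called "code" and its output is the array itself; adjust the .data part to match what your block actually returns, and "runUpsert" stands in for the lambda that triggers your upsert query):

for (const [index, value] of code.data.entries()) {
  await runUpsert(value); // placeholder for the lambda running the insert/update
  await new Promise((resolve) => setTimeout(resolve, 500)); // delay between upserts
}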

The JSON data object I'm trying to insert/update in the loop looks like this:

JSON DATA
[
{
"Ort": "Vitoria",
"PLZ": "01015",
"Sel4": false,
"PfOrt": "",
"EMail2": "augusto@forwardergroup.com",
"Na1": "Firma",
"Sel5Kz": false,
"LandBez": "Spanien",
"Tel": "+3494555123",
"EMail1": "beatriz@forwardergroup.com",
"AspNr": 0,
"ErstBzr": "LS",
"Verteiler": 0,
"ID": 7241,
"StdLiKz": true,
"PfPLZ": "",
"Sel3": 0,
"AendBzr": "EK",
"PLZOrtInfo": "E-01015 Vitoria",
"Art": 0,
"LtzAend": "03.07.2020 14:18:49",
"Namen": "Firma FORWARDER Group, SL",
"Sel1": false,
"ErstDat": "01.09.2016 14:58:59",
"AnsNr": 0,
"LcManuellKz": false,
"StdReKz": true,
"Str": "Mendigorritxu, 12",
"Land": 724,
"LandKennz": "E",
"TelAbglManuellKz": false,
"IDString": "Adr.7040.7241",
"Sel2": false,
"AdrNr": "74464",
"Fax": "+3455598765",
"AendDat": "03.07.2020 14:18:49",
"Na2": "FORWARDER Group, SL",
"InfoKz": false,
"PLZInfo": "E-01015"
},
{
"Ort": "Přeštice",
"PLZ": "33401",
"Sel4": false,
"PfOrt": "",
"EMail2": "tereza@forwarder.com",
"Na1": "Firma",
"Sel5Kz": false,
"LandBez": "Tschechische Republik",
"Tel": "+42044555123",
"EMail1": "info@forwarder.com",
"AspNr": 0,
"ErstBzr": "LS",
"Verteiler": 0,
"ID": 7242,
"StdLiKz": true,
"PfPLZ": "",
"Sel3": 0,
"AendBzr": "EK",
"PLZOrtInfo": "CZ-33401 Přeštice",
"Art": 0,
"LtzAend": "03.07.2020 14:18:37",
"Namen": "Firma International Forwarder",
"Sel1": false,
"ErstDat": "01.09.2016 15:03:06",
"AnsNr": 0,
"LcManuellKz": false,
"StdReKz": true,
"Str": "Hlávkova 54",
"Land": 203,
"LandKennz": "CZ",
"TelAbglManuellKz": false,
"IDString": "Adr.7041.7242",
"Sel2": false,
"AdrNr": "74465",
"Fax": "+4204875467",
"AendDat": "03.07.2020 14:18:37",
"Na2": "International Forwarder",
"InfoKz": false,
"PLZInfo": "CZ-33401"
}
]
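As a side note: .entries() on an array like this yields [index, record] pairs, and AdrNr is a string here while ID and AnsNr are numbers, which is why the query CASTs them. A trimmed-down standalone example:

const records = [{ ID: 7241, AdrNr: "74464", AnsNr: 0 }]; // cut-down copy of the first record above
for (const [index, value] of records.entries()) {
  console.log(index);       // 0
  console.log(value.AdrNr); // "74464" (a string)
  console.log(value.ID);    // 7241 (a number)
}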

The previous structure of the workflow for this looked like this:

[screenshot of the previous workflow]

And it worked until it exceeded the BigQuery rate limits, after trying to insert several dozen data records.

In the new loop I tried to reference the code block, and you are correct, it must be "code3", not just "code", but nothing works. It doesn't matter if I try

code3.data.data.entries()
or
code3.data.entries()

I even connected the loop directly to the resource query with the REST request and tried the same things, but the values in this loop are always null or undefined.

Maybe try to debug the loop by adding this before the "await new Promise(...)" line:

console.log(JSON.stringify(value));
console.log(JSON.stringify(value.AdrNr));

I'm not a BigQuery expert, so I'm not sure if there is something wrong with the actual query, but if you are passing a wrong value into the query, debugging via console.log might reveal it.
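If the values are still null or undefined, it might also help to log the block's whole output first to see how it's nested (a sketch, assuming the block is called "code3" — the right property chain is whatever prints the plain array):

console.log(JSON.stringify(code3.data)); // is this the array itself, or wrapped one level deeper?
for (const [index, value] of code3.data.entries()) {
  console.log(JSON.stringify(value));       // the full record for this iteration
  console.log(JSON.stringify(value.AdrNr)); // the single field passed to the query
}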

Hope that helps.

Is this something you're still running into, @ConDev?

I'm curious to see the results when you try @Lukas' suggestion of console logging. You also mentioned that the loop block was correctly updating data until it hit the rate limit, right? If you switch to Code mode for that loop without changing anything, does it still run properly? Could you share a screenshot of what that looks like?

The error still occurs, so we now use the Retool database to insert the data first instead of BigQuery.