How to efficiently make large amounts of API calls in JS/Retool?

Got it! I'm not entirely sure this is the issue you're running into, but if you define the function recursively, adding a delay won't actually reset the call stack.

Since the call stack is first in, last out, call 0 can't leave the stack until call 1 completes, call 1 has to wait for call 2, and so on, so you'll end up with all 300k calls sitting on the stack at some point. Using for loops gets around this, since call 0 can complete independently of call 1.

As long as the logic packages nicely into that format (which it seems to here), for loops are likely the way to go.
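For illustration, here's a minimal sketch of the two shapes (the uploadRowRecursive name is just for this example; uploadCall and rows are the ones from your code):

// Recursive: the frame for row i can't be popped until the call for row i + 1
// returns, so all 300k frames pile up on the stack.
function uploadRowRecursive(i) {
  if (i >= rows.length) return;
  uploadCall(rows[i], i);
  uploadRowRecursive(i + 1);
}

// For loop: each iteration finishes before the next one starts, so the stack stays flat.
for (let i = 0; i < rows.length; i++) {
  uploadCall(rows[i], i);
}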

You can also do something like this to run the queries in parallel.
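As a rough sketch, assuming uploadCall returns the promise from the underlying query trigger:

// fire every upload at once and wait for all of them to finish
const results = await Promise.all(rows.map((data, i) => uploadCall(data, i)));
console.log("Finished " + results.length + " uploads");

With 300k rows you probably don't want to fire them all at once like this, which is where batching comes in.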

How you batch your queries will depend on how you're choosing to trigger them. Using a for loop, you could do something like this (note that there's no delay between batches):

const batchSize = 100;
// round up so a final partial batch still runs
const batchLimit = Math.ceil(rows.length / batchSize);

for (let batchNumber = 0; batchNumber < batchLimit; batchNumber++) {
  const batchStart = batchNumber * batchSize;
  // clamp the last batch so it doesn't run past the end of rows
  const batchEnd = Math.min(batchStart + batchSize, rows.length);

  console.log("Running batch " + batchNumber);
  for (let i = batchStart; i < batchEnd; i++) {
    const data = rows[i];
    console.log("Running query for row " + i);
    uploadCall(data, i);
  }
}

console.log("Finished running all batches");

Using await uploadCall only matters if you want the current request to finish before the next one is triggered. At the moment, uploadCall waits for UploadGameStatsToFirebase, but runQuery doesn't wait for uploadCall, so the await is effectively ignored.
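If you do want the uploads to run one at a time, something like this should work (a sketch assuming the usual trigger({ additionalScope }) pattern; the scope keys here are placeholders for whatever UploadGameStatsToFirebase expects):

// uploadCall has to *return* the trigger promise so that awaiting it actually waits
async function uploadCall(data, i) {
  return UploadGameStatsToFirebase.trigger({
    additionalScope: { data: data, rowIndex: i }, // placeholder keys
  });
}

// then, inside runQuery, await each call so the next upload only starts
// once the previous one has finished
for (let i = 0; i < rows.length; i++) {
  await uploadCall(rows[i], i);
}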

Let me know if any of that is helpful! There are a couple of options here; I'm curious to know which one you'll take and will try to answer any questions that come up related to it (e.g. other forms of batching) :slightly_smiling_face: