External Data Source status 503

  • Goal: Query large datasets from Snowflake in Retool

  • Problem: Queries over large datasets return a {"status": 503, "message": ""} error.

  • Steps: I checked the query timeout; by default it is 10 seconds (the field is greyed out), so I set DBCONNECTOR_QUERY_TIMEOUT_MS to 30000 (30 seconds), as shown in the snippet below.
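
For reference, this is roughly how I set it in my self-hosted deployment's environment file (the file name and format below are illustrative; it depends on how your instance is deployed):

```
# Environment for the self-hosted Retool instance (illustrative)
DBCONNECTOR_QUERY_TIMEOUT_MS=30000   # value is in milliseconds (30 seconds)
```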

Increasing the query timeout helped, but I am still getting a 503 error, and it only happens with large datasets.
Small datasets I can query without any issue.

Any help would be appreciated.

Thank you!!

Hi @cosmos,

There are two options: either continue to increase the DB connector timeout, or optimize your query so it grabs smaller 'batches' of the larger result set.

For example, if you are displaying the Snowflake data in a Table component, you could use server-side pagination to query only the small number of rows currently being displayed. When users navigate to the next page of the table, another small query runs to fetch the next page's worth of rows.
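
As a rough sketch (Snowflake SQL), the query behind the table could look something like the following. ORDERS and ORDER_ID are placeholder names, and the `{{ ... }}` values should come from your table component's pagination state; the exact property names vary by Retool table version.

```sql
-- Minimal server-side pagination sketch (placeholder table/column names).
SELECT *
FROM ORDERS
ORDER BY ORDER_ID    -- a stable sort keeps pages consistent between requests
LIMIT  {{ table1.pagination.pageSize }}
OFFSET {{ table1.pagination.pageSize * table1.pagination.currentPage }}
```

You'd typically pair this with a separate COUNT(*) query so the table knows the total number of rows for its pager.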

In most cases you don't need all the data at once, which is where server-side pagination can be super helpful. If you do need a very large amount of data all at once, you would need to increase the timeout settings both in Retool and on the Snowflake server/database. Alternatively, you could have a workflow loop over the query, grab the data in batches, and return the combined result to the app, along the lines of the sketch below.
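
As a sketch of the batching idea (Snowflake SQL), each loop iteration could run a query like the one below. ORDERS and ORDER_ID are placeholders, and `{{ value }}` is assumed here to be the loop block's zero-based batch index; adjust to match how your workflow passes the iteration value in.

```sql
-- Rough batching sketch for a workflow loop (placeholder names).
SELECT *
FROM ORDERS
ORDER BY ORDER_ID
LIMIT  10000                  -- batch size; tune to your data
OFFSET {{ value * 10000 }}    -- skip the batches already fetched
```

The loop would keep running until a batch comes back with fewer rows than the batch size, and the workflow then returns the concatenated batches to the app.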

All of these options have tradeoffs depending on the use case; hopefully this helps!