To update some tables in our Postgres DB we have to perform some calculations in a workflow.
To do this we use a Python code block, which works fine as long as the computation time stays below 60 seconds. However, on occasion the execution time exceeds the 60-second timeout period, resulting in a timeout error.
Is there a way to increase the timeout period, or is there another workaround for this issue?
Thanks in advance!
Hey Wenkey - welcome to the forum!
To my knowledge the block timeout maximum is currently set at 120000 ms (2 minutes).
Depending on the use case you can take different approaches: batching into smaller sizes, ensuring the computations finish under 60 s, seeing if the endpoint timeout duration can be increased, etc.
I'm sure someone will be able to assist you given more details!
Thank you for the nice welcome and for the quick reply.
It seems like the 2 min limit only applies to query blocks; for code blocks it seems to be 60 s only, no matter which number we put in - even values less than 60 s.
We now made smaller batches that fit into 55 s blocks and repeat the process every minute until everything is in place.
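For anyone landing here with the same problem, the batching workaround can be sketched roughly like this. This is just an illustration, not the actual code from the workflow: `process_batch` is a placeholder for whatever per-batch computation and DB update you do, and the 55 s budget is the value mentioned above.

```python
import time

TIME_BUDGET_S = 55   # stay safely under the 60 s block timeout
BATCH_SIZE = 500     # hypothetical chunk size; tune to your data

def run_within_budget(rows, process_batch, budget_s=TIME_BUDGET_S):
    """Process `rows` in batches until the time budget is spent.

    Returns the rows that were NOT processed yet, so the next
    scheduled run (e.g. every minute) can pick up where this one
    left off.
    """
    start = time.monotonic()
    remaining = list(rows)
    while remaining and time.monotonic() - start < budget_s:
        batch, remaining = remaining[:BATCH_SIZE], remaining[BATCH_SIZE:]
        process_batch(batch)  # placeholder: compute + write to Postgres
    return remaining
```

The key point is checking the elapsed time before each batch rather than after, so a run never starts a batch it might not finish inside the budget.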
Hey @Wenkey! Python blocks should now respect the timeout setting up to the normal limit of 120s. Can you let me know if you're still having issues?
Thank you for the great support and the adaptation. We just tested it and it works fine now.
Hey folks! Just want to report back here that as of 3.2.0 the regular timeout setting should be respected for Python blocks!