I need your help.
Hey @Geool! Is your transformer stand-alone (i.e. a separate code item in the bottom left), or in a success event handler in your REST query? If it's the former, you may wish to try the latter so the transformer only executes when your REST query returns data.
So it's a single transformer because I get 5 tables with a lot of data (50k+ rows per table) and then do a merge according to defined criteria to get a single table at the end.
But the queries for the 5 tables are long, even very long...
Is there a method to optimize the loading time of the 5 tables?
Is there a method in my transformer to wait for the results of all 5 tables before executing?
Thanks in advance
You can use async/await on the different query triggers.
For instance, you might do something like this, triggering the queries at the start of your script:
const mdlwPromise = mdlw_getAllContracts.trigger().then(formatDataAsArray);
const docusignPromise = docusign_get_evenelopes.trigger();
const [mdlw, docusign, /*...etc*/] = await Promise.all([mdlwPromise, docusignPromise, /*...etc*/]);
Then continue with the script as you already have it.
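To illustrate why triggering everything up front helps, here's a minimal plain-JavaScript sketch with `setTimeout` standing in for the query triggers (the names and delays are made up):

```javascript
// Simulated "queries" that each take some time to resolve.
const fakeQuery = (name, ms) =>
  new Promise((resolve) => setTimeout(() => resolve(name), ms));

async function runInParallel() {
  // All three start immediately; total wait ≈ the slowest one,
  // instead of the sum of all three when awaited one by one.
  const [a, b, c] = await Promise.all([
    fakeQuery('mdlw', 30),
    fakeQuery('docusign', 20),
    fakeQuery('trello', 10),
  ]);
  return [a, b, c];
}

// Promise.all preserves the input order regardless of which finishes first.
runInParallel().then((results) => console.log(results));
```

Note that `Promise.all` rejects as soon as any one query fails; if you want the others to still complete, `Promise.allSettled` is the more forgiving option.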
Thank you very much @Kabirdas for your help.
This solution works very well indeed.
Maybe you can help me with another problem related to this.
My queries take a long, long time to run... from 10 seconds to seemingly forever at times... and it's quite random when the page is refreshed.
I call 3 different APIs and 1 DB ....
Is there a way to optimize this?
Is it a good practice to put the data in the cache?
Query caching is a great idea! There are some docs on it here. Since you're experiencing inconsistent query runtimes in particular there could be a number of things going on though. We have some general docs on debugging query performance, but it would be good to know the resource type of the queries and specifically how each one is behaving.
I confess that I do not understand why the requests take so much time...
I've been trying to optimize all this for 2 days but nothing works...
The requests take between 10 and 400 seconds to execute despite the data caching...
If you could take a look at the code at some point I'd be interested, because I'm not making any progress...
From the screenshot you posted it looks like there are a few REST queries along with a few Postgres queries. Is there any difference in how the two are behaving? It might also help if you could share some example screenshots of the query runtimes from a couple of the longer-running requests:
Here are the screenshots as requested:
Can you explain why this request takes 47 seconds to execute while in Postman, for example, it takes less than 1 second?
The time is random but never less than 15-20 seconds, which is a lot for a user.
Also, what is the frontend (post-processing) time?
Same thing for these two requests: long to execute, while in Postman or elsewhere it's 2-3 seconds maximum...
Thanks for your help.
It looks like there are at least two separate issues here:
- Post-processing is the time it takes for your frontend to update the model and re-render components once the data has been returned. What components are you rendering the data in? If it's a table, what column types are you using and what data mapping do you have in the component itself? Are there multiple components that are dependent on this data, maybe down the line (e.g. is there a table that renders this data, and then another component that renders based on a property of the table, etc.)?
- The backend execute resource and data transfer times respectively have to do with the amount of time it takes the Retool backend to query your API and receive data, and how long it takes to transfer that data to your frontend. Since you're hitting an EU endpoint this might cause additional latency but I'm not sure that would account for the full 15 seconds from the screenshot. Would you mind if I stepped into your app and tried running the requests to see how the request times differ when run from a machine closer to the Retool host server?
I'm also curious to know where you're seeing the most variance in the query run times. You mentioned it always takes 15/20 sec to run, is that time distributed across backend and post-processing? Or does one of them seem to be consistently slow?
Yes you can look at the application.
The data isn't loaded into anything visible.
It goes into the inputs to get exact matches; then, once the search is launched, it appears in a table, and finally, once a row is selected, the data appears in a modal...
And even with cached data the query takes time to run.
But I don't understand why the frontend time is specific to mdlw_get_allcontracts when it isn't that query that's called in the application, only _fullData_mdlw_tre_doc_prest...
Feel free to take a look to help me clarify these problems.
Thanks for letting me take a look at the app! I'm seeing similar execute resource times with a significantly lower transfer data time, which might indicate that some of the issue is distance latency:
If you run mdlw_get_allContracts using a tool like psql, are you also seeing significantly shorter execution times?
The post-processing time that's happening does seem to be a result of the JS code you're running. You can test it with something like this after the await Promise.allSettled call in your code:
const start_t = new Date();
// ... your code here ...
console.log('time to run:', new Date() - start_t);
When trying that on my end the time reported there seemed to match with the post-processing time for the query. Can you let me know if you see the same on your end?
It might be useful to introduce some kind of pagination but I'm not sure if that would fit your use case, is there a particular reason that you're fetching all of the data at once?
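If pagination does fit, the client-side loop could look roughly like this (plain JavaScript; `fetchPage` is a hypothetical stand-in for a parameterized query that takes limit/offset values, e.g. via additionalScope in Retool):

```javascript
// Fake data source standing in for the backend table.
const ALL_ROWS = Array.from({ length: 23 }, (_, i) => ({ id: i }));

// Stand-in for a query that hits the backend with LIMIT/OFFSET.
async function fetchPage(offset, limit) {
  return ALL_ROWS.slice(offset, offset + limit);
}

// Keep requesting pages until a short (final) page comes back.
async function fetchAllPaginated(limit = 10) {
  const rows = [];
  let offset = 0;
  while (true) {
    const page = await fetchPage(offset, limit);
    rows.push(...page);
    if (page.length < limit) break; // last page reached
    offset += limit;
  }
  return rows;
}
```

With server-side pagination each round trip returns a small payload, which mainly helps the transfer and post-processing times rather than the backend execution time.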
Thank you for all your answers.
Here is the result I get with one of my longest queries, the Odoo one.
As for SQL, I ran the query in DBeaver and here is the result, which is instantaneous.
I also ran two EXPLAINs to see the difference between Retool and DBeaver: here are the two results.
As for the data retrieval, I retrieve all the data to put it into a single table, merged together or not. And I retrieve it to inject into my inputs in order to get exact searches/matches... Why didn't I paginate? Simply because I don't know how to do it and I've never done it...
Maybe this can solve the problem?
I think paginating would make that one query go faster, but since it looks like there might be a significant amount of latency because of the distance to the server, making more requests to your DB might not be ideal. You might consider using something like localStorage to store your data. How often does the data update? Is it possible to request just the updates?
Also, how much of your data does the user need to act on? There may be other filters that are beneficial as well.
As mentioned here EU hosting is a project that's being worked on! Hopefully, that helps your query performance as well when it arrives.
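Requesting just the updates might look something like this sketch (plain JavaScript; the field names and the `fetchRowsSince` helper are hypothetical, standing in for a query filtered server-side on a timestamp column):

```javascript
// Local cache of rows already fetched (would live in localStorage in practice).
let cache = [
  { id: 1, created_at: '2023-01-01' },
  { id: 2, created_at: '2023-01-02' },
];

// Stand-in for a query filtered server-side, e.g. WHERE created_at > $1.
async function fetchRowsSince(lastSeen, source) {
  return source.filter((r) => r.created_at > lastSeen);
}

// Only ask the backend for rows newer than the newest one we already have.
async function refreshCache(source) {
  const lastSeen = cache.reduce(
    (max, r) => (r.created_at > max ? r.created_at : max), '');
  const updates = await fetchRowsSince(lastSeen, source);
  cache = cache.concat(updates); // ~50-100 new rows/day, not the whole table
  return updates.length;
}
```

Since the data mostly grows by small daily additions, each refresh then transfers a few dozen rows instead of 50k+ per table.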
Okay, I understand the latency better then, if it's communicating with the American servers.
As for localStorage, anything is conceivable really, as long as the development time isn't too big. But again, I don't know how to do it.
Can you help me with this? The database is called once in bulk, and the changes are minimal in reality because they will mainly be additions to the existing database, at a rate of maybe 50-100 rows per day.
Thanks again for your help.
It looks as though there's a significant amount of logic in the script you have that's processing the query results so I'm hesitant to make a super strong recommendation here. You might want to explore looking at the developer network to see if you can find someone who can work more closely on your app with you.
For localStorage, you can set it using the setValue method described here. Once you've done so, it should persist between apps. You can then access the values you've set using localStorage.values.
That being said, it's possible that refactoring your code could make it easier to paginate your query and also just increase performance generally and might be a better way to go than storing things in localStorage.
Thank you for your feedback.
It is not possible to delegate the work on this application.
As for localStorage, what should I put in it?
My JS files?
My calls to the Postgres DB?
My API calls to Trello, DocuSign, Prestashop, Odoo?
I'm a bit lost ^^
It's actually a built-in variable that you can reference using event handlers and scripts:
It's one of the ways Retool enables sharing data between apps (docs here).
After setting it you can access the same variable in other apps:
Note: I'm using a large dataset in the screenshots above which seems decently performant but I'm not doing a lot of other things with it. I don't know how much this will help and you'll still need to process the data at some point which I expect will take time.
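In plain-JavaScript terms, the caching pattern looks roughly like this (the `store` object is a stand-in for Retool's localStorage with its setValue/values interface, and the key name, TTL, and `getContracts` helper are just illustrative):

```javascript
// Minimal stand-in for Retool's localStorage (setValue / values).
const store = {
  values: {},
  setValue(key, value) { this.values[key] = value; },
};

const CACHE_TTL_MS = 60 * 60 * 1000; // refresh at most once an hour

// Serve from the cache when it's fresh; otherwise run the slow fetch
// and remember the result with a timestamp.
async function getContracts(fetchFresh) {
  const cached = store.values.contracts;
  if (cached && Date.now() - cached.savedAt < CACHE_TTL_MS) {
    return cached.data; // cache hit: skip the slow query entirely
  }
  const data = await fetchFresh();
  store.setValue('contracts', { data, savedAt: Date.now() });
  return data;
}
```

The idea is that only the first page load pays the full query cost; subsequent loads (or other apps) read the stored copy, though as noted above the post-processing of that data still takes time.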
I am trying to query Odoo - can you share how you've set up the "odoo_master_post_res_partner" query in Retool? Is it using Odoo's XML-RPC API, or are you interfacing with the underlying Postgres database?
For Odoo, I initialized it as an API resource with login/password.
Then I made a classic request in JSON-RPC.
POST: URL
Headers: Content-Type -> application/json
Body: raw JSON
[ YOUR_FILTERS : "name", "id", "city" ... ]
And in the transformer: YOUR_ALGORITHM
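For reference, a typical Odoo JSON-RPC body built around Odoo's documented execute_kw call might look like this sketch (the db/uid/password values, field list, and limit are placeholders):

```javascript
// Builds a JSON-RPC 2.0 body for Odoo's execute_kw, here doing a
// search_read on res.partner. Credentials are placeholders.
function buildOdooSearchRead(db, uid, password) {
  return {
    jsonrpc: '2.0',
    method: 'call',
    params: {
      service: 'object',
      method: 'execute_kw',
      args: [
        db, uid, password,
        'res.partner', 'search_read',
        [[]],                                           // domain: no filter
        { fields: ['id', 'name', 'city'], limit: 100 }, // trim the payload
      ],
    },
  };
}

// Example: this object would be POSTed to the /jsonrpc endpoint.
const body = buildOdooSearchRead('mydb', 2, 'secret');
```

Restricting `fields` and setting a `limit` server-side is one of the simplest ways to shrink the transfer and post-processing times discussed earlier in this thread.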