I'm building a frontend on a Supabase database. It is working quite well, but the database has to be filled with exports from a WMS. This specific WMS does not offer any API and is completely on-premise. However, for every order a JSON file is generated - about 100-200 a day. I'm looking for a solution to parse these local JSON files into the cloud-based database.
Probably through a Retool workflow, as the JSON needs some adjustments.
I'm wondering what possibilities I could consider to solve this.
As soon as an order has been processed and reaches a certain status, this JSON file is generated. Of course I can upload these files with a file upload field, but given the volume I'm looking for possibilities/ideas to automate this task.
Yes, otherwise you would have to put them somewhere else, like Google Drive, then call the Google API to read them in and insert them into the database...
How much data? Retool DB does have a limit...
You could get email to work, though it introduces some complexity and another service. I did something similar to parse attachments on order emails from a vendor: Postmarkapp has a service called Inbound Email that forwards an email to a webhook URL.
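The webhook side could look roughly like this - a minimal sketch, assuming Postmark's inbound payload with a base64-encoded `Attachments` array (verify against a test email) and a Supabase REST insert; the route, the `orders` table name and the env vars are placeholders:

```python
# Minimal sketch: receive Postmark's inbound-email JSON, decode any JSON
# attachment, and insert it into Supabase via its REST (PostgREST) endpoint.
# Table name "orders" and the env vars are assumptions for this example.
import base64
import json
import os

import requests
from flask import Flask, request

app = Flask(__name__)

SUPABASE_URL = os.environ["SUPABASE_URL"]          # e.g. https://xyz.supabase.co
SUPABASE_KEY = os.environ["SUPABASE_SERVICE_KEY"]  # service-role key, keep server-side

@app.post("/postmark-inbound")
def postmark_inbound():
    payload = request.get_json()
    for attachment in payload.get("Attachments", []):
        if not attachment["Name"].endswith(".json"):
            continue
        order = json.loads(base64.b64decode(attachment["Content"]))
        # ...apply whatever adjustments the WMS export needs here...
        resp = requests.post(
            f"{SUPABASE_URL}/rest/v1/orders",        # hypothetical table name
            headers={
                "apikey": SUPABASE_KEY,
                "Authorization": f"Bearer {SUPABASE_KEY}",
                "Content-Type": "application/json",
            },
            json=order,
            timeout=30,
        )
        resp.raise_for_status()
    return "", 200
```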
Hi @Ronaldvm, have you considered a Python script that runs locally, monitors the folder, and inserts the data straight into your db?
It would require a local machine with Python installed, but it is very efficient and only needs 20-30 lines of code.
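Something like this rough sketch: it polls the export folder, posts each new JSON order to Supabase's REST endpoint, and moves handled files into a `processed/` subfolder so they are only inserted once. The folder path, table name and the "adjustments" step are assumptions you would swap for your own:

```python
# Sketch of a local folder watcher that inserts WMS JSON exports into Supabase.
# Paths, the "orders" table name and env vars are placeholders for this example.
import json
import os
import shutil
import time
from pathlib import Path

import requests

WATCH_DIR = Path(r"C:\wms\exports")        # folder where the WMS drops JSON files
DONE_DIR = WATCH_DIR / "processed"
SUPABASE_URL = os.environ["SUPABASE_URL"]
SUPABASE_KEY = os.environ["SUPABASE_SERVICE_KEY"]

def insert_order(order: dict) -> None:
    resp = requests.post(
        f"{SUPABASE_URL}/rest/v1/orders",   # hypothetical table name
        headers={
            "apikey": SUPABASE_KEY,
            "Authorization": f"Bearer {SUPABASE_KEY}",
            "Content-Type": "application/json",
            "Prefer": "return=minimal",
        },
        json=order,
        timeout=30,
    )
    resp.raise_for_status()

def main() -> None:
    DONE_DIR.mkdir(exist_ok=True)
    while True:
        for path in sorted(WATCH_DIR.glob("*.json")):
            order = json.loads(path.read_text(encoding="utf-8"))
            # ...apply the adjustments the JSON needs before it fits your schema...
            insert_order(order)
            shutil.move(str(path), DONE_DIR / path.name)
        time.sleep(60)                      # 100-200 files/day, so polling is plenty

if __name__ == "__main__":
    main()
```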
I'm uploading all my on-premise data to online databases and it's quite reliable.
Another solution I'm using, for exchanging XML files with our accounting application, is AWS S3 sync via the AWS CLI.
I have created a batch script that runs every 2 minutes and downloads XML files from an S3 bucket; those get processed by the application, and the resulting XML is uploaded to a different bucket.
From the bucket you could ingest it into your db using a Retool workflow.
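For your JSON case, the upload half of that could be a small Python script instead of a batch file - a sketch of mirroring the local export folder to a bucket, so a scheduled Retool workflow can pick the files up; the bucket name, prefix and folder are assumptions, and credentials come from the usual AWS config/environment:

```python
# Sketch: push new WMS JSON exports from a local folder to an S3 bucket,
# roughly what `aws s3 sync` does. Bucket name, prefix and folder are examples.
from pathlib import Path

import boto3

EXPORT_DIR = Path(r"C:\wms\exports")   # local folder the WMS writes to
BUCKET = "my-wms-exports"              # hypothetical bucket name
PREFIX = "orders/"

s3 = boto3.client("s3")

def already_uploaded(key: str) -> bool:
    """Check whether an object with this key already exists in the bucket."""
    resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=key, MaxKeys=1)
    return any(obj["Key"] == key for obj in resp.get("Contents", []))

def sync() -> None:
    for path in EXPORT_DIR.glob("*.json"):
        key = PREFIX + path.name
        if already_uploaded(key):
            continue
        s3.upload_file(str(path), BUCKET, key)
        print(f"uploaded {path.name}")

if __name__ == "__main__":
    sync()   # run on a schedule, e.g. every 2 minutes via Task Scheduler or cron
```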