Dealing with local data

I'm building a frontend on a Supabase database, which is working quite well, but the database has to be filled with exports from a WMS. This specific WMS does not supply any API and is completely on-premise, but for every order a JSON file is generated - about 100-200 a day. I'm looking for a solution to get this local JSON into the cloud-based database.

Probably through a Retool Workflow, as the JSON needs some adjustments.

I'm wondering what possibilities I could consider to solve this.

How do you plan on accessing the JSON from the WMS?

As soon as an order has been processed and reaches a certain status, the JSON is generated. Of course I could upload these files with a file upload field, but given the volume I'm looking for possibilities/ideas to automate this task.

OK, but where is the JSON generated? In the WMS? If so, and there is no way to connect to it, I assume this would have to be done manually...

Correct, it's stored locally. What's the easiest way to get this data into Retool?

I can't say I have come across this situation before. Is it possible to export any data from the WMS in Excel format?


Yes, the data could be exported as Excel as well.

That is your best bet...
You can upload data directly into the Retool database from the Excel spreadsheet.

Yes, but I don't think there is an endpoint where I can upload these files. To make use of this solution I would have to upload the files manually, right?

Yes, otherwise you would have to put them somewhere else, like Google Drive, then have Retool read them in and insert them into the database...
How much data are we talking about? Retool DB does have a limit...

Currently Supabase is the backend, Retool is used as the frontend, and a Retool Workflow manipulates the data before it's inserted into Supabase.

The WMS generates JSON but doesn't have any method to send it? Like to a webhook, email, sftp, etc...?

Oh ok and yes same question as @matth

Email is no problem, would that make the solution easier?

You could get email to work, though it introduces some complexity and another service. I did something similar to parse attachments on order emails from a vendor. Postmarkapp has a service called Inbound Email that forwards an email to a webhook URL.

WMS -> Email JSON to Postmarkapp -> Retool Workflow -> Supabase
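As a rough sketch of the middle step: Postmark's inbound webhook POSTs a JSON payload in which attachments arrive base64-encoded under an `Attachments` key. A workflow step (shown here as plain Python; the payload shape follows Postmark's documented inbound format, but verify against a real delivery from your account) could pull the order JSON out like this:

```python
import base64
import json


def extract_orders(postmark_payload: dict) -> list[dict]:
    """Decode JSON order attachments from a Postmark inbound-webhook payload.

    Postmark delivers each attachment as {"Name": ..., "ContentType": ...,
    "Content": <base64 string>}; only *.json attachments are kept here.
    """
    orders = []
    for att in postmark_payload.get("Attachments", []):
        if att.get("Name", "").endswith(".json"):
            orders.append(json.loads(base64.b64decode(att["Content"])))
    return orders
```

The resulting list of dicts can then be adjusted and handed to the Supabase insert step of the workflow.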

Hi @Ronaldvm, have you considered a Python script that runs locally, monitors the folder, and inserts the data straight into your db?
It would require a local machine and a Python install, but it is very efficient and only requires 20-30 lines of code.

I’m uploading all my on premise data to online databases and it’s quite reliable.

Not yet. I'm not really into Python, so I'm a bit hesitant to install a local server and rely on it, as I'm not sure I can guarantee the service.

Another solution I’m using, for exchanging XML files with our accounting application, is AWS S3 sync via the AWS CLI.
I have created a batch script that runs every 2 minutes and downloads XML files from an S3 bucket; those get processed by the application and the resulting XML is uploaded to a different bucket.

From the bucket you could ingest the files into your db using a Retool Workflow.
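The upload side of that setup can be as small as one `aws s3 sync` call on a schedule. A sketch (the folder and bucket names are placeholders; `sync` only transfers new or changed files, and the `--exclude`/`--include` pair limits it to the JSON exports):

```python
import subprocess
from pathlib import Path

LOCAL_DIR = Path("C:/wms/exports")        # hypothetical export folder
BUCKET = "s3://my-wms-exports/incoming"   # hypothetical bucket/prefix


def sync_command(local: Path, bucket: str) -> list[str]:
    """Build the AWS CLI invocation that mirrors local JSON files to S3."""
    return [
        "aws", "s3", "sync", str(local), bucket,
        "--exclude", "*",        # skip everything...
        "--include", "*.json",   # ...except the JSON exports
    ]


def run_sync() -> None:
    """Run one sync pass; schedule this via Task Scheduler or cron."""
    subprocess.run(sync_command(LOCAL_DIR, BUCKET), check=True)
```

On the cloud side, an S3 event notification or a scheduled Retool Workflow can then pick the files up from the bucket.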

I guess this might be a better solution. I'll take a look at it, as well as at Postmarkapp. Thanks for your suggestions!

What's the wms called? Or was it written internally?

What's the use case of the frontend? View order data, stats, etc?

How is the order data managed now?