I've used workflows with a webhook trigger for this type of use-case before. For updates, you can post the data you want to update/insert to the webhook, and configure the workflow to insert the data from the payload into your postgres table. There's a template that is similar to this, called "Sync API to Retool DB Postgres Database" you can use as a starting point.
You can also use webhook-triggered workflows for reading data: send an ID (or another identifier) for the row you want to pull, and configure a Response Block to return it. They have an "API Endpoint" template (screenshot below) to use as a starting point for this.
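To make the trigger step concrete, here's a minimal Python sketch of the payloads you'd post to those webhooks. The payload field names and the webhook URL are placeholders I've made up for illustration; use whatever shape your workflow reads from the trigger data, and the URL Retool generates for your workflow:

```python
import json

def build_upsert_payload(row):
    """Data to insert/update, posted to the sync workflow's webhook."""
    return {"data": row}

def build_read_payload(row_id):
    """Identifier sent to the read workflow; its Response Block returns the row."""
    return {"id": row_id}

payload = build_upsert_payload({"name": "Ada", "email": "ada@example.com"})
print(json.dumps(payload))

# Sending either payload is a single POST (requires the `requests` package):
#   import requests
#   resp = requests.post("https://your-webhook-url", json=payload, timeout=30)
#   resp.raise_for_status()
```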
hey @ericnograles
Since you've already found the URLs with the JSON data you need, you're off to a great start. For parsing and saving the data into your database, you could use a combination of Python and libraries like Requests to fetch the data and Pandas to parse and structure it. Here's a quick example:
Fetch the data using requests:
import requests
response = requests.get('your_json_url', timeout=30)
response.raise_for_status()  # fail fast on 4xx/5xx instead of parsing an error page
data = response.json()
Parse and structure the data using Pandas:
import pandas as pd
df = pd.DataFrame(data)  # for nested JSON, pd.json_normalize(data) flattens it first
Save the data into your database (assuming you're using SQL): you can use a library like SQLAlchemy to connect to your database and write the DataFrame:
from sqlalchemy import create_engine
engine = create_engine('your_database_connection_string')
df.to_sql('your_table_name', engine, if_exists='replace', index=False)
Note that if_exists='replace' drops and recreates the table on every run; use if_exists='append' if you want to add rows instead.
If you're doing a lot of scraping without an API, you may also run into rate limits or get blocked. Throttling your requests and retrying with exponential backoff usually helps there.
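A simple retry-with-exponential-backoff wrapper is usually enough for occasional rate limiting. This is a generic sketch; the function name and delay values are my own, not from any particular library:

```python
import time

def fetch_with_backoff(fetch, max_retries=4, base_delay=1.0):
    """Call fetch(); on failure, sleep base_delay, 2x, 4x, ... before retrying."""
    for attempt in range(max_retries):
        try:
            return fetch()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries: surface the last error
            time.sleep(base_delay * 2 ** attempt)

# Usage with the requests call from earlier:
#   data = fetch_with_backoff(lambda: requests.get('your_json_url', timeout=30).json())
```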