Hi, I am trying to connect my external resource connection (a Redshift database) in Retool to my local Retool Database so that all the tables and data in the Redshift database are exported into my Retool Database daily. I tried to create a workflow to export the tables, but the Redshift database is too big (almost 1 GB) for a .csv export and import. I also noticed there is no way to do a SQL export and import in Retool. Could someone help me find a way to fetch the data so that, in the end, I have one Retool Database containing all of the data from the Redshift database?
Hello @mdarbha!
Unfortunately, the only way to import data into a Retool DB instance is via CSV import. However, you should be able to chunk the data from your Redshift database using a Python library such as pandas: loop through your data and write a number of smaller CSV files that can themselves be imported into Retool DB.
My best advice would be to use a Retool Workflow, as it can run Python in loop blocks to automate and expedite this process!
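To make that concrete, here is a minimal sketch of the chunked-CSV approach using pandas with SQLAlchemy. The connection string, table name, and chunk size are all placeholders/assumptions you would adapt to your own Redshift cluster (you would also need a driver such as psycopg2 installed):

```python
import pandas as pd
from sqlalchemy import create_engine

CHUNK_ROWS = 50_000  # assumption: tune so each CSV stays under Retool's upload limit


def export_table_to_csvs(engine, table, out_prefix, chunk_rows=CHUNK_ROWS):
    """Stream `table` in chunks and write each chunk to its own small CSV file."""
    paths = []
    for i, chunk in enumerate(
        pd.read_sql(f"SELECT * FROM {table}", engine, chunksize=chunk_rows)
    ):
        path = f"{out_prefix}_part_{i}.csv"
        chunk.to_csv(path, index=False)
        paths.append(path)
    return paths


# Hypothetical Redshift connection string -- replace with your own details:
# engine = create_engine("postgresql+psycopg2://user:pass@redshift-host:5439/dbname")
# export_table_to_csvs(engine, "my_table", "my_table")
```

Each resulting `my_table_part_N.csv` should then be small enough to import into Retool DB individually.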
I would also ask, is there a strong reason why you want to move this data to Retool DB as opposed to another self-hosted Postgres (or similar SQL DB)?
There are likely many data-migration tools that would work better than the loop-and-chunk-to-CSV approach you would have to implement for Retool DB, and there are many online resources for building migration code that does not involve CSVs at all.
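As one illustration of a CSV-free path to a self-hosted Postgres, pandas can stream rows directly from one database into another. This is a sketch, not a recommendation of a specific tool; both connection strings and the table name are placeholders:

```python
import pandas as pd
from sqlalchemy import create_engine


def copy_table(src_engine, dst_engine, table, chunk_rows=50_000):
    """Stream `table` from the source DB into the destination DB in chunks."""
    for i, chunk in enumerate(
        pd.read_sql(f"SELECT * FROM {table}", src_engine, chunksize=chunk_rows)
    ):
        # Replace the destination table on the first chunk, append thereafter.
        chunk.to_sql(
            table,
            dst_engine,
            index=False,
            if_exists="replace" if i == 0 else "append",
        )


# Hypothetical connection strings -- replace with your own:
# src = create_engine("postgresql+psycopg2://user:pass@redshift-host:5439/dbname")
# dst = create_engine("postgresql+psycopg2://user:pass@postgres-host:5432/dbname")
# copy_table(src, dst, "my_table")
```

For large tables, dedicated tools (e.g. Redshift's `UNLOAD` to S3 plus a bulk `COPY` into Postgres) will be faster, but the idea is the same: no intermediate CSV uploads by hand.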