How can I get one of these CSV datasets into a Retool PostgreSQL database? I tried the CSV import tool, which told me the limit is 3 MB (which seems small). The dataset I'm trying to upload is a 62 MB CSV. I'm not sure of the simplest way to import that data.
I had a similar problem and solved it by relying on an additional service, Tinybird, which lets you ingest a CSV and, within a couple of minutes, have a public endpoint that I then consumed in Retool.
There isn't currently a native way to do this, but we do have a ticket tracking this internally. We'll update this thread when there is any new information to share.
Hi, I'm late to this thread. One way you can do this within Retool yourself is to parse the CSV file and use the "Bulk Insert" functionality in the Postgres query's GUI mode. It supports inserting a large number of rows at once.
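For anyone trying this, here is a minimal sketch of the parsing step: plain JavaScript that turns raw CSV text into an array of row objects, which is the shape the Bulk Insert GUI action expects, plus a chunking helper so each insert stays a manageable size. The function names and chunk size are my own; it also assumes a simple CSV (header row, no quoted commas), so a real app would likely use a proper parser like Papa Parse instead.

```javascript
// Turn raw CSV text into an array of row objects, e.g.
// "id,name\n1,alice" -> [{ id: "1", name: "alice" }].
// Naive split on commas: does NOT handle quoted fields.
function csvToRows(csvText) {
  const lines = csvText.trim().split(/\r?\n/);
  const headers = lines[0].split(",").map((h) => h.trim());
  return lines.slice(1).map((line) => {
    const values = line.split(",");
    return Object.fromEntries(headers.map((h, i) => [h, values[i]]));
  });
}

// Split the rows into batches so each bulk insert stays small.
function chunk(rows, size) {
  const out = [];
  for (let i = 0; i < rows.length; i += size) {
    out.push(rows.slice(i, i + size));
  }
  return out;
}
```

In a Retool JavaScript query you could then loop over the chunks and trigger the bulk-insert query once per batch (for example via the query's `trigger` with `additionalScope`), which sidesteps sending one giant payload.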
Yeah, ideally it would handle larger file sizes, but there may be limits to parsing and handling big files on the server side (e.g. it being slow and blocking other queries). The 3 MB may not be a hard limit but a safe limit for us currently, sadly.
I will flag it to the team so we can look into expanding it.
Are you able to provide more details? I'm new to Retool and need to load some data into my Retool account to test app building. I have large CSV files as well.
The goal is to load data into multiple tables and learn to build apps and dashboards around Retool database tables.
Hi, do you know if there is an update or a workaround for the 3MB limit?
Our use case for Retool is to allow non-technical users to download, edit, and upload updated CSV files to the server, so "externally connect to the db" or similar is doable for me, but not for them.
I have been able to insert the data using the Query Library GUI mode, but I'm worried that even the "convert CSV to JSON" step might be too much for them.
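One way to keep the CSV-to-JSON step away from end users is to do the conversion automatically (for instance, a Retool file input component can parse an uploaded CSV into JSON rows for you when parsing is enabled) and only surface a friendly error if the file looks wrong. Here is a minimal sketch of such a validation helper; the function name and the column names in the usage are hypothetical examples, not anything Retool-specific.

```javascript
// Check that parsed CSV rows contain the columns the table needs,
// so a non-technical user gets a readable error instead of a failed query.
function validateRows(rows, requiredColumns) {
  if (!rows.length) {
    return { ok: false, error: "The file is empty" };
  }
  const missing = requiredColumns.filter((col) => !(col in rows[0]));
  return missing.length
    ? { ok: false, error: `Missing columns: ${missing.join(", ")}` }
    : { ok: true, error: null };
}
```

The uploader flow then becomes: user picks a file, the app parses and validates it behind the scenes, and the bulk insert only runs when `ok` is true.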