Simplest way to load a large (62MB) CSV into the Retool PostgreSQL database?

I have a large CSV, which anyone can download here:

Form 5500 Datasets | U.S. Department of Labor.

How can I get one of these CSV datasets into a Retool PostgreSQL database? I tried the import-CSV tool, which told me the limit was 3MB (which seems small). The dataset I am trying to upload is a 62MB CSV. I'm not sure of the simplest way to import that data.



Looking for the same, but with self-hosted Retool.


Did you try using the file button component? It can automatically parse the CSV into JSON.
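To show what that parsing step produces, here is a minimal sketch of turning CSV text into JSON row objects, similar to what the file button component does for you automatically. It assumes a simple file with a header row and no quoted fields containing commas (a real parser handles those cases):

```javascript
// Minimal sketch of CSV-to-JSON parsing, similar to what the file
// button component does automatically via its parsed value.
// Assumes a header row and no quoted fields containing commas.
function csvToJson(csvText) {
  const lines = csvText.trim().split(/\r?\n/);
  const headers = lines[0].split(",");
  return lines.slice(1).map((line) => {
    const values = line.split(",");
    // Build one object per row, keyed by the header names
    return Object.fromEntries(headers.map((h, i) => [h, values[i]]));
  });
}

// Example: two data rows become two JSON objects
const rows = csvToJson("plan_name,participants\nAcme 401k,120\nBeta Plan,45");
console.log(rows[0].plan_name); // "Acme 401k"
```

Once the file is parsed into an array of objects like this, each object maps onto a row you can insert into Postgres.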


Hi @Keith_K

I had a similar problem. I solved it by relying on an additional service, TinyBird, which lets you ingest a CSV and, in a couple of minutes, have a public endpoint that I then consumed in Retool.

Hope this helps.


@abusedmedia Thanks for the suggestion!

There isn't currently a native way to do this, but we do have a ticket tracking this internally. We'll update this thread when there is any new information to share.


Hi, I'm late to this thread. One way you can do this in Retool yourself is to parse the CSV file and use the "Bulk insert" functionality in the Postgres query's GUI mode. It supports inserting a large number of rows at once.
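For a 62MB file, the parsed array may still be large enough that you want to insert it in batches rather than in one call. A minimal sketch of the batching step (the `chunkRows` helper and the batch size of 1000 are illustrative choices, not a Retool API or a documented limit):

```javascript
// Split parsed CSV rows into fixed-size batches so each batch can be
// handed to a Postgres bulk-insert query one at a time.
// The batch size of 1000 is an illustrative choice, not a Retool limit.
function chunkRows(rows, batchSize) {
  const batches = [];
  for (let i = 0; i < rows.length; i += batchSize) {
    batches.push(rows.slice(i, i + batchSize));
  }
  return batches;
}

// Example: 2500 rows become batches of 1000, 1000, and 500
const rows = Array.from({ length: 2500 }, (_, i) => ({ id: i }));
const batches = chunkRows(rows, 1000);
console.log(batches.map((b) => b.length)); // [1000, 1000, 500]
```

Inside Retool, you would then loop over the batches in a JavaScript query and trigger the bulk-insert query once per batch, passing each batch in via `additionalScope`; that part only runs inside a Retool app, so it is omitted from the runnable sketch above.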


But why doesn't the Retool DB GUI allow CSV uploads larger than 3MB? At least on self-hosted there shouldn't be any limit.

Yeah, ideally it would be able to handle bigger file sizes, but there may be some limits with parsing and handling big files on the server side (e.g. it being slow and blocking other queries). The 3MB may not be a hard limit but a safe limit for us currently, sadly.

I will flag it to the team so we can look into expanding it.


Yeah, at least for self-hosted. That's an issue, because if I import the CSV via DataGrip I can't see the data in the Retool GUI, even though querying it works fine.