I'm new and testing Retool for an enterprise solution for our small team. We currently use Google Enterprise Shared drives with several sheets that contain over 60,000 records each from different sources that need to be combined in a string of queries.
I'm trying out the Retool Database: I uploaded one CSV, displayed 50,000 of the records, and showed the total. Fine.
However, when I uploaded the second CSV, I received an error message saying my storage limit had been reached at over 50K records and to contact support.
I thought I'd float the question here as I'm trying to get some traction and direction over the quiet weekend. Any advice will be appreciated.
It looks like we have a limit of 50,000 records or 1 GB, unfortunately.
Is this limit currently blocking your use case?
Yes, the 50K record limit is an issue, as we have multiple datasets of this size that join to produce specific results by co-op member. We're trying several other data sources to get around this, but we don't have much experience with the nuances of setting them up and updating them efficiently.
It's looking like a combination of Google Sheets and BigQuery will work; now we're figuring out how to get the CRUD layer functioning at a basic level to see if we can make Retool work for us. Looking promising.
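For anyone following along, the per-member join we're after boils down to something like the sketch below. It's prototyped in Python with SQLite just to illustrate the shape of the query; the table and column names (`members`, `transactions`, `member_id`) are made up, and in practice this would be BigQuery SQL against the uploaded datasets.

```python
import sqlite3

# Stand-in for two of our >50K-row datasets, keyed by co-op member.
# Names here are hypothetical placeholders, not our real schema.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE members (member_id INTEGER PRIMARY KEY, name TEXT)")
cur.execute(
    "CREATE TABLE transactions (id INTEGER PRIMARY KEY, member_id INTEGER, amount REAL)"
)
cur.executemany("INSERT INTO members VALUES (?, ?)", [(1, "Ada"), (2, "Grace")])
cur.executemany(
    "INSERT INTO transactions VALUES (?, ?, ?)",
    [(1, 1, 10.0), (2, 1, 5.5), (3, 2, 7.25)],
)

# One row per co-op member with a rolled-up total: the "string of
# queries" across sheets collapses into a single JOIN + GROUP BY.
rows = cur.execute(
    """
    SELECT m.member_id, m.name, SUM(t.amount) AS total
    FROM members m
    JOIN transactions t ON t.member_id = m.member_id
    GROUP BY m.member_id, m.name
    ORDER BY m.member_id
    """
).fetchall()
print(rows)  # → [(1, 'Ada', 15.5), (2, 'Grace', 7.25)]
```

The same JOIN/GROUP BY statement runs essentially unchanged in BigQuery, which is what makes it attractive for combining datasets that exceed per-source row limits.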
Is there a Retool expert we could engage to verify our approach and/or to help us set up our environment to function well?
I'd definitely be happy to help sanity check or troubleshoot any errors! Feel free to create as many new forum posts as you'd like.
Otherwise, we have a paid way to connect with Retool builders here!
Update! This shouldn't be an issue going forward. Cloud will have 5 GB free for one year (it will continue to have a free tier after that; the specifics of amount and pricing are TBD), and On-Premise can host as much as you'd like in your own Postgres Retool DB (containerized or external) on 2.113. Thank you for your feedback and patience here!