I'm a self-hosted non-developer, and the hardest thing to get my head around has been Docker, not Retool. We're running and stable - knock on wood - but if we had serious trouble with the containers we'd be up a creek. There is zero Docker expertise in-house. We're excellent at Google (that's how we got this far), but at this point searches are turning up documentation written for the distributor (Retool), not the user (me).
All that said, it's time to update our on-prem instance to the latest version, and I'm stuck on the backup procedures found here. We don't use a cloud platform or Git Syncing, and we don't manage our DB outside of Retool. I'd be satisfied with just making copies of the essential directories and "restoring" them if the update goes sideways, but I'm worried this will leave something unaccounted for.
Is anyone here in a similar situation? How do you manage and protect your self-hosted Retool?
Thanks in advance.
In this case, your best option might be to make local copies of the deployment files as you mentioned (i.e. your deployment configuration and environment variable files).
It's important to have consistent backups of your Postgres storage database as well, as this is where the bulk of your Retool data is stored. Ideally, the database should be external to the Retool containers, which makes it easier to run automated backups. There are some docs here on externalizing the database - is that something you've already explored and intentionally moved away from?
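If you do externalize the database, a nightly logical backup can be as simple as a cron entry running `pg_dump`. The host, user, database name, schedule, and paths below are all assumptions for illustration - swap in your own:

```shell
# Hypothetical crontab entry: dump the externalized Retool database
# nightly at 02:00. Replace the connection string and paths with yours.
# (% must be escaped as \% inside a crontab command field.)
0 2 * * * pg_dump "postgresql://retool@db.internal:5432/retool" --format=custom --file=/backups/retool-$(date +\%F).dump
```

A `--format=custom` dump can later be restored with `pg_restore` if an upgrade goes sideways, which covers the "restoring if things break" case you described.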
Do you have a full list of the files to back up? Also curious why Retool doesn't offer a better solution as deployed, versus having to move the Postgres DB?
hey @Kabirdas! Sorry for the delayed response. We did end up externalizing our DB. It was surprisingly easy.
For the actual Docker and Retool files we're still a little uncertain, but our local IT team just uses snapshots and other backup tools for the entire system. It might be hard to back out an upgrade, but we should be protected from disasters.
Glad things went well with externalizing your DB - that's the main thing that needs to be backed up.
If you want to go the extra mile, you can keep copies of your environment variables and deployment configuration files. For Docker deployments that would be your env file and docker-compose.yml; depending on how you choose to deploy, they may be stored elsewhere (e.g. values.yaml in a Kubernetes deployment). Most everything about your instance (app saves, resource configurations, permissions, users, etc.) lives in the DB, though.
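For the config files, a pre-upgrade snapshot doesn't need anything fancier than a timestamped tarball. This is just a sketch - the directory names are assumptions, and the demo at the bottom runs against a throwaway temp directory rather than a real deployment:

```shell
#!/usr/bin/env sh
# Sketch: archive a Retool deployment directory (compose file, env
# files, etc.) before an upgrade, so a bad update can be rolled back.

backup_config() {
  deploy_dir="$1"   # directory holding docker-compose.yml, env files, etc.
  backup_dir="$2"   # where timestamped archives should go
  stamp=$(date +%Y-%m-%d_%H%M%S)
  mkdir -p "$backup_dir"
  # Archive the whole deployment directory, relative to its root.
  tar -czf "$backup_dir/retool-config-$stamp.tar.gz" -C "$deploy_dir" .
  # Print the archive path so callers can capture it.
  echo "$backup_dir/retool-config-$stamp.tar.gz"
}

# Demo against throwaway directories (replace with your real paths):
demo=$(mktemp -d)
dest=$(mktemp -d)
printf 'version: "3"\n' > "$demo/docker-compose.yml"
archive=$(backup_config "$demo" "$dest")
echo "wrote $archive"
```

Restoring is the reverse: `tar -xzf <archive> -C <deploy_dir>` puts the old config back before you re-run your containers.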
Ideally, externalizing the database lets you manage backups and general maintenance of the DB in whatever way works best for you, while decoupling your data from the Retool containers themselves. Hopefully this answers your question as well, @KeithMacK - let me know if it doesn't!