Goal: Update the robots.txt file to disallow crawling for the entire subdomain, not just the /embedded/ folder.
Currently, our custom robots.txt file at apps.almond.re/robots.txt only disallows the /embedded/ subfolder. We want to modify it to prevent crawling of the entire subdomain.
To block crawling of the entire subdomain, update your robots.txt file like this:
User-agent: *
Disallow: /
This will prevent all compliant search engine crawlers from fetching any part of apps.almond.re. (Strictly speaking, robots.txt controls crawling, not indexing: URLs that are linked from elsewhere can still appear in search results until they are removed.)
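
If you want to sanity-check the rule before deploying it, here is a minimal sketch using Python's standard-library robots.txt parser; the test URLs are just examples:

# Minimal sketch: check the rules above with Python's stdlib robots.txt parser.
from urllib.robotparser import RobotFileParser

rules = """User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# With "Disallow: /", every path on the subdomain is off-limits to compliant crawlers.
for url in ("https://apps.almond.re/", "https://apps.almond.re/embedded/some-app"):
    print(url, "allowed:", parser.can_fetch("*", url))  # prints False for both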
Thanks for reaching out, @TECH_ALMOND,
We have a feature request for this in our backlog; I will reach out if our team picks up the request.
Any updates on this one? It's pretty urgent, as login pages are getting indexed on Google.
Hi @kity,
I don't have an update yet, but I'll note your +1 on the feature request. In the meantime, I believe some customers have temporarily removed URLs through Google Search Console's removal tool.
Another option could be to deploy a self-hosted Retool instance (instead of Retool Cloud) within your virtual private cloud (VPC) or behind your virtual private network (VPN) to restrict access exclusively to users inside the network. This way, the browser would show a 404 error when someone outside the network tries to access it.
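
To illustrate the effect only (this is not how Retool implements it; a real VPC/VPN blocks traffic at the network layer, and the address range and port below are hypothetical), a tiny Python server that returns 404 to anyone outside a private range would behave like this:

# Hedged sketch: approximates the "404 outside the network" behavior in
# application code. In a real deployment the VPC/VPN enforces this at the
# network layer; the CIDR and port here are hypothetical.
import ipaddress
from http.server import BaseHTTPRequestHandler, HTTPServer

INTERNAL_NET = ipaddress.ip_network("10.0.0.0/8")  # hypothetical VPN range

class NetworkGate(BaseHTTPRequestHandler):
    def do_GET(self):
        client_ip = ipaddress.ip_address(self.client_address[0])
        if client_ip in INTERNAL_NET:
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"internal-only app")
        else:
            self.send_error(404)  # outsiders see a plain 404

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), NetworkGate).serve_forever()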