Cache error robots.txt
- OS: Debian 8.x
- Web: apache + nginx
I'm trying to get my AdSense account approved, but Google AdSense shows a message saying that I'm blocking the Google robot.
I cleared the cache and nothing changed. I even disabled the nginx cache, but the site still serves the standard VestaCP robots.txt:
# vestacp autogenerated robots.txt
User-agent: *
Crawl-delay: 10
When I access the site without HTTPS, the correct robots.txt file opens normally.
Does anyone know how to solve this problem?
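Since HTTP serves the correct file but HTTPS serves the autogenerated template, the HTTPS server block is likely still pointing at (or caching) VestaCP's default robots.txt. One possible workaround is adding an explicit location for robots.txt in the HTTPS nginx vhost so it is served straight from the document root and never cached. This is only a sketch: the domain, username (`admin`), and paths below are assumptions, since VestaCP's layout varies by install.

```nginx
# Hypothetical addition to the HTTPS server block, e.g. in
# /home/admin/conf/web/snginx.example.com.conf (path is an assumption;
# adjust for your VestaCP user and domain).
server {
    listen 443 ssl;
    server_name example.com;

    # Serve robots.txt directly from the site's document root,
    # bypassing the proxy cache and any autogenerated template.
    location = /robots.txt {
        root /home/admin/web/example.com/public_html;
        expires off;
        add_header Cache-Control "no-cache";
    }
}
```

After editing, check the config with `nginx -t`, reload nginx, and verify with `curl https://example.com/robots.txt` that the file from your document root is returned over HTTPS.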