Cache error robots.txt
-
- Posts: 3
- Joined: Tue Mar 24, 2020 2:01 pm
- Contact:
- OS: Debian 8.x
- Web: apache + nginx
I'm trying to get my AdSense account approved, but Google AdSense shows a message saying that I'm blocking the Google robot.
I cleared the cache, but nothing changed. I also disabled the nginx cache, and the standard VestaCP robots.txt still appears:
# vestacp autogenerated robots.txt
User-agent: *
Crawl-delay: 10
When I access the site without HTTPS, the robots.txt file opens normally.
Does anyone know how to solve this problem?
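In case it helps with diagnosis: since plain HTTP works but HTTPS does not, the nginx proxy in front of Apache may be serving a stale or wrong response for robots.txt on the HTTPS server block. A minimal sketch of a no-cache exception for robots.txt (the document root path is a placeholder, and the exact VestaCP template file to edit is an assumption — adjust to your setup):

```nginx
# Serve robots.txt straight from the document root, bypassing the proxy cache.
# /home/admin/web/example.com/public_html is a placeholder document root.
location = /robots.txt {
    root /home/admin/web/example.com/public_html;
    expires -1;
    add_header Cache-Control "no-store";
}
```

After adding this to the HTTPS server block and reloading nginx (`service nginx reload` on Debian 8), re-check the file over HTTPS before asking AdSense to re-crawl.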