Many search engines will look for a robots.txt file before spidering (crawling) your site's content. The robots.txt file lives in your site's document root and contains specific rules a robot/spider should abide by when capturing data, as shown in the example below.
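For example, a minimal robots.txt might look like the following sketch. The directory paths and sitemap URL here are placeholders; replace them with values that match your own site.

```
# Apply these rules to all crawlers
User-agent: *

# Keep crawlers out of example private directories (placeholders)
Disallow: /admin/
Disallow: /tmp/

# Everything else may be crawled
Allow: /

# Optional: point crawlers at your sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```

Keep in mind that robots.txt is advisory only: well-behaved spiders honor it, but it is not a security mechanism and does not prevent access to the listed paths.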