Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains rules that a robot/spider should follow when crawling your site. Because crawlers request this file before anything else, those requests are recorded in your server logs and show up in your web statistics, even if no robots.txt file actually exists on your site.
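
As a rough illustration, a basic robots.txt placed in your site's document root might look like the sketch below. The directory names and sitemap URL are only examples, not paths that exist on your account:

    # Example robots.txt (paths shown are illustrative)
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /private/

    # Optional: tell crawlers where your sitemap lives
    Sitemap: https://example.com/sitemap.xml

The "User-agent: *" line applies the rules to all crawlers, and each "Disallow" line names a path crawlers are asked not to index.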
