Why do I have references to robots.txt in my web stat programs?

Many search engines look for a robots.txt file before spidering (crawling) your site's content. The robots.txt file contains rules that a robot/spider should follow when capturing data, such as which parts of the site it may or may not index. Because crawlers request this file automatically, those requests appear in your web statistics whether or not the file actually exists on your site.
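As a rough illustration, a minimal robots.txt placed in your site's root directory (for example, http://www.example.com/robots.txt) could look like the following; the directory names are placeholders, not paths from your account:

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /private/

The first line applies the rules to all robots, and each Disallow line asks them not to crawl that directory. If you do not want to restrict crawlers, you can simply leave the file out; the requests logged in your statistics are harmless.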


Related Articles

I am unable to delete a file

This can occur for several reasons. With our service, if a directory of a script was found to be...

Your connection to this server has been blocked at the firewall

If you receive the message: "Your connection to this server has been blocked at the firewall. If...

What is this Code 304 appearing in my stats?

304 is typically sent as a header if a visitor re-requests a document and the document has not...

What are these htaccess files that keep popping up?

.htaccess files are default files the server uses to perform certain tasks. Features such as...

Point Multiple Domains to the Same Website

Pointing other domains to your website is easy! Simply make sure their DNS settings are pointing...