Why do I have references to robots.txt in my web statistics programs?

Many search engines request a robots.txt file before spidering (crawling) your site's content. The robots.txt file contains rules that a robot/spider should follow when crawling your site, such as which paths it may or may not visit. Because crawlers request this file automatically, those requests show up in your web statistics even if the file does not exist on your site.
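As an illustration only (the directory names below are hypothetical), a robots.txt file placed in your site's document root might look like this:

    # Rules for all crawlers
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /tmp/

    # Rules for one specific crawler
    User-agent: Googlebot
    Disallow: /private/

Crawlers that honor the standard will skip the listed paths, but their requests for robots.txt itself will still be recorded in your logs and statistics.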
