Why do I have references to robots.txt in my web stat programs?

Many search engines request a robots.txt file before spidering (crawling) your site's content. The robots.txt file contains rules that a robot/spider should follow when crawling your site. Because crawlers ask for this file automatically, the request is logged by the web server and shows up in your statistics whether or not the file actually exists.
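
If you would like to control how crawlers index your site, you can create your own robots.txt in your site's document root. A minimal sketch is shown below; the directory names and sitemap URL are only examples, not paths specific to your account:

    # Apply these rules to all crawlers
    User-agent: *
    # Keep crawlers out of these example directories
    Disallow: /cgi-bin/
    Disallow: /tmp/
    # Everything else may be crawled
    Allow: /

    # Optional: tell crawlers where your sitemap lives (example URL)
    Sitemap: http://www.example.com/sitemap.xml

If no robots.txt exists, crawlers simply receive a 404 response and index the site normally; that request is what you are seeing in your stats.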

