Why do I have references to robots.txt in my web statistics programs?

Many search engines request a robots.txt file before spidering (crawling) your site's content. The robots.txt file contains rules that a robot/spider should follow when capturing data, such as which directories it may or may not visit. Because crawlers fetch this file first, those requests appear in your web statistics even if you have not created a robots.txt file yourself.
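As a sketch of how such rules work, the snippet below parses a hypothetical robots.txt (the `User-agent` and `Disallow` lines are illustrative, not taken from any real site) using Python's standard-library `urllib.robotparser`, and checks which URLs a well-behaved crawler would be allowed to fetch:

```python
from urllib import robotparser

# A hypothetical robots.txt asking all crawlers to skip the /private/ directory.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)  # parse() accepts the file's lines as an iterable

# A compliant spider may fetch the homepage...
print(rp.can_fetch("*", "https://example.com/index.html"))        # True
# ...but should skip anything under /private/.
print(rp.can_fetch("*", "https://example.com/private/page.html")) # False
```

Note that robots.txt is advisory: reputable search-engine spiders honor it, but it does not technically block access.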
