Why do I have references to robots.txt in my web stat programs?

Many search engines will look for a robots.txt file before spidering your site's content. The robots.txt file contains specific rules that a robot/spider should abide by when crawling the site.
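For illustration, a minimal robots.txt might look like the following. The directory paths and sitemap URL shown are placeholders, not rules specific to your account; it simply tells all crawlers (User-agent: *) to skip the listed directories:

    # Placeholder example, adjust paths for your own site
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /tmp/
    Sitemap: https://www.example.com/sitemap.xml

Because crawlers request /robots.txt before anything else, those requests appear in your access logs and web statistics even if the file does not exist; in that case the server simply returns a 404 for it.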

Related Articles

htaccess pcfg_openfile: unable to check htaccess file, ensure it is readable

To solve “.htaccess pcfg_openfile: unable to check htaccess file, ensure it is readable” follow...

403 Error on POST

This can occur for several reasons: Be sure your file permissions are correct. If the file needs...

What tools are available to optimize my website?

There are many great tools to optimize your website. There is an optimize tool in your hosting...

How can I turn off directory indexing?

In the directory where you wish to turn off directory indexing, you can do it under Index Manager...

I updated my site but I still see old pages

Make sure your files are indeed updated by checking the file time stamps. Also, you may want to...