Why do I have references to robots.txt in my web stat programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains specific rules a robot/spider should follow when capturing data. Because crawlers request this file whether or not it exists, those requests show up in your web statistics (often as 404 errors if the file is missing).
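As a rough illustration (the user-agent and paths below are placeholders, not rules from this article), a minimal robots.txt placed in your site's document root could look like this:

  # Apply to all crawlers
  User-agent: *
  # Keep crawlers out of these example directories
  Disallow: /cgi-bin/
  Disallow: /tmp/

  # Optional: point crawlers at your sitemap (URL is an example)
  Sitemap: https://www.example.com/sitemap.xml

Creating even an empty robots.txt will stop the 404 entries for it from appearing in your statistics.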
