Why do I have references to robots.txt in my web stats programs?

Many search engines look for a robots.txt file before spidering your site's content, which is why requests for /robots.txt show up in your web statistics even if you never created the file. The robots.txt file contains specific rules a robot/spider should abide by when capturing data.
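If you would like to give crawlers rules of your own, place a robots.txt file in your site's document root. A minimal example (the path and sitemap URL below are placeholders, not values from this article):

    # Allow all crawlers, but keep them out of /private/
    User-agent: *
    Disallow: /private/

    # Optionally point crawlers at a sitemap
    Sitemap: https://www.example.com/sitemap.xml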

Related Articles

Prevent CloudFlare from Loading a js or Script

Sometimes CloudFlare / Rocket Loader can have problems with a script and break it. To have...

Meta Redirect

Adding this code to your web page will redirect your visitors to any address that you have...

osCommerce password reset

How to reset your osCommerce admin login... You can reset your osCommerce administrative login...

How do I parse html files as shtml?

Create an .htaccess file in the directory needed and add the following: AddHandler...

Disable error_log via htaccess

Prevent public display of PHP errors via .htaccess: # suppress php errors php_flag...