Why do I have references to robots.txt in my web stats programs?

Many search engines look for a robots.txt file before spidering your site's content, so requests for that file show up in your web statistics even if the file does not exist (in which case the spider simply receives a 404 and continues crawling). The robots.txt file contains rules that a robot/spider should abide by when capturing data from your site.
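
For reference, here is a minimal sketch of a robots.txt file placed in the site's document root. The directory paths and sitemap URL below are placeholders only; replace them with values for your own site.

    # Apply these rules to all robots/spiders
    User-agent: *
    # Do not crawl these (placeholder) directories
    Disallow: /private/
    Disallow: /tmp/
    # Optional: point crawlers at the sitemap (example URL)
    Sitemap: https://www.example.com/sitemap.xml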

Related Articles

  • How can I tell how much disk space I have available?
  • Prevent CloudFlare from Loading a js or Script
  • How to create a temporary 302 redirect via htaccess
  • Your connection to this server has been blocked at the firewall
  • Redirect http to https and www