Why do I have references to robots.txt in my web stat programs?

Most search engines request a robots.txt file before spidering your site's content, so those requests show up in your web statistics and access logs even if the file does not exist. The robots.txt file contains rules that a robot or spider is expected to follow when crawling your site, such as which directories it may or may not index.
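For example, a typical robots.txt file placed in your site's document root might look like the following (the directory names here are only illustrative):

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /private/

    User-agent: BadBot
    Disallow: /

In this sketch, all crawlers are asked to skip the /cgi-bin/ and /private/ directories, and a crawler identifying itself as BadBot is asked not to index the site at all. Compliance is voluntary: reputable search engines honor these rules, while poorly behaved bots may ignore them.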