Why do I have references to robots.txt in my web stat programs?

Many search engines will look for a robots.txt file before spidering your site's content. The robots.txt file contains rules that a robot/spider should abide by when capturing data. Each of those lookups is recorded in your access logs, so references to robots.txt will appear in your web statistics even if you have never created the file.
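
A well-behaved crawler requests /robots.txt first and only fetches the pages the rules allow. The short Python sketch below is a minimal illustration of how such rules are applied, using the standard urllib.robotparser module; the sample robots.txt contents are hypothetical and shown only for illustration.

# Minimal sketch: how a compliant spider checks robots.txt rules before crawling.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents, for illustration only.
robots_txt = """
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Disallowed paths are skipped; allowed paths may be crawled.
print(parser.can_fetch("Googlebot", "/private/report.html"))  # False
print(parser.can_fetch("Googlebot", "/index.html"))           # True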

Related Articles

Why can users behind proxies not see my site?

If you have enabled hotlink protection, users behind proxies may not be able to see your site...

When I type in my domain it goes to ultrawebhosting.com

Your system may have cached the IP address of the previous server. Try a reboot. If you run WinNT,...

Change Your Website's Favorite Icon

How to change the icon your website displays in the browser. Create a 16x16 pixel icon image...

Heartbleed Vulnerability, Risk and Fix

The Heartbleed bug (http://en.wikipedia.org/wiki/Heartbleed_bug) is a serious vulnerability which...

My pages are updating/refreshing slowly

Refresh is immediate on the server. When a file is updated and uploaded, it is there. If you are...