Why do I have references to robots.txt in my web stat programs?

Many search engines look for a robots.txt file before crawling (spidering) your site's content. The robots.txt file contains rules that a robot/spider should follow when gathering data from your site. Because crawlers request this file on every visit, those requests appear in your web statistics programs, even if the file does not exist on your server.
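
As a rough illustration, a minimal robots.txt might look like the example below. The directory names are hypothetical; adjust them to match your own site. The User-agent line names which crawlers the rules apply to, and each Disallow line lists a path those crawlers should skip.

    # Rules for all crawlers
    User-agent: *
    # Hypothetical directories we do not want indexed
    Disallow: /cgi-bin/
    Disallow: /private/

    # A separate rule set just for Googlebot
    User-agent: Googlebot
    Disallow: /tmp/

If you do not want to restrict crawlers at all, an empty robots.txt (or none at all) is fine; the file simply stops the "file not found" entries from showing up in your logs.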