Why do I have references to robots.txt in my web statistics programs?

Most search engines will look for a robots.txt file before spidering your site's content, which is why requests for it appear in your web statistics. The robots.txt file contains rules that a robot/spider should abide by when crawling and capturing data from your site.
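A minimal robots.txt sketch is shown below; the "BadBot" name and the /private/ path are hypothetical examples, not values your site necessarily needs:

    # Block one specific crawler from the whole site
    User-agent: BadBot
    Disallow: /

    # Allow all other crawlers, but keep them out of /private/
    User-agent: *
    Disallow: /private/

    # Optional: point crawlers at your sitemap
    Sitemap: https://www.example.com/sitemap.xml

Place the file at the root of your site (for example https://www.example.com/robots.txt) so crawlers can find it.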
