Why do I have references to robots.txt in my web statistics program?

Many search engines will request a robots.txt file before spidering your site's content, so those requests appear in your web statistics even if the file does not exist. The robots.txt file contains specific rules that a robot/spider should abide by when crawling your site.

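As an illustration, a minimal robots.txt placed in the root of your site might look like the following (the directory names shown are only placeholders; use the paths you actually want to exclude):

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /private/

The User-agent line says which spiders the rules apply to (* means all of them), and each Disallow line lists a path the spider should not crawl. If you have no robots.txt file, the requests simply return a 404 error and spiders crawl the site without restrictions.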

Related Articles

How do I cancel my account?

We are sorry to hear you would like to cancel your account! If there is anything we can do,...

How to use SharedSSL

SharedSSL is now available with our Ultra Unlimited plan. It is not as professional-looking as having...

What is a domain?

A domain is a human-readable name assigned to an IP address to make accessing websites much...

Convert PDF to HTML Service

To convert your PDF files to HTML, try using the following converter tool from Adobe:...

How can I turn off directory indexing?

In the directory in which you wish to turn off directory indexing, you can do it under Index Manager...