Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains rules that a robot/spider should abide by when crawling and capturing data, which is why requests for it appear in your web statistics. A minimal example is shown below.
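
For illustration only, a simple robots.txt placed in your site's root directory might look like the sketch below; the paths are placeholders, not paths that exist on your account:

    # Apply these rules to all crawlers
    User-agent: *
    # Ask crawlers not to index these example directories
    Disallow: /cgi-bin/
    Disallow: /tmp/

Crawlers that honour the standard request this file first and then skip the disallowed paths while indexing the rest of the site.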

