Why do I have references to robots.txt in my web stat programs?

Many search engines will look for a robots.txt file before spidering your site's content. The robots.txt file contains specific rules a robot/spider should abide by when capturing data, such as which paths it may or may not crawl. These lookups appear in your web statistics as requests for /robots.txt, even if the file does not exist on your site.
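For example, a simple robots.txt placed in your site's document root might look like the sketch below (the directory and sitemap URL are illustrative, not specific to any account):

    # Allow all robots to crawl everything except the /private/ directory
    User-agent: *
    Disallow: /private/

    # Optionally tell crawlers where the sitemap lives
    Sitemap: https://www.example.com/sitemap.xml

If you do not want to restrict crawlers at all, an empty robots.txt (or no file) is fine; crawlers will simply record a request for it and move on.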

