Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains rules that a robot/spider should follow when crawling your site, such as which paths it may or may not access. Because spiders request this file whether or not it exists, those requests are recorded in your access logs and appear in your web statistics.
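If you would like to create one, a minimal robots.txt placed in your site's root directory could look like the following (the /private/ path is only an illustration; replace it with whatever you actually want to exclude, or leave the Disallow line empty to allow everything):

    User-agent: *
    Disallow: /private/

The "User-agent: *" line applies the rules to all well-behaved spiders, and each "Disallow" line names a path they should not crawl.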
