Why do I have references to robots.txt in my web statistics programs?

Many search engines request a robots.txt file before spidering your site's content, which is why those requests show up in your web statistics even if the file does not exist. The robots.txt file contains specific rules a robot/spider should abide by when crawling your site.
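As a rough illustration, a minimal robots.txt placed in your site's document root might look like the following. The paths and rules here are only examples, not a recommendation for any particular site:

    User-agent: *          # these rules apply to all crawlers
    Disallow: /private/    # ask crawlers not to fetch anything under /private/
    Allow: /               # everything else may be crawled

If you do not want to restrict crawlers at all, an empty robots.txt (or none) is fine; the requests in your statistics are harmless either way.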
