Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains rules that a robot or spider should follow when crawling and indexing your pages. Because crawlers request this file each time they visit, those requests are recorded in your access logs, which is why robots.txt shows up in your web statistics.
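As a rough illustration, a minimal robots.txt placed in your site's document root might look like the following (the directory names here are only examples, not paths that exist on your account):

User-agent: *
Disallow: /cgi-bin/
Disallow: /private/

The "User-agent" line says which crawlers the rules apply to (an asterisk means all of them), and each "Disallow" line lists a path that well-behaved crawlers should skip.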
