Why do I have references to robots.txt in my web stat programs?

Many search engines look for a robots.txt file before spidering your site's content, and each of those lookups is recorded as a request in your web statistics, whether or not the file actually exists. The robots.txt file contains rules that a robot/spider should abide by when crawling your site.
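
If you want to supply one yourself, a minimal robots.txt might look like the sketch below. It lives in the root of your site (for example, public_html/robots.txt on most hosting accounts), and the directory names here are only placeholders:

  User-agent: *
  Disallow: /cgi-bin/
  Disallow: /tmp/
  Crawl-delay: 10

The User-agent: * line applies the rules to every crawler, each Disallow line tells crawlers not to request anything under that directory, and Crawl-delay asks well-behaved bots to wait 10 seconds between requests (not every crawler honors this directive).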

Related Articles

Error 406 Not Acceptable

This error can be caused by several factors. One cause can be that incorrect permissions are set...

Point Multiple Domains to the Same Website

Pointing other domains to your website is easy! Simply make sure their DNS settings are pointing...

View your website in older browser versions

As developers and webmasters, it is always important to make sure your website works in several...

Sub-Domain times out

When you add a sub-domain and it times out, make sure you do not have redirection set from that...

Prevent CloudFlare from Loading a js or Script

Sometimes CloudFlare / Rocket Loader can have problems with a script and hose it up. To have...