Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before spidering (crawling) your site's content. The robots.txt file contains rules a robot/spider should follow when capturing data, such as which paths it may or may not crawl. Each of those lookups is logged as a request for /robots.txt, which is why the file shows up in your web statistics even if you never created one.
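
If you want to control how crawlers behave, a minimal robots.txt placed in your site's document root might look like the sketch below. The paths and sitemap URL are illustrative only, and the Crawl-delay directive is honored by some crawlers but not all:

    # Rules for all crawlers
    User-agent: *
    # Keep crawlers out of these illustrative directories
    Disallow: /cgi-bin/
    Disallow: /tmp/
    # Ask supporting crawlers to wait 10 seconds between requests
    Crawl-delay: 10

    # Illustrative sitemap location
    Sitemap: https://www.example.com/sitemap.xml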
