Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains rules that a robot/spider should abide by when crawling your pages, which is why requests for it show up in your web statistics.
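As a minimal sketch, a robots.txt file placed in your web root might look like the following (the paths and bot name here are purely illustrative):

```
# Allow all well-behaved bots, but keep them out of a few directories
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/

# Block one specific crawler from the entire site
User-agent: BadBot
Disallow: /
```

Compliance is voluntary: well-behaved crawlers honor these rules, but the file does not technically prevent access.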

Related Articles

I need a module installed but I do not have rights

No problem. Send us a trouble ticket and we will install the module as long as there are no...

Enable Mod Rewrite

Create an .htaccess file with the following contents and upload it to your public_html directory....

What is the location of curl?

cURL is located at /usr/bin/curl

Error 406 Not Acceptable

This error can be caused by several factors. One cause is that incorrect permissions are set...

How can I change the index listing in a directory?

You can change the initial page that is sent to client requests when a directory is accessed by...