Why do I have references to robots.txt in my web stats programs?

Many search engines request a robots.txt file before spidering your site's content. The robots.txt file contains rules that a robot/spider should follow when crawling the site, such as which directories it may or may not index. These requests are logged like any other, which is why robots.txt appears in your web statistics even if the file does not exist.
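For example, a minimal robots.txt placed in your site's document root might look like the following (the paths and bot name are illustrative):

```
# Apply to all crawlers
User-agent: *
# Keep crawlers out of these directories
Disallow: /cgi-bin/
Disallow: /private/

# Block one specific (hypothetical) crawler entirely
User-agent: BadBot
Disallow: /
```

If no robots.txt exists, well-behaved crawlers simply treat the whole site as crawlable; the 404 for the missing file is harmless.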

