Why do I have references to robots.txt in my web statistics programs?

Many search engines request a robots.txt file before spidering (crawling) your site's content, which is why those requests show up in your web statistics. The robots.txt file contains specific rules that a robot/spider should abide by when capturing data, such as which directories it may or may not crawl.
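As an illustration, a minimal robots.txt placed in your site's root directory might look like the following (the directory names here are only examples):

```
# Apply these rules to all crawlers
User-agent: *
# Keep crawlers out of script and private directories
Disallow: /cgi-bin/
Disallow: /private/
```

If no robots.txt file exists, the server simply returns a 404 for the request, and well-behaved crawlers then assume the whole site may be indexed.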

