Why do I have references to robots.txt in my web stat programs?

Many search engines will look for a robots.txt file before spidering your site's content, which is why requests for it appear in your web statistics. The robots.txt file contains specific rules a robot/spider should abide by when crawling your site.
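For example, a minimal robots.txt placed in your site's root directory might look like the following sketch (the /cgi-bin/ and /tmp/ paths are only illustrative):

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /tmp/

The User-agent line names which robots the rules apply to (* means all robots), and each Disallow line lists a path that well-behaved robots are asked not to crawl.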

