Why do I have references to robots.txt in my web stat programs?

Many search engines request a robots.txt file before spidering (crawling) your site's content. The robots.txt file contains rules a robot/spider should abide by when capturing data, such as which directories it may or may not index. Because each crawler requests this file like any other page, those requests show up as entries in your web statistics, even if no robots.txt file exists on your site.
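As an illustration, a minimal robots.txt might look like the following (the disallowed paths here are hypothetical examples, not defaults):

```
# Rules for all robots/spiders
User-agent: *
# Example: keep crawlers out of a private directory
Disallow: /private/
# Example: allow everything else (an empty Disallow permits all)
Disallow:
```

Placing a file like this in your site's root directory lets well-behaved crawlers know which areas to skip; it does not block access by itself, since compliance is voluntary.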
Related Articles

How do I hide my domain WHOIS information?

If you do not wish to have your contact information shown when someone does a WHOIS report on...

What are all these .htaccess files?

.htaccess files are used to perform certain rules such as displaying a 404.shtml file in replace...

When I upload an .htaccess file it disappears.

On unix/linux systems, files that start with a . are considered hidden so many ftp programs do...

Unable to view Ultrawebhosting.com from my foreign browser.

We are an American company based out of Seattle, WA. Our website is written in English as a...

Domain Slamming - Beware

There are several companies out there using a predatory practice of attempting to have you renew...