Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains specific rules that a robot/spider should abide by when crawling your site, so requests for that file will show up in your web statistics even if you have not created one.
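A minimal robots.txt looks something like the sketch below. The user-agent wildcard and the disallowed paths are illustrative examples, not requirements; adjust them for your own site:

```
# Rules for all crawlers
User-agent: *

# Ask crawlers not to index these directories (example paths)
Disallow: /cgi-bin/
Disallow: /tmp/
```

The file must be plain text and placed in your site's document root so it is reachable at /robots.txt. If your stats show 404 errors for /robots.txt, it simply means the file does not exist yet; crawlers will still index your site using their default behavior.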
