Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before spidering your site's content, and they request it whether or not the file actually exists, which is why it shows up in your web statistics. The robots.txt file contains specific rules a robot/spider should abide by when capturing data, such as which paths it may or may not crawl.
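As a rough illustration (the user-agent and paths below are only examples, not recommended settings), a simple robots.txt placed in your site's document root might look like this:

    # Apply to all robots/spiders
    User-agent: *
    # Ask them not to crawl these directories
    Disallow: /cgi-bin/
    Disallow: /private/

If no robots.txt is present, the spider's request for it will simply be logged (often as a 404) and crawling proceeds normally.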
