Why do I have references to robots.txt in my web stats program?

Many search engines will look for a robots.txt file before spidering your site's content. The robots.txt file contains specific rules that a robot or spider should abide by when crawling and indexing your pages, so requests for it show up in your web statistics even if the file does not exist.
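As an illustration, here is a minimal, hypothetical robots.txt (the paths and domain are assumptions for the example) checked with Python's standard-library urllib.robotparser, which applies the same rules a well-behaved spider would:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: block /private/, allow everything else.
rules = """\
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A compliant spider may fetch the home page...
print(parser.can_fetch("*", "http://example.com/index.html"))   # True
# ...but should skip anything under /private/.
print(parser.can_fetch("*", "http://example.com/private/data")) # False
```

In practice the file just lives at the root of your site (e.g. /robots.txt), and creating even an empty one will stop the "404 Not Found" entries for it from appearing in your logs.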

