Why do I have references to robots.txt in my web stat programs?

Many search engines request a robots.txt file before crawling (spidering) your site's content, which is why these requests show up in your web statistics. The robots.txt file contains specific rules that a robot/spider should abide by when capturing data from your site.
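As an illustration only, a minimal robots.txt placed at the root of your site might look like the following (the /private/ path is a hypothetical example):

    User-agent: *
    Disallow: /private/
    Crawl-delay: 10

The first line applies the rules to all robots, the second asks them to skip the /private/ directory, and the third asks them to wait 10 seconds between requests (note that not all crawlers honor Crawl-delay). If no robots.txt file exists, the request is simply logged as a 404, which is harmless.

In a typical Apache combined-format access log, such a request might appear as the line below (the IP address, date, and sizes are made up for illustration); this is the entry your web stat program is counting:

    66.249.66.1 - - [12/Mar/2024:10:15:30 +0000] "GET /robots.txt HTTP/1.1" 200 54 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"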
