Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains rules a robot/spider should abide by when crawling your site, such as which paths it may or may not index. Because crawlers request /robots.txt before anything else, those requests show up in your web statistics whether or not the file actually exists.
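As a minimal sketch, a robots.txt file placed in your site's document root might look like the following; the disallowed paths and the bot name are hypothetical examples:

    # Rules for all robots/spiders
    User-agent: *
    # Hypothetical paths crawlers should not index
    Disallow: /cgi-bin/
    Disallow: /private/

    # Block one specific crawler entirely (hypothetical name)
    User-agent: ExampleBot
    Disallow: /

Even if you never create the file, crawlers will still request /robots.txt, and that request (typically a 404) is recorded in your logs and statistics. Creating even an empty robots.txt simply turns those entries into ordinary successful responses.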

