Why do I have references to robots.txt in my web stat programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains rules that a robot/spider should follow when crawling the site, such as which directories it may or may not index. Because crawlers request this file automatically, those requests appear in your web statistics even if the file does not exist on your site. A minimal example is shown below.
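As an illustration only (the directory names here are hypothetical, not rules you need to adopt), a robots.txt file placed in your site's document root might look like this:

    # Apply these rules to all crawlers
    User-agent: *
    # Ask crawlers not to index these (example) directories
    Disallow: /cgi-bin/
    Disallow: /tmp/

If you do not want to restrict crawlers at all, a robots.txt containing just "User-agent: *" followed by an empty "Disallow:" line allows everything to be indexed while still preventing "file not found" entries for robots.txt in your logs.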
