Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains the rules a robot/spider should follow when crawling your site, such as which directories it should not index. Because crawlers request this file whether or not it actually exists, those requests appear in your web statistics programs.
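
As an illustration, a minimal robots.txt placed in your site's root directory might look like the sketch below. The directory name /cgi-bin/ is only a placeholder; adjust the rules to match your own site.

    # Apply these rules to all robots/spiders
    User-agent: *
    # Ask crawlers not to index the /cgi-bin/ directory
    Disallow: /cgi-bin/

If you do not want to restrict crawlers at all, an empty robots.txt is enough to stop the lookups from being logged as missing-file errors.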
