Why do I have references to robots.txt in my web statistics programs?

Many search engines will look for a robots.txt file before spidering your site's content. The robots.txt file contains specific rules that a robot/spider should abide by when capturing data, which is why requests for it appear in your web statistics.
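As a rough illustration of how those rules are interpreted, here is a minimal sketch using Python's standard urllib.robotparser module. The robots.txt rules, the bot name, and the URLs are made-up examples, not rules from your site:

    # Sketch: how a well-behaved crawler checks robots.txt before fetching pages.
    from urllib.robotparser import RobotFileParser

    # Hypothetical robots.txt content a site owner might publish.
    robots_txt = """
    User-agent: *
    Disallow: /private/
    Crawl-delay: 10
    """

    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())

    # The spider checks each URL against the rules before requesting it.
    print(parser.can_fetch("ExampleBot", "https://example.com/index.html"))  # True
    print(parser.can_fetch("ExampleBot", "https://example.com/private/x"))   # False

Because the crawler has to request /robots.txt first to learn these rules, that request is logged just like any other hit, even if the file does not exist on your site.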
