Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains rules that a robot/spider should abide by when crawling and indexing your pages, so those requests show up alongside normal visitor traffic in your web statistics.
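For reference, a well-behaved spider downloads /robots.txt first and then checks every URL against the rules it found there before fetching it. The short Python sketch below illustrates that check using the standard urllib.robotparser module; the rule set, the bot names, and the example.com URLs are made up for illustration, not taken from any real site.

    # Minimal sketch: how a compliant crawler applies robots.txt rules.
    from urllib.robotparser import RobotFileParser

    # Hypothetical robots.txt contents, one rule per line.
    sample_rules = [
        "User-agent: *",
        "Disallow: /admin/",
        "",
        "User-agent: BadBot",
        "Disallow: /",
    ]

    rp = RobotFileParser()
    rp.parse(sample_rules)

    # A compliant crawler asks before fetching each URL.
    print(rp.can_fetch("GoodBot", "http://example.com/index.html"))   # True
    print(rp.can_fetch("GoodBot", "http://example.com/admin/panel"))  # False
    print(rp.can_fetch("BadBot", "http://example.com/index.html"))    # False

Because every crawler that respects the standard performs this lookup, you will see a request for /robots.txt in your logs even if the file does not exist on your site.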
