Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains rules a robot/spider should abide by when crawling your site. Each request for /robots.txt is recorded in your access logs like any other hit, which is why it shows up in your web statistics even if the file does not exist (in that case the request is simply logged as a 404).
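As a minimal sketch, a robots.txt file placed in your site's root directory (for example, public_html) might look like the lines below. The disallowed path and sitemap URL are placeholders for illustration only:

    User-agent: *
    Disallow: /private/

    Sitemap: https://www.example.com/sitemap.xml

Here "User-agent: *" applies the rule to all spiders, "Disallow" lists paths the spider should skip, and the optional "Sitemap" line points crawlers to your sitemap.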
