Why do I have references to robots txt in my web stat programs?

Most search engines request a robots.txt file before crawling (spidering) your site's content. The robots.txt file contains rules that a well-behaved robot/spider should follow when capturing data from your pages. Because crawlers request this file automatically, each request is logged and appears in your web statistics, even if the file does not exist on your site.
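If you would like to create one, a minimal robots.txt might look like the following (the directory name and sitemap URL below are only examples; adjust them to your own site):

```
User-agent: *
Disallow: /private/
Sitemap: https://example.com/sitemap.xml
```

Upload the file to the root of your web directory so crawlers can reach it at https://yourdomain.com/robots.txt.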
