Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before spidering (crawling) your site's content. The robots.txt file contains rules that a robot/spider should abide by when capturing data, so each crawler's request for that file shows up in your web statistics, even if the file does not exist on your site.
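For example, a minimal robots.txt placed in your site's document root might look like the sketch below. The /private/ directory and the sitemap URL are just placeholders for whatever you want to keep out of, or point search engines toward, on your own site:

    User-agent: *
    Disallow: /private/
    Sitemap: https://www.example.com/sitemap.xml

If you do not want to restrict crawlers at all, an empty robots.txt (or no Disallow lines) will simply stop the "file not found" entries from appearing in your logs.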

