Why do I have references to robots.txt in my web statistics programs?

Many search engines will request a robots.txt file from your site's root directory before spidering (crawling) your site's content. The robots.txt file contains specific rules a robot/spider should abide by when capturing data, such as which directories it may or may not index. Those automatic requests are what show up in your web statistics, even if you never created the file.
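A minimal example of what such a file can look like (the paths and the bot name below are placeholders, not rules you need to copy):

```
# Allow all robots to crawl the site, but keep them out of /private/
User-agent: *
Disallow: /private/

# Block one specific crawler entirely (ExampleBot is a placeholder name)
User-agent: ExampleBot
Disallow: /
```

If no robots.txt exists, most well-behaved crawlers simply treat the site as fully crawlable; the 404 error from the missing file is harmless.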

