Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before crawling (spidering) your site's content, which is why requests for it show up in your web statistics. The robots.txt file contains rules that a robot/spider should follow when capturing data from your site.
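If you would like to provide one, a minimal robots.txt can be uploaded to your site's document root (typically the public_html directory). The example below is only a sketch; the paths and sitemap URL are placeholders that you should adjust for your own site:

    # Apply these rules to all crawlers
    User-agent: *
    # Ask crawlers not to index this example directory (placeholder path)
    Disallow: /private/
    # Everything else may be crawled
    Allow: /
    # Optional: point crawlers at your sitemap (placeholder URL)
    Sitemap: https://example.com/sitemap.xml

If no robots.txt exists, crawlers simply receive a 404 response and crawl the site normally, so the entries in your statistics are harmless.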
