Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains rules a robot/spider should abide by when crawling and capturing data. Because every crawler requests this file, those requests are recorded in your access logs and show up in your web statistics reports, even if you never created the file yourself.
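
For example, a minimal robots.txt placed at the root of your site might look like the following sketch; the /private/ directory name is purely illustrative:

    # Applies to every crawler that honors robots.txt
    User-agent: *
    # Keep compliant crawlers out of this (illustrative) directory
    Disallow: /private/

Crawlers that honor the standard will skip /private/ while still indexing the rest of the site, and the request for robots.txt itself will still appear in your logs.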

Related Articles

Cross Origin Request Error

When receiving a cross-origin request, the CORS header 'Access-Control-Allow-Origin'...

How can I redirect http to https?

Create an .htaccess file with the following contents and upload it to your public_html directory....

“Not Secure” Web Error

Browsers now default to using https:// instead of http://. We make SSL...

vBulletin fsockopen() [function.fsockopen]: unable to connect

When setting up mail in vBulletin, be sure you are using localhost for the mail server along with...

Why can I not ping my domain?

For security purposes we have blocked ICMP traffic, which is used for ping.