Why do I have references to robots.txt in my web statistics programs?

Many search engines will look for a robots.txt file before spidering your site's content, which is why requests for it appear in your web statistics even if the file does not exist. The robots.txt file contains rules a robot/spider should abide by when crawling your site.
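For example, a minimal robots.txt placed in the site's document root might look like the following (the paths shown are placeholders for directories you do not want crawled):

  User-agent: *
  Disallow: /cgi-bin/
  Disallow: /tmp/

Here User-agent selects which robots the rules apply to (* matches all), and each Disallow line names a path crawlers should skip. Because crawlers request /robots.txt before anything else, the file shows up in your logs and stats whether or not you ever created it.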


Related Articles

What is the location of curl?

The curl binary is located at /usr/bin/curl

Set CORS header to Allow Access for any Incoming Domain

The following may be used to always set the CORS header for any incoming domains without...

When I type in my domain it goes to ultrawebhosting.com

You may have the previous server's IP address cached on your system. Try a reboot. If you run WinNT,...

Show longer file names in directory/folder index

If you've ever wanted to show longer file names in your directory listing, just create or edit an...

Troubleshooting Cloudflare errors

You may encounter errors using Cloudflare from time to time. This article may help to resolve...