Why do I have references to robots.txt in my web stat programs?

Many search engines look for a robots.txt file before spidering (crawling) your site's content. The robots.txt file contains rules a robot/spider should abide by when collecting data, such as which directories it may or may not index. Because crawlers request this file before anything else, those requests appear in your web statistics even if you never created the file on your site.
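For illustration, the short sketch below shows how a well-behaved crawler might consult robots.txt before fetching a page. It uses Python's standard urllib.robotparser module; the site address and the "ExampleBot" user-agent name are hypothetical placeholders, not values from this article.

    from urllib import robotparser

    # A polite crawler downloads the site's robots.txt first --
    # this is the request that shows up in your web statistics.
    rp = robotparser.RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")  # hypothetical site
    rp.read()

    # It then checks each URL against the rules it found before crawling it.
    page = "https://www.example.com/private/report.html"
    if rp.can_fetch("ExampleBot", page):
        print("Allowed to crawl this page")
    else:
        print("robots.txt disallows this page")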

