Why do I have references to robots.txt in my web stats programs?

Many search engines look for a robots.txt file before spidering (crawling) your site's content, which is why requests for that file appear in your access logs and web stats programs. The robots.txt file contains specific rules a robot/spider should abide by when capturing data from your site.
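For example, a minimal robots.txt placed in your site's document root might look like the following sketch (the paths and sitemap URL are hypothetical; adjust them to your own site):

    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/

    Sitemap: https://www.example.com/sitemap.xml

Each User-agent line names which crawler the rules apply to (* matches all), and each Disallow line lists a path that crawler should skip. Well-behaved crawlers fetch this file before anything else, so the requests show up in your statistics even if the file does not exist.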
