Why do I have references to robots.txt in my web stat programs?

Many search engines request a robots.txt file before spidering (crawling) your site's content, which is why those requests show up in your web statistics. The robots.txt file contains specific rules a robot/spider should abide by when capturing data from your site.
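For reference, a minimal robots.txt placed in your site's document root might look like the sketch below; the example.com URL and the directory names are placeholders, not paths from your account.

    # Apply these rules to all robots/spiders
    User-agent: *

    # Ask robots not to crawl these example directories
    Disallow: /cgi-bin/
    Disallow: /tmp/

    # Optionally point crawlers at your sitemap
    Sitemap: https://www.example.com/sitemap.xml

A well-behaved robot skips anything listed under Disallow; the entries you see in your statistics are usually just robots checking whether such a file exists.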

Related Articles

htaccess referral redirect

It is often useful to redirect a visitor to a page based on the referring website. There are...

Inode issues

There are many issues that can arise from reaching your inode limit. The inode limit is the total...

Common Search Engine Optimization Tips

Get other websites to link to you. Have your website linked and posted in social media...

Point my domain to my Wix account

How can I point my domain name to my wix.com account? To set this up you will want to log in to...

htaccess pcfg_openfile: unable to check htaccess file, ensure it is readable

To solve “.htaccess pcfg_openfile: unable to check htaccess file, ensure it is readable” follow...