Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains specific rules a robot/spider should abide by when crawling your site. Those requests are recorded by your web statistics software, which is why robots.txt shows up in your reports even if the file does not exist on your account.
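
For example, a minimal robots.txt file placed in your site's document root might look like the sketch below; the directory names are placeholders rather than paths that actually exist on your account:

    # Rules that apply to all robots/spiders
    User-agent: *
    # Directories the spider is asked not to crawl (placeholder paths)
    Disallow: /cgi-bin/
    Disallow: /private/

Publishing the file (even an empty one) simply stops these requests from appearing as "file not found" entries in your logs.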

Related Articles

How do I parse .html files as .shtml?

Create an .htaccess file in the required directory and add the following: AddHandler...

Change your Website's Favorite Icon

How to change the icon your web browser displays for your website. Create a 16x16 pixel icon image...

My site was hacked

This typically occurs when you are running a script/app on your website that is outdated and...

Is it possible to create a cron job to back up my database at specific times?

Yes, this can be very useful for forum/message board sites with large databases. In your control...

I need a module installed but I do not have rights

No problem. Send us a trouble ticket and we will install the module as long as there are no...