Why do I have references to robots.txt in my web stat programs?

Many search engines look for a robots.txt file before spidering (crawling) your site's content, so requests for /robots.txt will show up in your web statistics even if you have never created the file. The robots.txt file contains rules that a robot/spider should follow when capturing data from your site.
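
For example, a minimal robots.txt placed in the root of your site might look like the following sketch (the directory names are only placeholders, not paths that exist on your account):

  # Applies to all robots/spiders
  User-agent: *
  # Ask robots not to crawl these example directories
  Disallow: /cgi-bin/
  Disallow: /tmp/

Well-behaved spiders request this file before anything else, which is why the file name appears in your logs and statistics whether or not the file exists.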
