Why do I have references to robots.txt in my web stat programs?

Many search engines will look for a robots.txt file before spidering your site's content. The robots.txt file contains the rules a robot/spider should abide by when crawling your site, which is why requests for it show up in your web statistics.
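As a minimal sketch, a robots.txt placed in your site's document root (for example public_html) might look like the following; the disallowed paths are only illustrative:

    # Rules for all robots/spiders
    User-agent: *
    # Example directories to keep out of search results
    Disallow: /cgi-bin/
    Disallow: /tmp/

Anything not listed under Disallow remains crawlable, and a robot identifies which rule block applies to it by matching the User-agent line.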


Related Articles

When should I use ascii and when should I use binary?

Many FTP client programs support an auto mode to detect the proper upload type based on file...

How can I turn off directory indexing?

In the directory where you wish to turn off directory indexing, you can do it under Index Manager...

Enable Mod Rewrite

Create an .htaccess file with the following contents and upload it to your public_html directory....

Show longer file names in directory/folder index

If you've ever wanted to show longer file names in your directory listing just create or edit an...

Redirect http to https and www

To forward a website to use both www. and https:// use the following in an .htaccess file:...