Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains rules a robot/spider should abide by when crawling your pages. Every request for /robots.txt is recorded in your access logs, which is why the file appears in your statistics even if it does not exist on your account.
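
As an illustration, a minimal robots.txt placed in your site's document root (public_html) might look like the sketch below. The directory and sitemap URL shown are hypothetical examples, not values from your account:

    # Applies to all crawlers
    User-agent: *

    # Hypothetical example: keep crawlers out of a private directory
    Disallow: /private/

    # Optional: point crawlers to your sitemap (URL is an example)
    Sitemap: https://www.example.com/sitemap.xml

If no Disallow lines match a URL, crawlers treat it as allowed, so an empty file (or no file at all) simply permits everything.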
