Why do I have references to robots.txt in my web stats programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains rules that a robot/spider should follow when crawling your site, so those requests show up in your web statistics.
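As an illustration, a minimal robots.txt placed in your site's document root might look like the following (the `/private/` path is just a placeholder for whatever directory you want to keep out of search results):

```
# Apply these rules to all robots/spiders
User-agent: *
# Ask robots not to crawl the /private/ directory
Disallow: /private/
```

An empty `Disallow:` line would instead permit crawling of the entire site. Note that robots.txt is advisory: well-behaved spiders honor it, but it is not an access control mechanism.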
