Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before crawling your site's content. The robots.txt file contains rules that a robot/spider should follow when capturing and indexing data, which is why these requests appear in your web statistics.
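As a rough illustration, a minimal robots.txt placed at the root of your site might look like the sketch below. The /private/ path and the sitemap URL are only placeholders, not paths from your account:

    # Apply these rules to all crawlers
    User-agent: *
    # Ask crawlers not to index this example directory
    Disallow: /private/
    # Optionally point crawlers at your sitemap (placeholder URL)
    Sitemap: https://www.example.com/sitemap.xml

If no robots.txt exists, the request simply returns a 404 and crawlers proceed with their default behavior, so the log entries are harmless.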

