Why do I have references to robots.txt in my web stat programs?

Many search engines request a robots.txt file before crawling (spidering) your site's content. The robots.txt file contains rules that a robot/spider should follow when capturing data. Because each crawler fetches this file, the requests show up as hits in your web statistics, even if the file does not exist on your site.
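As an illustration, a minimal robots.txt placed in your site's document root might look like the following; the directory name is a hypothetical example:

```
# Rules for all crawlers
User-agent: *
# Keep crawlers out of a directory you don't want indexed (hypothetical path)
Disallow: /private/
```

If no robots.txt is present, well-behaved crawlers simply index the whole site, but the request for the missing file is still logged, which is why the references appear in your statistics.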

