Why do I have references to robots.txt in my web stat programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains specific rules a robot/spider should follow when crawling your site, which is why requests for it appear in your web statistics.
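
For example, a minimal robots.txt placed in your site's root directory might look like the following (the directory name and bot name are illustrative placeholders, not specific to any site):

    # Allow all robots, but keep them out of a private directory
    User-agent: *
    Disallow: /private/

    # Block one specific crawler entirely
    User-agent: ExampleBot
    Disallow: /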

Related Articles

Error 508 / 503 - Resource Limit Reached

To help maintain the stability of servers and keep websites fast, UltraWebHosting has resource...

Adding a Facebook Button

Facebook has a wonderful page on adding a Facebook image to your website complete with...

Error 401 Unauthorized

This can occur when a web page requires authorization or a login to view the contents. If none...

I cannot move out of the / directory, why not?

For security reasons, you cannot move out of the root folder of your own domain.

Why do I get emails for cron jobs?

This is the default. If you would like to disable receiving emails when a cron job runs append...