Why do I have references to robots.txt in my web statistics programs?

Many search engines will request a robots.txt file before spidering your site's content. The robots.txt file contains rules that a robot or spider should abide by when crawling your site. Every one of those requests is recorded in your access logs, so references to robots.txt appear in your web statistics even if the file does not exist on your site.
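If you would like to create one, a minimal robots.txt placed in your site's document root might look like the following (the `/private/` path is just an illustrative example):

```
# Apply these rules to all robots/spiders
User-agent: *
# Block crawling of one example directory
Disallow: /private/
# Allow everything else (an empty Disallow permits all)
Disallow:
```

Well-behaved spiders fetch this file first and skip any paths listed under `Disallow`.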


Related Articles

How do I remove a web disk?

In Windows, go to My Network Places on your PC, find the connection, right-click, and delete....

Why is my account suspended?

Why is my account suspended? If you are receiving an account suspended page when trying to access...

Error 508 / 503 - Resource Limit Reached

To help maintain the stability of servers and keep websites fast, UltraWebHosting has resource...

When I upload an htaccess file it disappears

On Unix/Linux systems, files that start with a . are considered hidden, so many FTP programs do...

How do I create a helpdesk ticket?

You may create a helpdesk ticket at any time by visiting https://my.ultrawebhosting.com and...