Why do I have references to robots.txt in my web statistics programs?

Many search engines will request a robots.txt file before spidering your site's content. The robots.txt file contains specific rules a robot/spider should abide by when crawling the site, such as which directories it may or may not index. Because crawlers request this file, those requests are logged and appear in your web statistics even if the file does not exist on your site.
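If you would like to provide one, a minimal robots.txt placed in your site's document root might look like the following; the directory names here are only placeholders for areas you do not want crawled:

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /tmp/

Leaving the Disallow value empty (or having no robots.txt at all) tells well-behaved crawlers they may index the entire site.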
