Why do I have references to robots.txt in my web stat programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains specific rules that a robot/spider should abide by when crawling and indexing your pages, which is why requests for it show up in your web statistics.
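As a minimal sketch, a robots.txt placed in your site's document root might look like the following; the crawler names, paths, and sitemap URL here are illustrative only, not values required by your account:

    # Rules for all crawlers
    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /private/

    # Block one specific (hypothetical) crawler entirely
    User-agent: ExampleBot
    Disallow: /

    # Optional: point crawlers to your sitemap
    Sitemap: https://www.example.com/sitemap.xml

If no robots.txt exists, crawlers simply receive a 404 response and continue spidering, so the entries in your statistics are harmless.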

Related Articles

Why does it show the Ultra Web Hosting home page when I go to my site?

There are several reasons for this. Ensure you do not have https at the beginning of your...

Can I set up SSL on my account?

Yes, you can purchase a secure certificate through us so you can use SSL. You can also give us...

Adding a Facebook Button

Facebook has a wonderful page on adding a Facebook image to your website complete with...

htaccess - Allow Let's Encrypt to Validate and Renew

When receiving the following Let's Encrypt error message: domain.com: The SSL certificate expires...

When should I use ASCII and when should I use binary?

Many FTP client programs support an auto mode to detect the proper upload type based on file...