Why do I have references to robots.txt in my web statistics programs?

Many search engines look for a robots.txt file before spidering your site's content. The robots.txt file contains rules that a robot (spider) should abide by when crawling your site, such as which directories it may or may not index. Because crawlers request this file automatically, those requests are recorded in your server logs and show up in your web statistics, even if the file does not exist on your site.
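If you want to control what crawlers index, a minimal robots.txt placed in your site's document root might look like the following sketch (the directory names are hypothetical examples, not paths that necessarily exist on your account):

    # Rules for all crawlers
    User-agent: *
    # Hypothetical directories to keep out of search results
    Disallow: /cgi-bin/
    Disallow: /private/

A file containing only "User-agent: *" followed by an empty "Disallow:" line permits crawling of the entire site; omitting robots.txt altogether has the same effect, although crawlers will still request the file and those requests will continue to appear in your statistics.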
