Why do I have references to robots.txt in my web statistics programs?

Most search engines request a robots.txt file from your site before spidering (crawling) its content, so those requests show up in your web statistics. The robots.txt file contains rules that a well-behaved robot/spider should abide by when capturing data, such as which paths it may or may not crawl.
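As an illustration, here is how a crawler might interpret such rules, using Python's standard urllib.robotparser module. The robots.txt content and the example.com URLs below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content a spider might fetch from
# https://example.com/robots.txt before crawling the site.
robots_txt = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Paths outside the Disallow rules may be crawled by any agent.
print(parser.can_fetch("*", "https://example.com/index.html"))         # True
print(parser.can_fetch("*", "https://example.com/private/data.html"))  # False
```

A spider that honors these rules would skip the disallowed paths; note that robots.txt is advisory, and badly behaved robots may ignore it entirely.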
