Many search engines will look for a robots.txt file before spidering (crawling) your site's content. The robots.txt file contains specific rules that a robot/spider should abide by when capturing data from your site.
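
For example, a minimal robots.txt placed in your site's document root (often public_html on shared hosting accounts) might look like the following; the /cgi-bin/ and /tmp/ paths are only placeholders for directories you do not want indexed:

User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/

The "User-agent: *" line applies the rules to all robots, and each "Disallow" line names a path that well-behaved spiders should skip when crawling your site.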
