Results 1 to 3 of 3
09-28-2004, 08:18 AM #1
I have a site that several other domain names redirect to, and I don't want search engines to index those extra domains as if they were separate sites. I was told I should make a robots.txt file, but all the explanations I can find only cover telling the crawler which pages to crawl, nothing about handling several domain names.
Any help on this?
10-04-2004, 09:01 AM #2
The Web Robots Page at http://www.robotstxt.org/wc/robots.html can provide the information you seek. To block crawlers on multiple domains, you'll need to place a robots.txt file at the root of each domain you want excluded.
The file would include two simple lines.
User-agent: *
Disallow: /
You can also use META tags on your pages; those options are covered on the site linked above.
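For reference, a robots META tag typically looks like this (a minimal sketch; the exact content values depend on what you want to allow, and it must go on every page you want excluded):

```html
<!-- Place inside <head>; tells compliant crawlers not to index
     this page and not to follow its links -->
<meta name="robots" content="noindex, nofollow">
```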
Hope this helps!
11-27-2004, 03:09 PM #3
If I am understanding your problem correctly, you can also use the .htaccess file with 301 redirects to handle this.
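As a sketch, assuming Apache with mod_rewrite enabled and placeholder domain names (example-alias.com as the extra domain, example.com as the main one), the .htaccess for the alias might look like:

```apache
# Permanently (301) redirect every request arriving on the alias
# domain to the main domain, preserving the requested path.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example-alias\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

With a 301 in place, search engines should consolidate the alias under the main domain rather than indexing it separately.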
[Admin: Link Removed]
Last edited by QuietDean; 11-27-2004 at 04:16 PM. Reason: Remove link