
Specify robots.txt using IIS for different sub-domains

I have a site at, say, www.example.com, and two staging platforms at beta.example.com and preview.example.com. I need a way to serve a different robots.txt for each, using IIS or something similar.

The reason is that I want to disallow spiders on all but the www domain, because they are currently indexing duplicated content from the staging hosts.

Anyone know if this is possible?

Is there a reason you can't just have three different robots.txt files, one for each host? Each host name is usually bound to its own IIS site, so each site root can hold its own copy.

If you need to handle this automatically, I would suggest an HTTPHandler that intercepts requests for robots.txt. If Request.Url.Host is www.example.com, return robots.allow.txt; otherwise, return robots.deny.txt.
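A minimal sketch of that idea, assuming ASP.NET (System.Web); the class name and the two file names (robots.allow.txt, robots.deny.txt) are illustrative placeholders, and both files are assumed to exist in the site root:

```csharp
using System;
using System.Web;

// Hypothetical handler name; register it for the "robots.txt" path.
public class RobotsHandler : IHttpHandler
{
    // The handler keeps no per-request state, so it can be reused.
    public bool IsReusable
    {
        get { return true; }
    }

    public void ProcessRequest(HttpContext context)
    {
        // Serve the permissive file only on the production host;
        // every other host (beta, preview, ...) gets the deny file.
        string file = string.Equals(context.Request.Url.Host,
                                    "www.example.com",
                                    StringComparison.OrdinalIgnoreCase)
            ? "robots.allow.txt"
            : "robots.deny.txt";

        context.Response.ContentType = "text/plain";
        context.Response.TransmitFile(context.Server.MapPath("~/" + file));
    }
}
```

One advantage of this approach over static files is that the same deployment can be copied to every environment unchanged; the host header alone decides which rules a crawler sees.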

If you are interested in this idea and need some example code to get you started, let me know.
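For such a handler to receive requests for robots.txt under IIS 7+ in integrated pipeline mode, it would also need to be registered in web.config. A sketch, where the name and type values are placeholders for your own handler class:

```xml
<configuration>
  <system.webServer>
    <handlers>
      <!-- Placeholder type: point this at the handler class in your project -->
      <add name="RobotsTxt" path="robots.txt" verb="GET"
           type="MyApp.RobotsHandler" />
    </handlers>
  </system.webServer>
</configuration>
```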

