I'd like requests for robots.txt on the dev. subdomain
to be served from a specific static robots.txt file.
What routing rule do I need?
This is what I'm thinking:
if ($host = "dev.example.com") {
    location ^~ /robots.txt {
        allow all;
        root /static/disallow/robots.txt;
    }
}
Based on Nginx subdomain configuration, I believe I may need to just make separate server blocks and use includes. If a simple routing rule won't do it, is the includes method how this is typically done?
If you want a specific robots.txt for each subdomain, you can do so with separate server blocks like this, which you allude (and link) to in your question:
server {
    server_name subdomainA.domain.com;
    include /common/config/path;

    location = /robots.txt {
        root /subdomainA/path;
    }
}

server {
    server_name subdomainB.domain.com;
    include /common/config/path;

    location = /robots.txt {
        root /subdomainB/path;
    }
}
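Note that root names a directory, and Nginx appends the request URI to it, so /subdomainA/path must be a directory containing a file named robots.txt. If your file lives elsewhere or has a different name, alias maps the exact location to a file instead (the path below is a placeholder, not from your setup):

    location = /robots.txt {
        # alias substitutes the whole matched location,
        # so it can point directly at a single file
        alias /srv/static/disallow-robots.txt;
    }

This also explains why your original attempt's root /static/disallow/robots.txt; would not work: Nginx would look for /static/disallow/robots.txt/robots.txt.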
Regarding your other approach, have you read If is Evil?
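If you would rather keep a single server block for all subdomains, an if-free alternative is to select the document root per host with a map directive (which must live in the http context). A minimal sketch, with placeholder paths and hostnames:

    # In the http context: pick a robots.txt directory based on $host
    map $host $robots_root {
        default          /srv/robots/default;
        dev.example.com  /srv/robots/disallow;
    }

    server {
        server_name example.com dev.example.com;

        location = /robots.txt {
            # root accepts variables, so each host gets its own file
            root $robots_root;
        }
    }

Each directory then holds its own robots.txt, and no if block is needed.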