I'm looking for best practices, or at least common practices. I need to tell search engines not to index any sites on my (RHL) Apache development server.
I see multiple ways folks have done this, but specifically I have seen advice to add the following to the httpd.conf file:
#(method 1)
Header add X-Robots-Tag "noindex, nofollow"
This does render the header I need, but I have also seen:
#(method 2)
<Directory />
# Globally disallow robots from the development server
Header set X-Robots-Tag "noindex, nofollow"
</Directory>
This also renders the header I need (if both are present, this one seems to overwrite the previous).
My question is, is there a significant difference between method 1 and method 2, and if so what is it?
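For illustration, here is a sketch of both placements in one httpd.conf (values assumed, not from any particular production config). The practical difference is scope: a top-level Header directive applies to every response the server generates, while one inside <Directory /> applies only to requests that map to the filesystem, so responses that bypass the filesystem (e.g. proxied content or handler-generated pages) may not get the header:

```apache
# Method 1: server-config scope. Applies to all responses this
# server generates, including proxied or handler-generated ones.
Header set X-Robots-Tag "noindex, nofollow"

# Method 2: filesystem scope. <Directory /> matches only requests
# that map to a path on disk; mod_proxy responses, for example,
# are not covered. Directory sections are merged late, which is
# why this value appears to win when both are present.
<Directory />
    Header set X-Robots-Tag "noindex, nofollow"
</Directory>
```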
I may also throw in this sub-question: is
Header set X-Robots-Tag "noindex, nofollow"
enough, or should the header be:
Header add X-Robots-Tag "noindex, nofollow, noarchive, nosnippet"
Finally
Header set ...
or
Header add ...
Either one worked; is there any difference?
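As a hedged sketch of the set/add distinction (per the mod_headers documentation): "set" replaces any existing header of the same name, while "add" always appends a new header line, so "add" can produce duplicate X-Robots-Tag headers if the directive is merged in from more than one scope:

```apache
# "add" appends unconditionally; if the header is also added
# elsewhere, the response will carry two X-Robots-Tag lines:
Header add X-Robots-Tag "noindex"
Header add X-Robots-Tag "nofollow"

# "set" replaces any existing value, guaranteeing exactly one
# X-Robots-Tag header in the response:
Header set X-Robots-Tag "noindex, nofollow"
```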
Forgive me if this is too many questions; I'll split them up if needed, but in my mind they are all related to best/common practices.
https://developers.google.com/search/reference/robots_meta_tag#directives
Header set X-Robots-Tag "noindex, nofollow" is fine.
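One optional refinement worth considering (an assumption about your needs, not a requirement): by default mod_headers only acts on successful responses, so adding the "always" condition makes the directive apply to error responses (e.g. 404s) as well:

```apache
# "always" places the header in the table that is also consulted
# for error responses, so 4xx/5xx pages carry it too.
Header always set X-Robots-Tag "noindex, nofollow"
```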