
Disable Google crawling of subdomains

I would like to know how I can disallow Google from crawling my subdomains.

I made a screenshot of my webspace folder. The awesom-media folder is the folder where the main site www.awesom-media.de lives.

The other folders are subdomains. What I want is for Google not to crawl those, but I don't know how.

I don't have a robots.txt in the awesom-media folder, but as you can see there is one in the / part, and its content is:

User-agent: *
Disallow:

And that's it.

How can I tell Google not to crawl the subdomains?

In case all your subdomains route directly to specific folders (e.g. automagazin.awesom-media.de uses the folder auto-magazin), just place a robots.txt with

User-agent: *
Disallow: /

in all your folders for the subdomains you want to disallow for Google. I guess these are auto-magazin and future-magazin (and maybe more).

Currently you have put it into the root folder, which Google probably cannot see at all. Just try to load [subdomain].awesom-media.de/robots.txt and see whether a robots.txt loads or not.
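To see the difference between the two files, you can check them offline with Python's standard-library robots.txt parser, which applies the same allow/disallow rules a crawler would. This is just an illustrative sketch; the URLs are made-up examples based on the hostnames in the question.

```python
from urllib.robotparser import RobotFileParser

# The asker's current robots.txt in the web root: an empty "Disallow:"
# allows everything, so crawlers may fetch any page.
root_rules = RobotFileParser()
root_rules.parse(["User-agent: *", "Disallow:"])
print(root_rules.can_fetch("Googlebot", "https://www.awesom-media.de/page.html"))
# → True (crawling allowed)

# The robots.txt suggested for each subdomain folder: "Disallow: /"
# blocks every path on that subdomain.
sub_rules = RobotFileParser()
sub_rules.parse(["User-agent: *", "Disallow: /"])
print(sub_rules.can_fetch("Googlebot", "https://auto-magazin.awesom-media.de/page.html"))
# → False (crawling blocked)
```

Note that robots.txt is per hostname, which is why each subdomain needs its own copy: the file at www.awesom-media.de/robots.txt says nothing about auto-magazin.awesom-media.de.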
