
Disallow all pagination pages in robots.txt

I don't want Google to crawl any of my pagination pages.

Here are some examples:

http://sample.com/blog-page/page/1

http://sample.com/blog-page/page/2

http://sample.com/blog-page/page/3

I also have some other pagination pages, like:

http://sample.com/product-category/product/page/1

http://sample.com/product-category/product/page/2

http://sample.com/product-category/product/page/3

Here is the code in my robots.txt:

User-agent: *
Disallow: */page/*
User-agent: *
Disallow: /blog-page/page/
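
If the goal is to block both sets of pagination URLs from the question, the same rule can simply be repeated for each path. A minimal sketch, assuming the paths are exactly as shown in the question:

User-agent: *
Disallow: /blog-page/page/
Disallow: /product-category/product/page/

Each Disallow line is matched as a simple path prefix, so every pagination path needs its own entry.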

You can use this to help you with generating your robots.txt:

http://tools.seobook.com/robots-txt/generator/

I think it should be:

User-agent: *
Disallow: /blog-page/page

The multiple asterisks are not necessary, and you have to make sure to include the whole path, since /page on its own would refer to the root of your site.
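
If you want to verify locally which URLs a given set of rules actually blocks, Python's standard urllib.robotparser module can parse the rules and answer can_fetch queries. A quick sketch, assuming the plain-prefix rules suggested above and the example URLs from the question; note that the standard-library parser only implements basic prefix matching, so a wildcard pattern like */page/* may behave differently here than in Google's crawler:

from urllib import robotparser

# Rules under test: plain path prefixes, one per pagination path
rules = """User-agent: *
Disallow: /blog-page/page
Disallow: /product-category/product/page
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

for url in (
    "http://sample.com/blog-page/page/1",
    "http://sample.com/product-category/product/page/2",
    "http://sample.com/blog-page/",  # not a pagination URL, should stay crawlable
):
    print(url, "->", "allowed" if rp.can_fetch("*", url) else "blocked")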
