Filter based on different values for the same field in different documents
Let's say I have the following data:
{
  "id": "1",
  "name": "John",
  "tag": "x"
},
{
  "id": 2,
  "name": "John",
  "tag": "y"
},
{
  "id": 3,
  "name": "Jane",
  "tag": "x"
}
I want to get the count of documents (unique on name) that have both tag = "x" and tag = "y".
Given the above data, the query should return 1, because only John has two documents covering both required tags.
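To make the intended semantics concrete, here is a plain-Python sketch of the desired AND logic (not Elasticsearch, just an illustration of the counting rule against the sample data):

```python
from collections import defaultdict

docs = [
    {"id": 1, "name": "John", "tag": "x"},
    {"id": 2, "name": "John", "tag": "y"},
    {"id": 3, "name": "Jane", "tag": "x"},
]

required = {"x", "y"}

# Collect the set of tags seen across all documents for each name.
tags_by_name = defaultdict(set)
for doc in docs:
    tags_by_name[doc["name"]].add(doc["tag"])

# Count the names whose documents cover every required tag.
count = sum(1 for tags in tags_by_name.values() if required <= tags)
print(count)  # → 1 (only John has both "x" and "y")
```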
What I have been able to do so far is a query that uses OR (so either tag = "x" or tag = "y"), which returns 2. For example:
"aggs": {
"distict_count": {
"filter": {
"terms": {
"tag": [
"x",
"y"
]
}
},
"aggs": {
"agg_cardinality_name": {
"cardinality": {
"field": "name"
}
}
}
}
}
Would it be possible to change that to use AND instead of OR?
Try putting cardinality under a terms agg to get proper distinct counts:
{
  "size": 0,
  "aggs": {
    "distict_count": {
      "filter": {
        "terms": {
          "tag": ["x", "y"]
        }
      },
      "aggs": {
        "agg_terms": {
          "terms": {
            "field": "name"
          },
          "aggs": {
            "agg_cardinality_name": {
              "cardinality": {
                "field": "name"
              }
            }
          }
        }
      }
    }
  }
}
CORRECTION
You can use a combination of cardinality aggs with a bucket_selector, which'll rule out buckets where there are fewer than 2 unique tags -- i.e. both x and y:
{
  "size": 0,
  "aggs": {
    "distict_count": {
      "filter": {
        "terms": {
          "tag": ["x", "y"]
        }
      },
      "aggs": {
        "agg_terms": {
          "terms": {
            "field": "name"
          },
          "aggs": {
            "agg_cardinality_tag2": {
              "bucket_selector": {
                "buckets_path": {
                  "unique_tags_count": "unique_tags_count"
                },
                "script": "params.unique_tags_count > 1"
              }
            },
            "unique_tags_count": {
              "cardinality": {
                "field": "tag"
              }
            },
            "unique_names_count": {
              "cardinality": {
                "field": "name"
              }
            }
          }
        }
      }
    }
  }
}
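Run against the three sample documents, the aggregations part of the response should look roughly like the following (a sketch of the standard terms-aggregation response shape, not captured output; top-level took/hits metadata and bucket-error fields omitted):

```json
"aggregations": {
  "distict_count": {
    "doc_count": 3,
    "agg_terms": {
      "buckets": [
        {
          "key": "John",
          "doc_count": 2,
          "unique_tags_count": { "value": 2 },
          "unique_names_count": { "value": 1 }
        }
      ]
    }
  }
}
```

Jane's bucket (unique_tags_count = 1) is dropped by the bucket_selector, so the number of surviving buckets is the count you're after -- here, 1.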