chu*_*ull 13 elasticsearch elasticsearch-5
I want to get the count of groups that satisfy a certain condition. In SQL terms, I want to do the following in Elasticsearch:
SELECT COUNT(*) FROM
(
    SELECT
        senderResellerId,
        SUM(requestAmountValue) AS t_amount
    FROM
        transactions
    GROUP BY
        senderResellerId
    HAVING
        t_amount > 10000
) AS dum;
So far, I can group by senderResellerId using a terms aggregation, but when I apply the filter it does not work as expected.
Elasticsearch request:
{
  "aggregations": {
    "reseller_sale_sum": {
      "aggs": {
        "sales": {
          "aggregations": {
            "reseller_sale": {
              "sum": {
                "field": "requestAmountValue"
              }
            }
          },
          "filter": {
            "range": {
              "reseller_sale": {
                "gte": 10000
              }
            }
          }
        }
      },
      "terms": {
        "field": "senderResellerId",
        "order": {
          "sales>reseller_sale": "desc"
        },
        "size": 5
      }
    }
  },
  "ext": {},
  "query": { "match_all": {} },
  "size": 0
}
Actual response:
{
  "took" : 21,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "failed" : 0
  },
  "hits" : {
    "total" : 150824,
    "max_score" : 0.0,
    "hits" : [ ]
  },
  "aggregations" : {
    "reseller_sale_sum" : {
      "doc_count_error_upper_bound" : -1,
      "sum_other_doc_count" : 149609,
      "buckets" : [
        {
          "key" : "RES0000000004",
          "doc_count" : 8,
          "sales" : {
            "doc_count" : 0,
            "reseller_sale" : {
              "value" : 0.0
            }
          }
        },
        {
          "key" : "RES0000000005",
          "doc_count" : 39,
          "sales" : {
            "doc_count" : 0,
            "reseller_sale" : {
              "value" : 0.0
            }
          }
        },
        {
          "key" : "RES0000000006",
          "doc_count" : 57,
          "sales" : {
            "doc_count" : 0,
            "reseller_sale" : {
              "value" : 0.0
            }
          }
        },
        {
          "key" : "RES0000000007",
          "doc_count" : 134,
          "sales" : {
            "doc_count" : 0,
            "reseller_sale" : {
              "value" : 0.0
            }
          }
        }
      ]
    }
  }
}
As can be seen from the response above, resellers are returned, but the reseller_sale aggregation comes back as zero in the results.
More details are here.
Nik*_*iev 15
You can use one of the pipeline aggregations, namely the bucket selector aggregation. The query would look like this:
POST my_index/tdrs/_search
{
  "aggregations": {
    "reseller_sale_sum": {
      "aggregations": {
        "sales": {
          "sum": {
            "field": "requestAmountValue"
          }
        },
        "max_sales": {
          "bucket_selector": {
            "buckets_path": {
              "var1": "sales"
            },
            "script": "params.var1 > 10000"
          }
        }
      },
      "terms": {
        "field": "senderResellerId",
        "order": {
          "sales": "desc"
        },
        "size": 5
      }
    }
  },
  "size": 0
}
After putting the following documents into the index:
"hits": [
{
"_index": "my_index",
"_type": "tdrs",
"_id": "AV9Yh5F-dSw48Z0DWDys",
"_score": 1,
"_source": {
"requestAmountValue": 7000,
"senderResellerId": "ID_1"
}
},
{
"_index": "my_index",
"_type": "tdrs",
"_id": "AV9Yh684dSw48Z0DWDyt",
"_score": 1,
"_source": {
"requestAmountValue": 5000,
"senderResellerId": "ID_1"
}
},
{
"_index": "my_index",
"_type": "tdrs",
"_id": "AV9Yh8TBdSw48Z0DWDyu",
"_score": 1,
"_source": {
"requestAmountValue": 1000,
"senderResellerId": "ID_2"
}
}
]
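(For reference, one way to index equivalent test documents is a bulk request along these lines; this is only a sketch, with auto-generated document IDs and the same index and type names as in the query above.)
POST my_index/tdrs/_bulk
{ "index": {} }
{ "senderResellerId": "ID_1", "requestAmountValue": 7000 }
{ "index": {} }
{ "senderResellerId": "ID_1", "requestAmountValue": 5000 }
{ "index": {} }
{ "senderResellerId": "ID_2", "requestAmountValue": 1000 }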
The result of the query is:
"aggregations": {
"reseller_sale_sum": {
"doc_count_error_upper_bound": 0,
"sum_other_doc_count": 0,
"buckets": [
{
"key": "ID_1",
"doc_count": 2,
"sales": {
"value": 12000
}
}
]
}
}
I.e. only those senderResellerId whose cumulative sales are > 10000 are returned.
To achieve the equivalent of SELECT COUNT(*) FROM (... HAVING), one can combine the bucket_script aggregation with the sum_bucket aggregation. Although there does not seem to be a direct way to count how many buckets the bucket_selector actually selected, we can define a bucket_script that produces 0 or 1 depending on the condition, and a sum_bucket that sums those values:
POST my_index/tdrs/_search
{
  "aggregations": {
    "reseller_sale_sum": {
      "aggregations": {
        "sales": {
          "sum": {
            "field": "requestAmountValue"
          }
        },
        "max_sales": {
          "bucket_script": {
            "buckets_path": {
              "var1": "sales"
            },
            "script": "if (params.var1 > 10000) { 1 } else { 0 }"
          }
        }
      },
      "terms": {
        "field": "senderResellerId",
        "order": {
          "sales": "desc"
        }
      }
    },
    "max_sales_stats": {
      "sum_bucket": {
        "buckets_path": "reseller_sale_sum>max_sales"
      }
    }
  },
  "size": 0
}
The output will be:
"aggregations": {
"reseller_sale_sum": {
"doc_count_error_upper_bound": 0,
"sum_other_doc_count": 0,
"buckets": [
...
]
},
"max_sales_stats": {
"value": 1
}
}
The desired number of buckets can be found in max_sales_stats.value.
I have to point out two things:
Pipeline aggregations work on the outputs produced from other aggregations rather than from document sets, adding information to the output tree.
This means that the bucket_selector aggregation is applied after, and on top of, the output of the terms aggregation on senderResellerId. For example, if there are more distinct senderResellerId values than the size defined on the terms aggregation, you will not get all IDs with sum(sales) > 10000, but only those that made it into the output of the terms aggregation. Consider using sorting and/or setting a large enough size parameter, as in the sketch below.
This also applies to the second case, the COUNT(*) (... HAVING) equivalent: it only counts the buckets that are actually present in the aggregation output.
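As a minimal sketch of that advice (assuming an upper bound of 10000 distinct resellers, a value not given in the original question), the first query can simply be run with a larger terms size so that the bucket_selector sees every senderResellerId rather than only the top 5:
POST my_index/tdrs/_search
{
  "size": 0,
  "aggregations": {
    "reseller_sale_sum": {
      "terms": {
        "field": "senderResellerId",
        "size": 10000,
        "order": { "sales": "desc" }
      },
      "aggregations": {
        "sales": {
          "sum": { "field": "requestAmountValue" }
        },
        "max_sales": {
          "bucket_selector": {
            "buckets_path": { "var1": "sales" },
            "script": "params.var1 > 10000"
          }
        }
      }
    }
  }
}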
If this query turns out to be too heavy, or the number of buckets is too large, consider denormalizing your data or storing this sum directly in the documents, so that you can achieve your goal with a plain range query.
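A sketch of that denormalized approach, assuming a hypothetical resellers index in which each document carries a pre-computed total_sales field per reseller (neither the index nor the field exists in the original mapping): the HAVING condition then becomes a plain range query, and the group count comes straight from the _count API:
GET resellers/_count
{
  "query": {
    "range": {
      "total_sales": { "gt": 10000 }
    }
  }
}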
Hope this helps!