1. Query all names for 20151216-17
{
  "from": 0,
  "size": 200,
  "query": {
    "bool": {
      "must": {
        "range": {
          "DATE": {
            "from": 20151216,
            "to": 20151217,
            "include_lower": true,
            "include_upper": true
          }
        }
      }
    }
  },
  "_source": {
    "includes": [
      "NAME"
    ],
    "excludes": []
  }
}
2. Query all names for 20151217-18
{
  "from": 0,
  "size": 200,
  "query": {
    "bool": {
      "must": {
        "range": {
          "DATE": {
            "from": 20151217,
            "to": 20151218,
            "include_lower": true,
            "include_upper": true
          }
        }
      }
    }
  },
  "_source": {
    "includes": [
      "NAME"
    ],
    "excludes": []
  }
}
In MySQL, the intersection of the names from these two days could be computed with a statement like this:
SELECT NAME FROM Table1 WHERE DATE BETWEEN 20151216 AND 20151217
INTERSECT
SELECT NAME FROM Table1 WHERE DATE BETWEEN 20151217 AND 20151218
How can the same thing be done in Elasticsearch?
ringa_lee 2017-04-17 16:00:47
SELECT NAME FROM Table1 WHERE DATE BETWEEN 20151216 AND 20151217
INTERSECT
SELECT NAME FROM Table1 WHERE DATE BETWEEN 20151217 AND 20151218
Let me rewrite the OP's SQL a little. I'm not sure whether this still meets your requirement, but if it does, I think the Elasticsearch queries below should work as well:
SELECT NAME FROM Table1
WHERE (DATE BETWEEN 20151217 AND 20151218) OR (DATE BETWEEN 20151216 AND 20151217)
GROUP BY NAME
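Applied to the fields from the question (assuming DATE is a numeric field and NAME is a not-analyzed string, as the queries above suggest), the request would look roughly like this; treat it as an untested sketch:
{
  "size": 0,
  "query": {
    "bool": {
      "should": [
        { "range": { "DATE": { "gte": 20151216, "lte": 20151217 } } },
        { "range": { "DATE": { "gte": 20151217, "lte": 20151218 } } }
      ]
    }
  },
  "aggs": {
    "names": {
      "terms": {
        "field": "NAME",
        "size": 200
      }
    }
  }
}
The example I actually ran against my own index follows; it uses createdTime and price instead.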
REQUEST:
{
  "query": {
    "bool": {
      "should": [
        {
          "range": {
            "createdTime": {
              "from": 1477984000,
              "to": 1477984695
            }
          }
        },
        {
          "range": {
            "createdTime": {
              "from": 1477984000,
              "to": 1477984835
            }
          }
        }
      ]
    }
  },
  "size": 0,
  "aggs": {
    "my_price": {
      "terms": {
        "field": "price"
      }
    }
  }
}
RESPONSE:
{
  "took": 3,
  "timed_out": false,
  "_shards": {
    "total": 5,
    "successful": 5,
    "failed": 0
  },
  "hits": {
    "total": 542,
    "max_score": 0,
    "hits": []
  },
  "aggregations": {
    "my_price": {
      "doc_count_error_upper_bound": 8,
      "sum_other_doc_count": 377,
      "buckets": [
        { "key": 148, "doc_count": 24 },
        { "key": 98, "doc_count": 23 },
        { "key": 128, "doc_count": 20 },
        { "key": 160, "doc_count": 20 },
        { "key": 108, "doc_count": 18 },
        { "key": 105, "doc_count": 14 },
        { "key": 100, "doc_count": 13 },
        { "key": 118, "doc_count": 12 },
        { "key": 88, "doc_count": 11 },
        { "key": 81, "doc_count": 10 }
      ]
    }
  }
}
Here I queried by the data's creation time and aggregated on its price; price happens to be the only field with duplicate values in my data, so I used it for this test.
The aggs clause groups the data, and "size": 0 suppresses the hits data in the response, so the aggregations.my_price.buckets property of the response is exactly the result set you need.
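One caveat: the OR/should version returns the union of the two date ranges, not the intersection that INTERSECT would compute. If you specifically need the names that occur in both ranges, one possible approach is a terms aggregation on NAME with a filter sub-aggregation per range, plus a bucket_selector pipeline aggregation that keeps only the buckets with hits in both. This is a sketch I have not run (it assumes Elasticsearch 5.x with Painless scripting, and day1/day2/in_both are names I made up):
{
  "size": 0,
  "query": {
    "bool": {
      "should": [
        { "range": { "DATE": { "gte": 20151216, "lte": 20151217 } } },
        { "range": { "DATE": { "gte": 20151217, "lte": 20151218 } } }
      ]
    }
  },
  "aggs": {
    "names": {
      "terms": { "field": "NAME", "size": 200 },
      "aggs": {
        "day1": { "filter": { "range": { "DATE": { "gte": 20151216, "lte": 20151217 } } } },
        "day2": { "filter": { "range": { "DATE": { "gte": 20151217, "lte": 20151218 } } } },
        "in_both": {
          "bucket_selector": {
            "buckets_path": { "d1": "day1._count", "d2": "day2._count" },
            "script": "params.d1 > 0 && params.d2 > 0"
          }
        }
      }
    }
  }
}
Each surviving bucket key under aggregations.names.buckets is then a NAME that appears in both date ranges.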