
MongoDB search big data collection

I am trying to search through a big collection of objects (1,000,000,000 elements). A sample element looks like this:

Document{{_id=588e6f317367651f34a06c2c, busId=34, time=1262305558050, createdDate=Sun Jan 29 23:39:42 CET 2017}}

There are busIds from 0 to 300, and the time field increments by about 30 milliseconds per record, starting from:

    SimpleDateFormat sdf = new SimpleDateFormat("yyyy.MM.dd HH:mm:ss");
    long startDate = sdf.parse("2010.01.01 00:00:00").getTime();

Now I am looking for all data with this query:

    // match a single bus and a time window: busId == vehicleId AND startDate < time < endDate
    BasicDBObject gtQuery = new BasicDBObject();
    List<BasicDBObject> obj = new ArrayList<BasicDBObject>();
    obj.add(new BasicDBObject("busId", vehicleId));
    obj.add(new BasicDBObject("time", new BasicDBObject("$gt", startDate.getTime()).append("$lt", endDate.getTime())));
    gtQuery.put("$and", obj);
    System.out.println(gtQuery.toString());
    FindIterable<Document> curs = collection.find(gtQuery);

gtQuery output:

{ "$and" : [ { "busId" : "34"} , { "time" : { "$gt" : 1262304705000 , "$lt" : 1262308305000}}]}

The query works, but this way it iterates over all 1,000,000,000 elements in the collection. Is there any way to make it faster?
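As a side note, the full collection scan can be confirmed by asking the server for its query plan. Newer Java drivers (4.2+) expose `explain()` on `FindIterable`; this is a sketch assuming a collection handle like the one above (a live MongoDB connection is required):

```java
import com.mongodb.client.FindIterable;
import org.bson.Document;

public class QueryPlan {
    // Print the server's execution plan for a query (driver 4.2+).
    // Without a suitable index, queryPlanner.winningPlan shows a
    // "COLLSCAN" stage (full collection scan); with an index on the
    // queried fields it becomes an "IXSCAN".
    static void printPlan(FindIterable<Document> curs) {
        Document plan = curs.explain();
        System.out.println(plan.toJson());
    }
}
```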

Try creating a compound index on busId and time, as suggested by @ares.
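A minimal sketch of creating that compound index with the Java driver (the class and method names here are illustrative; a live MongoDB connection is required, and building an index over a collection this size can take a long time):

```java
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.Indexes;
import org.bson.Document;

public class CreateBusTimeIndex {
    // Equivalent to db.collection.createIndex({ busId: 1, time: 1 })
    // in the mongo shell. With this index in place, the $and query
    // above (equality on busId plus a range on time) can be answered
    // with an index scan instead of a full collection scan.
    static void ensureIndex(MongoCollection<Document> collection) {
        collection.createIndex(Indexes.ascending("busId", "time"));
    }
}
```

The field order matters: the equality field (busId) should come before the range field (time), so that documents matching a given busId are stored contiguously in the index and the time range becomes a single contiguous index segment.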
