I'm new to the Elasticsearch Java API. I know that there are two ways to perform bulk operations:
1. construct a BulkRequest and execute it with the client object;
2. construct a BulkProcessor and add requests to it.
I generated a large batch of mock data (about 1M documents) and indexed it into Elasticsearch (5.6.3) with the Java high-level REST client.
However, if I index the whole batch with a single BulkRequest, a
java.lang.OutOfMemoryError: null
is thrown when I call the client.bulk() method.
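A common way to avoid the OutOfMemoryError with one huge BulkRequest is to split the request list into smaller chunks and call client.bulk() once per chunk. Below is a minimal, self-contained sketch of just the chunking logic (the Elasticsearch calls are only indicated in comments; the class name, chunk size, and use of plain integers as stand-ins for IndexRequests are illustrative assumptions, not from the original post):

```java
import java.util.ArrayList;
import java.util.List;

public class BulkChunker {
    // Split a large list of requests into fixed-size chunks so that each
    // client.bulk() call carries a bounded amount of data.
    static <T> List<List<T>> chunk(List<T> requests, int chunkSize) {
        List<List<T>> chunks = new ArrayList<>();
        for (int i = 0; i < requests.size(); i += chunkSize) {
            chunks.add(requests.subList(i, Math.min(i + chunkSize, requests.size())));
        }
        return chunks;
    }

    public static void main(String[] args) {
        List<Integer> mockRequests = new ArrayList<>();
        for (int i = 0; i < 10; i++) mockRequests.add(i);

        List<List<Integer>> chunks = chunk(mockRequests, 4);
        System.out.println(chunks.size());   // 3 chunks: [0..3], [4..7], [8..9]
        System.out.println(chunks.get(2));   // [8, 9]

        // For each chunk you would build a BulkRequest, add the chunk's
        // IndexRequests to it, and call client.bulk(bulkRequest), which
        // returns a BulkResponse for that chunk synchronously.
    }
}
```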
Then I tried to use a BulkProcessor instead, and it works. Here is the code:
RestHighLevelClient client = initESclient();
BulkProcessor bulkProcessor = initES(client);
logger.info("Use bulk request to load data:");
logger.info("start to generate random data...");
String bossMockIndex = customSetting.getMockIndex();
String soapMockType = customSetting.getSoapType();
Long start = System.currentTimeMillis();
Integer batch = customSetting.getMockBatch();
logger.info("Batch:" + batch);
List<IndexRequest> indexRequestList = bossMockDataService.indexRequestGenerator(batch, bossMockIndex, soapMockType);
Long endCreateData = System.currentTimeMillis();
logger.info("Consumption for creating " + batch + " pieces of mock data:" + (endCreateData - start) / 1000.0d + "s");
for (IndexRequest indexRequest : indexRequestList) {
    bulkProcessor.add(indexRequest);
}
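For context, a BulkProcessor is normally built with a BulkProcessor.Listener, and the BulkResponse for each flushed batch is delivered to the listener's afterBulk callback rather than returned from add(). Below is a hedged sketch of what an initES(client) for the 5.6 high-level REST client might look like; initES itself and the flush thresholds are assumptions for illustration, not taken from the original post:

```java
import org.elasticsearch.action.bulk.BulkProcessor;
import org.elasticsearch.action.bulk.BulkRequest;
import org.elasticsearch.action.bulk.BulkResponse;
import org.elasticsearch.client.RestHighLevelClient;
import org.elasticsearch.common.unit.ByteSizeUnit;
import org.elasticsearch.common.unit.ByteSizeValue;

static BulkProcessor initES(RestHighLevelClient client) {
    BulkProcessor.Listener listener = new BulkProcessor.Listener() {
        @Override
        public void beforeBulk(long executionId, BulkRequest request) {
            // called just before each batch is executed
        }

        @Override
        public void afterBulk(long executionId, BulkRequest request, BulkResponse response) {
            // the BulkResponse for this batch arrives here; inspect
            // response.hasFailures() or iterate its items to decide
            // which documents to update afterwards
        }

        @Override
        public void afterBulk(long executionId, BulkRequest request, Throwable failure) {
            // called when the whole batch failed (e.g. a connection error)
        }
    };

    return BulkProcessor.builder(client::bulkAsync, listener)
            .setBulkActions(10_000)                              // flush every 10k requests
            .setBulkSize(new ByteSizeValue(5, ByteSizeUnit.MB))  // or every 5 MB
            .build();
}
```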
What I wonder is: how can I get the BulkResponse when using a BulkProcessor, just like when I use a BulkRequest? I need to update some of the documents in the batch based on the response. With a BulkRequest I can write:
BulkResponse bulkResponse = bulkProcessor.execute().actionGet();