
elasticsearch bulk insert exception while uploading

I am getting an exception when trying to bulk insert into Elasticsearch v6.6.0.

It was fine earlier.

Here's the exception message:

Elasticsearch bulk insert exception, TransportError(503, 'circuit_breaking_exception', '[parent] Data too large, data for [<http_request>] would be [746384154/711.8mb], which is larger than the limit of [745517875/710.9mb], usages [request=0/0b, fielddata=626151547/597.1mb, in_flight_requests=889937/869kb, accounting=119342670/113.8mb]')

How do I configure the data limit, since the exception says the data is larger than the limit?

The circuit breaker is a mechanism to prevent OutOfMemory errors. The parent breaker limit defaults to 70% of your heap. https://www.elastic.co/guide/en/elasticsearch/reference/current/circuit-breaker.html
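
If you want to see how close each node is to that limit, the parent breaker usage can be read from the nodes stats API. A minimal sketch with the Python elasticsearch client (the host URL is an assumption; adjust it for your cluster):

from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])  # assumed local node

# Read per-node circuit breaker usage (parent, fielddata, request, ...).
stats = es.nodes.stats(metric="breaker")
for node_id, node in stats["nodes"].items():
    parent = node["breakers"]["parent"]
    print(node_id, parent["estimated_size"], "used of", parent["limit_size"])

# The parent limit is the dynamic cluster setting indices.breaker.total.limit
# (70% of heap by default in 6.x). It can be raised, but as noted below that
# only postpones a real OutOfMemory error:
# es.cluster.put_settings(body={"transient": {"indices.breaker.total.limit": "75%"}})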

I don't recommend increasing the circuit breaker limit, because then you will get a real OOM exception. Judging from the usages in the message, fielddata is what is filling your heap. Some solutions:

  • Increase the heap size (set via -Xms/-Xmx in jvm.options).
  • Use doc values instead of fielddata (see the sketch after this list).
  • Clear the fielddata cache periodically, for example every hour (also shown below).

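A minimal sketch of the last two points with the Python elasticsearch client; the index name "my-index" and the field "status" are made up for illustration:

from elasticsearch import Elasticsearch

es = Elasticsearch(["http://localhost:9200"])  # assumed local node

# Doc values: map the field you aggregate/sort on as "keyword" (doc values
# are on by default) instead of enabling fielddata on a "text" field.
es.indices.create(
    index="my-index",
    body={
        "mappings": {
            "_doc": {
                "properties": {
                    "status": {"type": "keyword"}
                }
            }
        }
    },
)

# Clearing the fielddata cache frees the heap it holds; run this from a
# scheduled job (e.g. hourly) if you must keep fielddata enabled.
es.indices.clear_cache(fielddata=True)
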
In some versions of Elasticsearch (6.3, I think) there was a bug related to the circuit breaker, and updating resolved the problem.
