
Kafka Java producer uses too much memory

I am trying to populate kafka topic with some test data like this:

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public static void main(String[] args){
    Properties properties = new Properties();
    properties.put("bootstrap.servers", "192.168.0.2:9092");
    properties.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    properties.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

    KafkaProducer<String, String> kafkaProducer = new KafkaProducer<String, String>(properties);
    try{
        for(int i = 0; i < 100; i++){
            System.out.println(i);
            kafkaProducer.send(new ProducerRecord<String, String>("my_topic", "{\"key\":\"my-json-1500-symblos-long\"}"));
        }
    }catch (Exception e){
        e.printStackTrace();
    }finally {
        kafkaProducer.close();
    }
}
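For reference, the Java producer's client-side buffering is bounded by its buffer.memory setting (the kafka-clients default is 33554432 bytes, i.e. 32 MB), so the client alone should not account for gigabytes. A minimal sketch of explicitly capping the accumulator, assuming the standard kafka-clients configuration keys (the 1 MB cap is an illustrative value, not a recommendation):

```java
import java.util.Properties;

public class ProducerConfigSketch {
    public static void main(String[] args) {
        Properties properties = new Properties();
        properties.put("bootstrap.servers", "192.168.0.2:9092");
        // Cap the record accumulator; the kafka-clients default is 32 MB.
        properties.put("buffer.memory", "1048576");   // 1 MB
        // batch.size limits per-partition batch buffering; 16384 is the default.
        properties.put("batch.size", "16384");
        System.out.println(properties.getProperty("buffer.memory"));
    }
}
```

These Properties would be passed to the KafkaProducer constructor exactly as in the snippet above.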

The process for this simple "program" consumes about 3-6 GB of memory: [enter image description here]

It also runs extremely slowly: storing a single message takes about 2-3 minutes.

What is wrong with Kafka? Why does sending a single 1.5 KB message take so much memory?

*Note: Kafka runs in a Docker container; the Java producer runs on the host machine.
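For scale: 100 records of roughly 1.5 KB each is far below the producer's default 32 MB accumulator, so the client's own buffering cannot explain multi-gigabyte usage; on a 64-bit JVM the default max heap is typically a quarter of physical RAM, which matches the observed numbers better. A back-of-the-envelope check, using the sizes stated in the question:

```java
public class MemoryMath {
    public static void main(String[] args) {
        long recordBytes = 1_500L;              // ~1.5 KB JSON payload (from the question)
        long records = 100L;
        long totalPayload = records * recordBytes;
        long defaultBufferMemory = 33_554_432L; // kafka-clients default buffer.memory (32 MB)

        System.out.println(totalPayload);                       // 150000
        System.out.println(totalPayload < defaultBufferMemory); // true
    }
}
```

In other words, the entire test workload fits in under 0.5% of the default client buffer.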

UPDATED

If -Xmx1g is added to the JVM options, I get:

java.lang.OutOfMemoryError: Java heap space
    at java.nio.HeapByteBuffer.<init>(HeapByteBuffer.java:57)
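The HeapByteBuffer allocation failing under -Xmx1g at least confirms the limit took effect. As a sanity check for which heap ceiling a given run actually has (useful when flags are set through wrappers or IDEs), the standard library reports it directly:

```java
public class HeapCheck {
    public static void main(String[] args) {
        long maxHeapMb = Runtime.getRuntime().maxMemory() / (1024 * 1024);
        // With -Xmx1g this prints roughly 1024; the exact value varies by GC.
        System.out.println(maxHeapMb + " MB max heap");
    }
}
```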

I was unable to determine the actual cause, but restarting Windows resolved the issue.
