
How to send streamed JSON data as key-value pairs to a Kafka consumer

I wrote Java code that reads JSON data from the local file system, and I want to send that data as key-value pairs:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Properties;
import java.util.stream.Stream;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public static void main(String[] args) throws IOException {
    Stream<String> objec = Files.lines(Paths.get("path\\data.json"));

    String topicName = "test";

    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092,localhost:9093");
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

    KafkaProducer<String, String> sampleProducer = new KafkaProducer<>(props);
    objec.forEach(f -> {
        ProducerRecord<String, String> record = new ProducerRecord<>(topicName, f);
        sampleProducer.send(record);
    });
    sampleProducer.close();
}

But when I run this program, the data reaches the Kafka consumer as a plain string. How can I send the JSON data as key-value pairs to the Kafka consumer?

Here is the sample JSON file:

{  
   "wifi_result":"1",
   "mic_result":"1",
   "video_result":"1",
   "touch_result":"1",
   "proximity_result":"1",
   "vibrator_result":"1",
   "power_key":"2",
   "accelerometer":"0",
   "earphone":"1",
   "memory_result":"1",
   "memory_internalSD":"1",
   "memory_internalSDSize":"25.0GB",
   "memory_externalSD":"0",
   "memory_externalSDSize":"",
   "memory_internalflash":"1",
   "memory_internalflashSize":"2.0GB",
   "vol_key_down":"0",
   "menu_key":"1",
   "headset_result":"1"
}

Any help will be appreciated... Thanks in advance...

Read the JSON file as a JsonObject instead of a string, and then send it to the Kafka topic. I am using the gson library for parsing (as sample code), but you can choose any JSON parsing library of your choice.

import com.google.gson.Gson;
import com.google.gson.JsonObject;
import com.google.gson.stream.JsonReader;

import java.io.FileReader;
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class Main {

    static Gson gson = new Gson();

    public static JsonObject readJSON(String filePath) throws Exception {
        JsonReader reader = new JsonReader(new FileReader(filePath));
        return gson.fromJson(reader, JsonObject.class);
    }

    public static void main(String[] args) throws Exception {

        String topicName = "test";

        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092,localhost:9093");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> sampleProducer = new KafkaProducer<>(props);
        ProducerRecord<String, String> record =
            new ProducerRecord<>(topicName, readJSON("data.json").toString());
        sampleProducer.send(record);
        sampleProducer.close();
    }
}

Also, if you just have to read the file and send it to the topic as-is, without processing any content, you can read the whole file into a String in one go and send that, rather than streaming line by line. This preserves the JSON structure of the data:

    public static String readFileAsString(File file) throws IOException {
        // Files.readAllBytes reads the entire file reliably;
        // InputStream.available() is not guaranteed to return the full length.
        return new String(Files.readAllBytes(file.toPath()), StandardCharsets.UTF_8);
    }

    ProducerRecord<String, String> record =
        new ProducerRecord<>(topicName, readFileAsString(new File("data.json")));

UPDATE:

To pass the JSON file data as key-value pairs to the Kafka topic, you still have to parse the file as a JSON object and then stream through its properties. Please check the sample code below: I parse the JSON file into a Map using Jackson, and then stream through its entries, sending them to the topic one by one.

import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

import java.io.File;
import java.util.Map;

// read the json file as a Map
private static Map<String, String> readJsonFileAsMap(File file) throws Exception {
    ObjectMapper mapper = new ObjectMapper();
    return mapper.readValue(file, new TypeReference<Map<String, String>>() {});
}

// stream the map entries as key-value records
KafkaProducer<String, String> sampleProducer = new KafkaProducer<>(props);
readJsonFileAsMap(new File("data.json")).forEach((k, v) -> {
    ProducerRecord<String, String> record = new ProducerRecord<>("test", k, v);
    sampleProducer.send(record);
});
sampleProducer.close();
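To illustrate the same key-splitting idea without any library, here is a minimal dependency-free sketch that extracts the key-value pairs from a flat JSON object like the sample above with a regular expression. The class name `FlatJsonSplitter` is made up for this example, and the regex assumes a flat object whose values are all quoted strings (no nesting, no escaped quotes), so for real data prefer a proper parser such as Jackson or gson:

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class FlatJsonSplitter {
    // Matches "key":"value" pairs in a flat JSON object with string values only.
    private static final Pattern PAIR =
        Pattern.compile("\"([^\"]+)\"\\s*:\\s*\"([^\"]*)\"");

    public static Map<String, String> toMap(String flatJson) {
        Map<String, String> map = new LinkedHashMap<>(); // keep original key order
        Matcher m = PAIR.matcher(flatJson);
        while (m.find()) {
            map.put(m.group(1), m.group(2));
        }
        return map;
    }

    public static void main(String[] args) {
        String json = "{ \"wifi_result\":\"1\", \"memory_internalSDSize\":\"25.0GB\" }";
        // In the producer code above, each entry would become one
        // new ProducerRecord<String, String>(topic, key, value).
        toMap(json).forEach((k, v) -> System.out.println(k + ":" + v));
    }
}
```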

If you are using the console consumer to verify the data, make sure print.key=true is set; optionally you can add a separator too with key.separator=:

kafka-console-consumer --bootstrap-server localhost:9092 --topic test --from-beginning --property "print.key=true" --property "key.separator=:"
