
How to send streamed JSON data as key-value pairs into a Kafka consumer

I wrote some Java code that reads JSON data from the local file system, and I want to send that data to a Kafka consumer as key-value pairs. Here is my producer:

public static void main(String[] args) throws IOException
{
        Stream<String> objec = Files.lines(Paths.get("path\\data.json"));

        String topicName = "test";

        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092,localhost:9093");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> sampleProducer = new KafkaProducer<String, String>(props);
        objec.forEach(f -> {
            // no key is set here, so each line goes out as a plain string value
            ProducerRecord<String, String> record = new ProducerRecord<String, String>(topicName, f);
            sampleProducer.send(record);
        });
        sampleProducer.close();
}

But when I run this program, the data is sent to the Kafka consumer as plain strings. How can I send the JSON data to the Kafka consumer as key-value pairs?

Here is the sample JSON file:

{  
   "wifi_result":"1",
   "mic_result":"1",
   "video_result":"1",
   "touch_result":"1",
   "proximity_result":"1",
   "vibrator_result":"1",
   "power_key":"2",
   "accelerometer":"0",
   "earphone":"1",
   "memory_result":"1",
   "memory_internalSD":"1",
   "memory_internalSDSize":"25.0GB",
   "memory_externalSD":"0",
   "memory_externalSDSize":"",
   "memory_internalflash":"1",
   "memory_internalflashSize":"2.0GB",
   "vol_key_down":"0",
   "menu_key":"1",
   "headset_result":"1",

}

Any help would be appreciated... Thanks in advance...

Read the JSON file as a JsonObject instead of a plain string, then send it to the Kafka topic. I am using the Gson library for parsing (just as sample code), but you can pick any JSON parsing library of your choice.

import com.google.gson.Gson;
import com.google.gson.JsonObject;
import com.google.gson.stream.JsonReader;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.io.FileReader;
import java.util.Properties;

public class Main {

    static Gson gson = new Gson();

    // parse the whole file into a JsonObject instead of streaming it line by line
    public static JsonObject readJSON(String filePath) throws Exception {
        JsonReader reader = new JsonReader(new FileReader(filePath));
        return gson.fromJson(reader, JsonObject.class);
    }

    public static void main(String[] args) throws Exception {

        String topicName = "test";

        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092,localhost:9093");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> sampleProducer = new KafkaProducer<String, String>(props);
        // the whole JSON object is sent as a single record value
        ProducerRecord<String, String> record = new ProducerRecord<String, String>(topicName, readJSON("data.json").toString());
        sampleProducer.send(record);
        sampleProducer.close();
    }
}

Also, if you only need to read the file and send it to the topic as-is, without processing anything, you can read the whole file into a String at once and send that instead of streaming it line by line. This preserves the JSON structure of the data:

    // reads the whole file into a single String
    public static String readFileAsString(File file) throws IOException {
        InputStream fileInputStream = new FileInputStream(file);
        byte[] buffer = new byte[fileInputStream.available()];
        int length = fileInputStream.read(buffer);
        fileInputStream.close();
        return new String(buffer, 0, length);
    }

    ProducerRecord<String, String> record = new ProducerRecord<String, String>(topicName, readFileAsString(new File("data.json")));
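
As a side note, the same whole-file read can be done with java.nio instead of the manual InputStream handling. This is just a minimal sketch, assuming the same data.json file as above:

import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Paths;

public class ReadWholeFile {

    // reads the entire file into one String, preserving the JSON structure
    public static String readFileAsString(String filePath) throws IOException {
        // on Java 11+ this can be shortened to Files.readString(Paths.get(filePath))
        return new String(Files.readAllBytes(Paths.get(filePath)), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws IOException {
        System.out.println(readFileAsString("data.json"));
    }
}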

Update:

To send the JSON file data to the Kafka topic as key-value pairs, you still have to parse the file into a JSON object and then stream over its attributes. Check the sample code below: I used Jackson to parse the JSON file into a Map object, then stream over its entries and send them to the topic one by one.

import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

// read the json file as a Map object
    private static Map<String, String> readJsonFileAsMap(File file) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        return mapper.readValue(file, new TypeReference<Map<String, String>>(){});
    }

// stream the data as key-value pairs (props is the same producer config as above)
        KafkaProducer<String, String> sampleProducer = new KafkaProducer<String, String>(props);
        readJsonFileAsMap(file).forEach((k, v) -> {
            // the attribute name becomes the record key, the attribute value becomes the record value
            ProducerRecord<String, String> record = new ProducerRecord<String, String>("test", k, v);
            sampleProducer.send(record);
        });
        sampleProducer.close();
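
Putting the two snippets above together, a complete runnable version could look like the following sketch. The file path data.json, the topic test, and the broker list are taken from the earlier examples; adjust them to your setup:

import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.io.File;
import java.util.Map;
import java.util.Properties;

public class JsonKeyValueProducer {

    // parse the whole JSON file into a Map so each attribute becomes one entry
    private static Map<String, String> readJsonFileAsMap(File file) throws Exception {
        ObjectMapper mapper = new ObjectMapper();
        return mapper.readValue(file, new TypeReference<Map<String, String>>(){});
    }

    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092,localhost:9093");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> sampleProducer = new KafkaProducer<String, String>(props);
        // each JSON attribute is sent as its own record: attribute name as key, attribute value as value
        readJsonFileAsMap(new File("data.json")).forEach((k, v) -> {
            ProducerRecord<String, String> record = new ProducerRecord<String, String>("test", k, v);
            sampleProducer.send(record);
        });
        sampleProducer.close();
    }
}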

If you use the console consumer to verify the data, make sure you set print.key=true; optionally you can also add a separator with key.separator=:

kafka-console-consumer --bootstrap-server localhost:9092 --topic test --from-beginning --property "print.key=true" --property "key.separator=:"
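
If you would rather verify the messages from Java code instead of the console consumer, a minimal KafkaConsumer sketch could look like this. It assumes a reasonably recent kafka-clients version (2.0+ for poll(Duration)), and the group id json-kv-test is just a placeholder for this example:

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class JsonKeyValueConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "json-kv-test");          // placeholder consumer group
        props.put("auto.offset.reset", "earliest");
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<String, String>(props)) {
            consumer.subscribe(Collections.singletonList("test"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // prints each JSON attribute as key:value, mirroring print.key/key.separator
                    System.out.println(record.key() + ":" + record.value());
                }
            }
        }
    }
}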
