How to send streamed JSON data as key-value pairs to a Kafka consumer
I wrote some Java code that reads JSON data from the local file system, and I want to send that data as key-value pairs:
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Properties;
import java.util.stream.Stream;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public static void main(String[] args) throws IOException
{
    Stream<String> lines = Files.lines(Paths.get("path\\data.json"));
    String topicName = "test";
    Properties props = new Properties();
    props.put("bootstrap.servers", "localhost:9092,localhost:9093");
    props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
    KafkaProducer<String, String> sampleProducer = new KafkaProducer<>(props);
    lines.forEach(f -> {
        ProducerRecord<String, String> record = new ProducerRecord<>(topicName, f);
        sampleProducer.send(record);
    });
    sampleProducer.close();
}
But when I run this program, the data is sent to the Kafka consumer as plain strings. How can I send the JSON data as key-value pairs to the Kafka consumer?
Here is the sample JSON file:
{
"wifi_result":"1",
"mic_result":"1",
"video_result":"1",
"touch_result":"1",
"proximity_result":"1",
"vibrator_result":"1",
"power_key":"2",
"accelerometer":"0",
"earphone":"1",
"memory_result":"1",
"memory_internalSD":"1",
"memory_internalSDSize":"25.0GB",
"memory_externalSD":"0",
"memory_externalSDSize":"",
"memory_internalflash":"1",
"memory_internalflashSize":"2.0GB",
"vol_key_down":"0",
"menu_key":"1",
"headset_result":"1"
}
Any help would be appreciated. Thanks in advance.
Read the JSON file as a JsonObject instead of a string, and then send it to the Kafka topic. I am using the gson library for parsing in the sample code, but you can choose any JSON parsing library.
import com.google.gson.Gson;
import com.google.gson.JsonObject;
import com.google.gson.stream.JsonReader;
import java.io.FileReader;
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class Main {
    static Gson gson = new Gson();

    public static JsonObject readJSON(String filePath) throws Exception {
        JsonReader reader = new JsonReader(new FileReader(filePath));
        return gson.fromJson(reader, JsonObject.class);
    }

    public static void main(String[] args) throws Exception {
        String topicName = "test";
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092,localhost:9093");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        KafkaProducer<String, String> sampleProducer = new KafkaProducer<>(props);
        // Send the whole JSON object as the record value
        ProducerRecord<String, String> record =
                new ProducerRecord<>(topicName, readJSON("data.json").toString());
        sampleProducer.send(record);
        sampleProducer.close();
    }
}
Alternatively, if you only need to read the file and send it to the topic as-is, without processing any content, you can read the whole file into a String in one go rather than streaming it line by line. This preserves the JSON structure of the data:
public static String readFileAsString(File file) throws IOException {
    InputStream fileInputStream = new FileInputStream(file);
    byte[] buffer = new byte[fileInputStream.available()];
    int length = fileInputStream.read(buffer);
    fileInputStream.close();
    return new String(buffer, 0, length);
}
ProducerRecord<String, String> record = new ProducerRecord<>(topicName, readFileAsString(new File("data.json")));
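As a side note, `InputStream.available()` is not guaranteed to return the full file size, so on Java 7+ a `java.nio` one-shot read is a more robust way to get the same String. A minimal standalone sketch (the class name and the temporary demo file are illustrative, not from the original answer):

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

public class ReadWholeFile {
    // Read the entire file into a String in one call, without relying on available()
    static String readFileAsString(Path path) throws IOException {
        return new String(Files.readAllBytes(path), StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws IOException {
        // Demo with a temporary file standing in for data.json
        Path tmp = Files.createTempFile("demo", ".json");
        Files.write(tmp, "{\"wifi_result\":\"1\"}".getBytes(StandardCharsets.UTF_8));
        System.out.println(readFileAsString(tmp));
        Files.delete(tmp);
    }
}
```

The resulting String can be passed to the ProducerRecord constructor exactly as above.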
UPDATES:
To send the JSON file data as key-value pairs to the Kafka topic, you still have to parse the file as a JSON object and then stream through its properties. In the sample code below, I parse the JSON file into a Map using Jackson, then stream through its entries to send them to the topic one by one.
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import java.io.File;
import java.util.Map;

// Read the JSON file as a Map
private static Map<String, String> readJsonFileAsMap(File file) throws Exception {
    ObjectMapper mapper = new ObjectMapper();
    return mapper.readValue(file, new TypeReference<Map<String, String>>() {});
}

// Stream the data as key-value pairs
KafkaProducer<String, String> sampleProducer = new KafkaProducer<>(props);
readJsonFileAsMap(file).forEach((k, v) -> {
    ProducerRecord<String, String> record = new ProducerRecord<>("test", k, v);
    sampleProducer.send(record);
});
sampleProducer.close();
If you are using the console consumer to verify the data, make sure print.key=true is set; optionally you can also add a separator with key.separator=:
kafka-console-consumer --bootstrap-server localhost:9092 --topic test --from-beginning --property "print.key=true" --property "key.separator=:"