
Consuming spark RDD using Kafka

Are there any samples in Java for consuming a Spark RDD using Kafka? If there is one, please let me know.

I am trying out a few things, but I am unsure how to send an RDD through a Kafka producer and consume it in a Kafka consumer.

Thanks.

For writing your RDD to Kafka you can use Cloudera's API. Here is the link. For consuming, Spark's Kafka integration provides `KafkaUtils.createDirectStream`, which reads messages from Kafka topics as a stream of RDDs.

Code will be like this:

import org.apache.spark.streaming.kafka.*;

 JavaPairInputDStream<String, String> directKafkaStream =
     KafkaUtils.createDirectStream(streamingContext,
         [key class], [value class], [key decoder class], [value decoder class],
         [map of Kafka parameters], [set of topics to consume]);
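The bracketed placeholders above come from the Spark documentation. As a minimal sketch, the `[map of Kafka parameters]` and `[set of topics to consume]` arguments are plain Java collections; the broker address and topic name below are assumptions for illustration (`metadata.broker.list` is the broker-list key used by the direct-stream Kafka API):

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

public class KafkaStreamConfig {

    // Fills in the [map of Kafka parameters] placeholder.
    // The broker address is a hypothetical example.
    public static Map<String, String> kafkaParams(String brokers) {
        Map<String, String> params = new HashMap<>();
        params.put("metadata.broker.list", brokers); // comma-separated host:port pairs
        return params;
    }

    // Fills in the [set of topics to consume] placeholder.
    public static Set<String> topics(String... names) {
        Set<String> topicSet = new HashSet<>();
        for (String name : names) {
            topicSet.add(name);
        }
        return topicSet;
    }

    public static void main(String[] args) {
        Map<String, String> params = kafkaParams("localhost:9092");
        Set<String> topicSet = topics("my-topic");
        System.out.println(params.get("metadata.broker.list")); // prints localhost:9092
        System.out.println(topicSet.contains("my-topic"));      // prints true
    }
}
```

These two values would then be passed as the last two arguments of `createDirectStream`, with `String.class` and `kafka.serializer.StringDecoder.class` as the key/value class and decoder placeholders.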

Have a look at the kafka-integration guide and the code example.
