
Unable to create Java input DStreams using KafkaUtils: Spark 1.6.0

I am using Spark 1.6.0 with the spark-streaming_2.11 artifact and the Kafka API to consume string messages.

As per the documentation, I am trying to create a direct stream using KafkaUtils, but I am getting the compiler error below:

The method createDirectStream(JavaStreamingContext, Class, Class, Class, Class, Map, Set) in the type KafkaUtils is not applicable for the arguments (JavaStreamingContext, Class, Class, Class, Class, Map, Set)

Here is the code snippet I have written:

SparkConf conf = new SparkConf().setAppName("Test Streaming App").setMaster("local[*]");
JavaSparkContext sc = new JavaSparkContext(conf);
JavaStreamingContext ssc = new JavaStreamingContext(sc, new Duration(2000));

Map<String, String> kafkaParams = new HashMap<String, String>();
kafkaParams.put("metadata.broker.list", "localhost:9092");
Set<String> topics = Collections.singleton("test");

JavaPairInputDStream<String, String> dstream = KafkaUtils.createDirectStream(
        ssc, String.class, String.class, StringDecoder.class, StringDecoder.class,
        kafkaParams, topics);

ssc.start();
ssc.awaitTermination();

Is this an issue with the artifact or the version of Spark I am using? Please shed some light on this.

The issue is with the StringDecoder class. The class does not seem to be accessible.

As per the below post

Can't access kafka.serializer.StringDecoder

I followed the procedure described in the accepted answer there and resolved the issue.
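For reference, the fix boils down to making sure the Kafka integration artifact is on the classpath and that every Spark artifact uses the same Scala suffix, since kafka.serializer.StringDecoder comes from the Scala-version-specific Kafka jar that the integration artifact pulls in. A minimal sketch of the Maven dependencies, assuming a Maven build with Scala 2.11 (versions shown are illustrative):

<!-- All Spark artifacts must share the same Scala suffix (_2.11 here). -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>1.6.0</version>
</dependency>
<!-- Provides KafkaUtils and, transitively, the Kafka client jar
     that contains kafka.serializer.StringDecoder. -->
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.11</artifactId>
    <version>1.6.0</version>
</dependency>

Mixing suffixes (for example spark-streaming_2.11 with spark-streaming-kafka_2.10) compiles against two incompatible StringDecoder classes, which is why the error reports identical-looking argument lists as "not applicable".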
