How to use Redis in Spark Streaming

I am building an application that reads JSON elements from a list in Redis and streams them using Spark. Here is what I have written:

public void readTheStream() throws UnknownHostException, IOException {
    SparkConf sparkConf = new SparkConf().setMaster("local[*]").setAppName("Merge")
            .set("redis.host", "localhost")
            .set("redis.port", "6379");

    JavaSparkContext ctx = JavaSparkContext.fromSparkContext(SparkContext.getOrCreate(sparkConf));
    JavaStreamingContext context = new JavaStreamingContext(ctx, Durations.seconds(1));
}

How can I access Redis through the jssc object? Thanks in advance.

Here is an example, using the spark-redis connector classes, that reads from myList and prints the list items to the console:

SparkConf sparkConf = new SparkConf().setAppName("MyApp").setMaster("local[*]")
        .set("redis.host", "localhost")
        .set("redis.port", "6379");

// 1-second batch interval.
JavaStreamingContext jssc = new JavaStreamingContext(sparkConf, Durations.milliseconds(1000));

// Connection details are taken from the redis.host / redis.port settings above.
RedisConfig redisConfig = new RedisConfig(new RedisEndpoint(sparkConf));

// Wrap the underlying Scala StreamingContext so Redis streams can be created from it.
RedisStreamingContext redisStreamingContext = new RedisStreamingContext(jssc.ssc());

// Each stream element is a (key, value) pair popped from one of the given lists.
String[] keys = new String[]{"myList"};
RedisInputDStream<Tuple2<String, String>> redisStream =
        redisStreamingContext.createRedisStream(keys, StorageLevel.MEMORY_ONLY(), redisConfig);

redisStream.print();

jssc.start();
jssc.awaitTermination();
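
Since the question says the list holds JSON elements, here is a minimal follow-up sketch (not part of the answer above) showing one way to wrap the Redis stream as a JavaDStream and parse each popped value with Jackson. It assumes jackson-databind is on the classpath; the names tag, tuples and parsed are illustrative only, and the transformations must be registered before jssc.start():

import org.apache.spark.streaming.api.java.JavaDStream;
import scala.Tuple2;
import scala.reflect.ClassTag;
import scala.reflect.ClassTag$;
import com.fasterxml.jackson.databind.ObjectMapper;

// Build a ClassTag so the Scala-based RedisInputDStream can be wrapped in a JavaDStream.
@SuppressWarnings("unchecked")
ClassTag<Tuple2<String, String>> tag =
        (ClassTag<Tuple2<String, String>>) (ClassTag<?>) ClassTag$.MODULE$.apply(Tuple2.class);

JavaDStream<Tuple2<String, String>> tuples = new JavaDStream<>(redisStream, tag);

// Each tuple is (list key, popped element); parse the element as JSON and
// re-serialize it to a normalized string so the stream elements stay Serializable.
// A production job would reuse an ObjectMapper rather than creating one per record.
JavaDStream<String> parsed = tuples.map(t -> new ObjectMapper().readTree(t._2()).toString());

parsed.print();

Note that if an element is not valid JSON, readTree will throw and fail the task, so in practice you may want to wrap the parse in a try/catch and filter out bad records.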
