
Is it possible to convert an Apache Ignite RDD to a Spark RDD in Scala?

I am new to both Apache Ignite and Spark. Can anyone help with an example of converting an Ignite RDD to a Spark RDD in Scala?

Update: Use case: I will receive DataFrames of HBase tables. For each table I will execute some logic to build a report and save the result into an Ignite RDD, and that same Ignite RDD will be updated for each table. Once all the tables have been processed, the final Ignite RDD will be converted to a Spark (or Java) RDD and a last rule will be executed on it. To run that rule I need the RDD converted into a DataFrame, and that DataFrame will be saved as the final report in Hive.
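Roughly, the flow I have so far looks like the sketch below; the cache name, key/value types, table names and the per-table logic are simplified placeholders, not my real code:

import org.apache.ignite.spark.IgniteContext
import org.apache.ignite.configuration.IgniteConfiguration
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("hbase-report").enableHiveSupport().getOrCreate()
val igniteContext = new IgniteContext(spark.sparkContext, () => new IgniteConfiguration())

// shared Ignite RDD that each HBase table's report rows get written into
val reportCache = igniteContext.fromCache[String, Double]("reportCache")

// for each HBase table: build report rows from its DataFrame and push them into Ignite
// reportCache.savePairs(reportRowsRdd)

// after the last table: this is the part I am asking about - turn reportCache into a
// Spark RDD / DataFrame, run the final rule on it, and save the result to Hive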

What do you mean by converting? An IgniteRDD is a Spark RDD; technically it's a subtype of Spark's RDD class.

Spark internally has many types of RDD: MapPartitionsRDD, HadoopRDD, LogicalRDD, and so on. IgniteRDD is just one possible RDD type, and after some transformations it will itself be wrapped in another RDD type, e.g. MapPartitionsRDD.
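For example, any function that expects a plain Spark RDD will accept an IgniteRDD directly. A minimal sketch, assuming an IgniteContext named igniteContext and a String/String cache called "partitioned" as in the documentation example below:

import org.apache.spark.rdd.RDD
import org.apache.ignite.spark.IgniteRDD

// works with any RDD of string pairs, including an IgniteRDD
def countMatches(rdd: RDD[(String, String)]): Long =
  rdd.filter(_._2.contains("Ignite")).count()

val igniteRdd: IgniteRDD[String, String] = igniteContext.fromCache[String, String]("partitioned")
countMatches(igniteRdd) // no conversion needed: IgniteRDD[K, V] extends RDD[(K, V)]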

You can also write your own RDD :)

Example from the documentation:

// "partitioned" is the name of an existing Ignite cache with String keys and values
val cache = igniteContext.fromCache[String, String]("partitioned")
// standard Spark transformations and actions work directly on the IgniteRDD
val result = cache.filter(_._2.contains("Ignite")).collect()

After filtering the cache RDD, the type will be different: the IgniteRDD gets wrapped in another RDD (a MapPartitionsRDD in current Spark versions). However, it is still an implementation of the Spark RDD API.
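A quick way to see the wrapping (just a sketch, reusing the cache RDD from the example above):

val filtered = cache.filter(_._2.contains("Ignite"))
println(cache.getClass.getSimpleName)    // IgniteRDD
println(filtered.getClass.getSimpleName) // MapPartitionsRDD in current Spark versions - still an RDD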

Update after comment:

  1. First of all, have you imported the implicits? import spark.implicits._
  2. SparkSession also has various createDataFrame methods that will convert your RDD into a DataFrame / Dataset; see the sketch below.
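Here is a minimal sketch of both options, assuming the IgniteContext from the earlier example and a cache of (String, Double) report rows; the cache and Hive table names are placeholders:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().enableHiveSupport().getOrCreate()
import spark.implicits._

val igniteRdd = igniteContext.fromCache[String, Double]("reportCache")

// option 1: toDF via the implicits (works because IgniteRDD is an RDD of tuples)
val df1 = igniteRdd.toDF("key", "value")

// option 2: createDataFrame on the SparkSession
val df2 = spark.createDataFrame(igniteRdd).toDF("key", "value")

// save the final report to Hive
df1.write.mode("overwrite").saveAsTable("reports.final_report")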

If that still doesn't help, please share the error you're getting while creating the DataFrame along with a code example.
