
Registering Kryo classes is not working

I have the following code:

val conf = new SparkConf().setAppName("MyApp")
val sc = new SparkContext(conf)
conf.set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
new conf.registerKryoClasses(new Class<?>[]{
        Class.forName("org.apache.hadoop.io.LongWritable"),
        Class.forName("org.apache.hadoop.io.Text")
    });

But I am bumping into the following error:

')' expected but '[' found.
[error]                 new conf.registerKryoClasses(new Class<?>[]{

How can I solve this problem?

You're mixing Scala and Java. In Scala, you can define an Array[Class[_]] (instead of a Class<?>[] ):

val conf = new SparkConf()
            .setAppName("MyApp")
            .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
            .registerKryoClasses(Array[Class[_]](
              Class.forName("org.apache.hadoop.io.LongWritable"),
              Class.forName("org.apache.hadoop.io.Text")
            ))

val sc = new SparkContext(conf)

We can even do a little better. To avoid getting the class names wrong with string literals, we can reference the classes directly and use classOf to get their class type:

import org.apache.hadoop.io.LongWritable
import org.apache.hadoop.io.Text

val conf = new SparkConf()
            .setAppName("MyApp")
            .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
            .registerKryoClasses(Array[Class[_]](
              classOf[LongWritable],
              classOf[Text]
            ))

val sc = new SparkContext(conf)
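As a further safeguard, Spark has a `spark.kryo.registrationRequired` setting: when it is `true`, Kryo throws an exception as soon as it encounters an unregistered class, instead of silently serializing it less efficiently. This is a sketch of the same configuration with that flag enabled (assuming the same Hadoop writable classes as above):

```scala
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.spark.SparkConf

val conf = new SparkConf()
  .setAppName("MyApp")
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  // Fail fast at serialization time if a class was not registered,
  // so missing registrations surface as errors rather than slow fallbacks.
  .set("spark.kryo.registrationRequired", "true")
  .registerKryoClasses(Array[Class[_]](
    classOf[LongWritable],
    classOf[Text]
  ))
```

Note that with this flag on, every class that actually crosses the wire must be registered, so you may need to add more classes to the array as Spark reports them.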
