
Spark's call function

Sorry, basic question about Spark. Can I use a Java object other than a primitive type inside Spark's call function? For instance, imagine I have something like this:

    JavaRDD<String> input = sc.textFile(dataFile);
    JavaRDD<String> output = input.map(new Function<String, String>() {
        public String call(String s) throws MalformedURLException {
            SystemConfiguration config = new SystemConfiguration();
            ....
        }
    });

If I remove the instantiation of my own class SystemConfiguration it works fine, but with it inside, the job fails. Could you please shed some light on this? Many thanks.

As long as the objects you create inside the transformation are serializable (or, for non-serializable objects, declared @transient lazy), you should be fine.
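To illustrate what "serializable" means here: Spark serializes the anonymous Function with standard Java serialization, along with every object it references, before shipping it to the executors, so any such class must implement java.io.Serializable. Below is a minimal sketch in plain Java (no Spark dependency); the SystemConfiguration stand-in with an endpoint field is an assumption, since the real class is not shown in the question.

```java
import java.io.*;

// Hypothetical stand-in for the question's SystemConfiguration class.
// Implementing Serializable is what allows Spark to ship instances
// (or closures referencing them) to worker nodes.
class SystemConfiguration implements Serializable {
    private static final long serialVersionUID = 1L;
    private final String endpoint;

    SystemConfiguration(String endpoint) {
        this.endpoint = endpoint;
    }

    String endpoint() {
        return endpoint;
    }
}

public class SerializationCheck {
    // Round-trips an object through Java serialization, mimicking what
    // Spark does when it sends a task closure to an executor. If the
    // class were not Serializable, writeObject would throw
    // NotSerializableException -- the usual cause of a "Task not
    // serializable" failure.
    static <T extends Serializable> T roundTrip(T obj)
            throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(obj);
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            @SuppressWarnings("unchecked")
            T copy = (T) in.readObject();
            return copy;
        }
    }

    public static void main(String[] args) throws Exception {
        SystemConfiguration config = new SystemConfiguration("http://example.com");
        SystemConfiguration copy = roundTrip(config);
        System.out.println(copy.endpoint()); // prints http://example.com
    }
}
```

If the class cannot be made serializable, another common workaround is to create the object inside the call method itself (as the question's snippet already does) or to hold it in a transient field that is lazily re-initialized on each worker.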
