
Spark - serialization problem with parsing files using OpenCSV

I'm using Spark to process CSV files. Recently I replaced manual parsing of CSV lines with opencsv. Here is the simplified code:

import java.util.List;

import com.opencsv.CSVParser;
import com.opencsv.CSVParserBuilder;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class Main {

    public static void main(String[] args) {

        CSVParser parser = new CSVParserBuilder()
                .withSeparator(';')
                .build();

        SparkConf cfg = new SparkConf()
                .setMaster("local[4]")
                .setAppName("Testapp");
        JavaSparkContext sc = new JavaSparkContext(cfg);

        JavaRDD<String> textFile = sc.textFile("testdata.csv", 1);

        List<String> categories = textFile
                .map(line -> parser.parseLine(line)[10])
                .collect();
        System.out.println(categories);
    }
}

Unfortunately, that code doesn't work. It throws an exception:

Caused by: java.io.NotSerializableException: com.opencsv.CSVParser
Serialization stack:
    - object not serializable (class: com.opencsv.CSVParser, value: com.opencsv.CSVParser@1290c49)
    - element of array (index: 0)
    - array (class [Ljava.lang.Object;, size 1)
    - field (class: java.lang.invoke.SerializedLambda, name: capturedArgs, type: class [Ljava.lang.Object;)
    - object (class java.lang.invoke.SerializedLambda, SerializedLambda[capturingClass=class test.Main, functionalInterfaceMethod=org/apache/spark/api/java/function/Function.call:(Ljava/lang/Object;)Ljava/lang/Object;, implementation=invokeStatic test/Main.lambda$main$49bd2722$1:(Lcom/opencsv/CSVParser;Ljava/lang/String;)Ljava/lang/String;, instantiatedMethodType=(Ljava/lang/String;)Ljava/lang/String;, numCaptured=1])
    - writeReplace data (class: java.lang.invoke.SerializedLambda)
    - object (class test.Main$$Lambda$19/429639728, test.Main$$Lambda$19/429639728@72456279)
    - field (class: org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1, name: fun$1, type: interface org.apache.spark.api.java.function.Function)
    - object (class org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1, <function1>)
    at org.apache.spark.serializer.SerializationDebugger$.improveException(SerializationDebugger.scala:40)
    at org.apache.spark.serializer.JavaSerializationStream.writeObject(JavaSerializer.scala:46)
    at org.apache.spark.serializer.JavaSerializerInstance.serialize(JavaSerializer.scala:100)
    at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:400)
    ... 12 more

It seems Spark tries to serialize the lambda expression, and the lambda keeps a reference to parser, which causes the aforementioned error.

The question is: is there any way to avoid that exception and use a non-serializable library in a lambda expression passed to Spark? I really don't want to implement my own CSV parser.
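The failure can be reproduced without Spark at all: Java serialization of a lambda fails whenever one of its captured values is not Serializable, which is exactly what the `capturedArgs` entry in the stack trace points at. Below is a minimal JDK-only sketch of the mechanism; NonSerializableParser is a hypothetical stand-in for com.opencsv.CSVParser, and SerFunction plays the role of Spark's Serializable Function interface:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectOutputStream;
import java.io.Serializable;
import java.util.function.Function;

public class CaptureDemo {

    // Stand-in for com.opencsv.CSVParser: a class that is NOT Serializable
    static class NonSerializableParser {
        String[] parseLine(String line) { return line.split(";"); }
    }

    // Like Spark's Function interfaces, this makes the lambda itself Serializable
    interface SerFunction<T, R> extends Function<T, R>, Serializable {}

    // Attempt Java serialization, as Spark's ClosureCleaner does before shipping a task
    static boolean canSerialize(Object o) {
        try (ObjectOutputStream out = new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (IOException e) { // NotSerializableException is an IOException
            return false;
        }
    }

    // Captures the parser instance, so serialization fails, as in the question
    static SerFunction<String, String> capturing() {
        NonSerializableParser parser = new NonSerializableParser();
        return line -> parser.parseLine(line)[0];
    }

    // Captures nothing (parser created inside the lambda), so serialization succeeds
    static SerFunction<String, String> selfContained() {
        return line -> new NonSerializableParser().parseLine(line)[0];
    }

    public static void main(String[] args) {
        System.out.println("capturing lambda serializable? " + canSerialize(capturing()));       // false
        System.out.println("self-contained lambda serializable? " + canSerialize(selfContained())); // true
    }
}
```

The second lambda works for the same reason the static-method trick in the accepted answer works: the serialized closure carries no reference to the parser object itself.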

Spark supports CSV files out of the box:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

SparkSession spark = SparkSession.builder()
        .master("local[4]")
        .appName("Testapp")
        .getOrCreate();

Dataset<Row> df = spark.read().format("csv")
        .option("sep", ";")
        .option("header", "true") // or "false" if there are no headers
        .load("filename.csv");

Edit (comment promoted into the answer)

If you really need an RDD, you can get one from the DataFrame with df.javaRDD(), though it is preferable to stay with the Dataset/DataFrame API (see here for an example).

I realized that there is a very simple solution to my problem. Any use of an external library that causes a serialization problem can be wrapped in a static method; the reference to the parser is then hidden inside the method parse. This approach is obviously not a perfect solution, but it works.

import java.io.IOException;
import java.util.List;

import com.opencsv.CSVParser;
import com.opencsv.CSVParserBuilder;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;

public class Main {

    // Initialized independently in each JVM, so it is never serialized
    private static CSVParser parser = new CSVParserBuilder()
            .withSeparator(';')
            .build();

    public static void main(String[] args) {
        SparkConf cfg = new SparkConf()
                .setMaster("local[4]")
                .setAppName("Testapp");
        JavaSparkContext sc = new JavaSparkContext(cfg);

        JavaRDD<String> textFile = sc.textFile("testdata.csv", 1);

        List<String> categories = textFile
                .map(line -> parse(line)[0])
                .collect();
        System.out.println(categories);
    }

    static String[] parse(String line) throws IOException {
        return parser.parseLine(line);
    }
}
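A common alternative to the static-method trick is to make the function class Serializable and hold the parser in a transient field that is re-created lazily after deserialization, so each executor builds its own parser. The sketch below is Spark-free so it can stand alone: NonSerializableParser is a hypothetical stand-in for com.opencsv.CSVParser, and in real Spark code the class would implement org.apache.spark.api.java.function.Function<String, String> and be passed to map:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class LazyParserFn implements Serializable {

    // Stand-in for com.opencsv.CSVParser (not Serializable)
    static class NonSerializableParser {
        String[] parseLine(String line) { return line.split(";"); }
    }

    // 'transient' excludes the parser from serialization; the field is null
    // after deserialization and is rebuilt on first use in the receiving JVM
    private transient NonSerializableParser parser;

    private NonSerializableParser parser() {
        if (parser == null) {
            parser = new NonSerializableParser();
        }
        return parser;
    }

    public String call(String line) {
        return parser().parseLine(line)[0];
    }

    // Round-trip through Java serialization, mimicking what Spark does
    // when it ships a task closure to an executor
    static LazyParserFn roundTrip(LazyParserFn fn) throws IOException, ClassNotFoundException {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(fn);
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            return (LazyParserFn) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        LazyParserFn shipped = roundTrip(new LazyParserFn());
        System.out.println(shipped.call("a;b;c")); // parser recreated on first call
    }
}
```

Compared with the static field above, this keeps the parser's lifecycle tied to the function object and avoids shared mutable static state, which matters if the parser is configured per job.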
