
Parallelize bean class in Spark

Can we parallelize a bean class in Java Spark? If yes, please provide the syntax. If not, how can we load a bean class into a JavaRDD?

Of course you can. The only requirement is that the class must be serializable.

Example:

import java.io.Serializable;

public class A implements Serializable {
    private int x;
    public A() {}
    public A(int x) { this.x = x; }
    public int getX() { return x; }
    public void setX(int x) { this.x = x; }
}

// later, in the driver class
import java.util.Arrays;
// ...
JavaRDD<A> rdd = javaSparkContext.parallelize(Arrays.asList(new A(5)));
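The serializability requirement exists because Spark ships bean instances to executors via Java serialization. A quick way to sanity-check a bean before using it in a JavaRDD is a round trip through ObjectOutputStream/ObjectInputStream; here is a minimal, self-contained sketch (the class name A and field x are taken from the example above, the check itself is plain JDK and does not require Spark):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class SerializationCheck {
    // Bean mirroring the answer's class A; it must implement Serializable.
    static class A implements Serializable {
        private int x;
        public A() {}
        public A(int x) { this.x = x; }
        public int getX() { return x; }
    }

    public static void main(String[] args) throws Exception {
        A original = new A(5);

        // Serialize to bytes, as Spark does when shipping the bean to executors.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject(original);
        }

        // Deserialize and confirm the field survived the round trip.
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            A copy = (A) in.readObject();
            System.out.println(copy.getX()); // prints 5
        }
    }
}
```

If the bean (or any field it holds) is not serializable, writeObject throws a NotSerializableException, which is the same failure Spark would report at task-serialization time.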

