Using broadcast variables in Apache Spark
I get an error when using a broadcast variable inside a CoGroupFunction. If I comment out the call to appProvider.value(), the error goes away. Do you know how to fix this? Is the error related to how the variable is declared or initialized?
public class UsageJobDS implements Serializable {

    private static final Logger log = org.apache.log4j.LogManager.getLogger("myLogger");

    Broadcast<Provider> appProvider;

    void init() {
        // init broadcast variable
        ....
    }

    public static void main(String[] args) {
        UsageJobDS ujb = new UsageJobDS();
        ujb.init();
        ujb.run();
    }

    void run() {
        KeyValueGroupedDataset<Long, Row> charges = usageCharges.groupByKey(
                x -> x.getLong(x.fieldIndex("si__subscription_id")), Encoders.LONG());
        Dataset<ProcessEdr> cogg = edrs.cogroup(charges, rateEDRs, Encoders.bean(ProcessEdr.class));
        log.warn("Count cogg " + cogg.count());
    }

    CoGroupFunction<Long, EDR2, Row, ProcessEdr> rateEDRs = (subscription_id, edrsIter, chargesIter) -> {
        Logger log = org.apache.log4j.LogManager.getLogger("myLogger");
        log.warn("inside rateEDRs function");
        while (edrsIter.hasNext()) {
            appProvider.value(); // HERE
        }
        return results.iterator();
    };
}
I get this error:
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.opencell.spark.jobs.UsageJobDS.rateEDRs of type org.apache.spark.api.java.function.CoGroupFunction in instance of org.opencell.spark.jobs.UsageJobDS
at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2233)
at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1405)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2288)
In fact, if I change the cogroup call to define the function inline as below, it works. But the cause of the error remains unknown.
Dataset<ProcessEdr> cogg = edrs.cogroup(charges, (subscription_id, edrsIter, chargesIter) -> {
    ArrayList<ProcessEdr> results = new ArrayList<>();
    System.out.println("App Provider name" + appProvider.value().getIssuer_name());
    return results.iterator();
}, Encoders.bean(ProcessEdr.class));
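For context on why an inline lambda survives serialization while the field does not: Java serializes a lambda as a java.lang.invoke.SerializedLambda, and a readResolve() step reconstructs the functional-interface instance on deserialization. One common explanation of the error above is that storing the lambda in an instance field of the Serializable job class makes Spark ship the whole UsageJobDS object, and on the executor the SerializedLambda ends up being assigned directly to the CoGroupFunction field, producing the ClassCastException; a lambda held locally (or written inline, as in the working version) round-trips cleanly. The following plain-Java sketch (class and interface names are made up, not from the question) shows the normal round-trip of a serializable lambda held in a local variable:

```java
import java.io.*;

// A functional interface must extend Serializable for its lambdas
// to be serializable at all (Spark's function interfaces do this).
interface SerFn extends Serializable {
    int apply(int x);
}

public class LambdaSerDemo {

    // Serialize any object to a byte array.
    static byte[] serialize(Object o) throws IOException {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(o);
        }
        return bos.toByteArray();
    }

    // Deserialize an object from a byte array; for a lambda, this goes
    // through SerializedLambda.readResolve() to rebuild the instance.
    static Object deserialize(byte[] b) throws IOException, ClassNotFoundException {
        try (ObjectInputStream ois = new ObjectInputStream(new ByteArrayInputStream(b))) {
            return ois.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        // Local lambda, analogous to the inline cogroup function above.
        SerFn inc = x -> x + 1;
        SerFn copy = (SerFn) deserialize(serialize(inc));
        System.out.println(copy.apply(41)); // prints 42
    }
}
```

A pragmatic takeaway, if this diagnosis is right, is to avoid keeping Spark function lambdas in non-transient instance fields of serializable driver classes: define them as local variables or inline at the call site, as the working version does.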