
Spark serialization error using LocalDateTime

Code:

import java.time._
import java.time.format._

val rdd = sc.textFile("/tmp/abc.csv")
// Inspect the header: print each column name with its index
rdd.first.split(",").zipWithIndex
// Drop the header row, which contains "ID" and "Case Number"
val rows = rdd.filter(x => !x.contains("ID") && !x.contains("Case Number"))
val split1 = rows.map(x => x.split(","))
split1.take(3)
// The formatter is created on the driver and captured by the closure below
val format = DateTimeFormatter.ofPattern("MM/dd/yyyy h:m:s a")
val dates = split1.map(x => LocalDateTime.parse(x(2), format))

Error:

org.apache.spark.SparkException: Task not serializable
  at org.apache.spark.util.ClosureCleaner$.ensureSerializable(ClosureCleaner.scala:304)

java.time.format.DateTimeFormatter does not implement Serializable, so when the driver-side format val is captured by the closure, Spark's ClosureCleaner rejects the task. A somewhat ugly way around this is to push the formatter initialization into the anonymous function:

// Creating the formatter inside the lambda keeps it out of the serialized
// closure, at the cost of rebuilding it for every record
split1.map(x =>
  LocalDateTime.parse(x(2), DateTimeFormatter.ofPattern("MM/dd/yyyy h:m:s a")))
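
For reference, two less ugly alternatives are sketched below. They assume the same split1 RDD and date pattern as above; the holder name DateFmt in the second sketch is hypothetical.

// Sketch 1: create the formatter once per partition with mapPartitions,
// so the driver-side formatter is never captured and it is not rebuilt
// for every record
val dates = split1.mapPartitions { iter =>
  val fmt = DateTimeFormatter.ofPattern("MM/dd/yyyy h:m:s a")
  iter.map(x => LocalDateTime.parse(x(2), fmt))
}

// Sketch 2: hide the formatter behind a @transient lazy val in a
// serializable holder ("DateFmt" is a hypothetical name); only the empty
// holder is shipped to executors, and each executor rebuilds the
// formatter lazily on first use
object DateFmt extends Serializable {
  @transient lazy val fmt = DateTimeFormatter.ofPattern("MM/dd/yyyy h:m:s a")
}
val dates = split1.map(x => LocalDateTime.parse(x(2), DateFmt.fmt))

Both keep the per-record work down to a single parse call; which reads better is a matter of taste.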

