
saveAsTextFile for non-RDD Spark Scala

I have this code:

import org.apache.spark.rdd.RDD

def contentSizeStats(rdd: RDD[ApacheAccessLog]) = {
  val contentSizes = rdd.map(x => x.contentSize).cache()
  val count = contentSizes.count()
  // Guard the empty case explicitly; reduce/min/max fail on an empty RDD.
  if (count == 0) null
  else (count, contentSizes.reduce(_ + _), contentSizes.min(), contentSizes.max())
}

The return value of this function is (Long, String, String, String). I am trying to save the result of this function to a text file using saveAsTextFile, but I can't, since the result is not an RDD. Any idea how to implement this?

You can convert the return value to an RDD and then call saveAsTextFile on it.

val outputRdd = sc.parallelize(outputTuple.productIterator.toArray)
outputRdd.saveAsTextFile(outputDirectory)
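
A minimal end-to-end sketch of that approach, assuming a local SparkContext named sc and a hypothetical output path; the sample sizes below just stand in for the real log data:

import org.apache.spark.{SparkConf, SparkContext}

object SaveTupleExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("save-tuple").setMaster("local[*]"))

    // Stand-in for contentSizeStats: a (count, sum, min, max) tuple of Longs.
    val sizes = sc.parallelize(Seq(200L, 512L, 1024L))
    val stats = (sizes.count(), sizes.reduce(_ + _), sizes.min(), sizes.max())

    // productIterator walks the tuple's fields as Any values, so the tuple
    // becomes a small RDD[Any] that saveAsTextFile can write out,
    // one field per line across the part files in the output directory.
    val outputRdd = sc.parallelize(stats.productIterator.toSeq)
    outputRdd.saveAsTextFile("/tmp/content-size-stats")   // hypothetical output path

    sc.stop()
  }
}

Note that saveAsTextFile writes a directory of part files rather than a single text file; for one small tuple that already lives on the driver, writing it with plain JVM file I/O (e.g. java.io.PrintWriter) is also a reasonable alternative.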

