value collectAsMap is not a member of org.apache.spark.rdd.RDD

I am trying to use collectAsMap() in the following statement:

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.broadcast.Broadcast
import org.apache.spark.rdd.RDD
...
documents_input.
  filter(_ != documents_header).
  map(_.split(",")).
  map(Document.parse(_)).
  keyBy(_.id).collectAsMap()

However I am getting the following error:

value collectAsMap is not a member of org.apache.spark.rdd.RDD[(Int, com.codependent.MyApp.Document)]

Any idea why or how I could turn the Array into a Map?

Fixed after updating the imports as Ram Ghadiyaram suggested:

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.broadcast.Broadcast
import org.apache.spark.rdd.RDD
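
For context: on Spark versions before 1.3, the implicit conversion that wraps an RDD of pairs in PairRDDFunctions (which is where collectAsMap is defined) lives in the SparkContext companion object, so import org.apache.spark.SparkContext._ is what brings the method into scope; from 1.3 onward the implicits also live on the RDD companion object, making the import unnecessary but harmless. Here is a minimal self-contained sketch of the same pattern (the object name, master URL and sample data are illustrative, not taken from the question):

import org.apache.spark.{SparkConf, SparkContext}

object CollectAsMapDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("collectAsMap-demo").setMaster("local[*]"))
    // On Spark < 1.3 this import supplies the RDD-to-PairRDDFunctions
    // implicit; on newer versions it is redundant but does no harm.
    import org.apache.spark.SparkContext._

    // keyBy turns RDD[T] into RDD[(K, T)]; only pair RDDs have collectAsMap.
    val byId = sc.parallelize(Seq("1,alice", "2,bob")) // sample data
      .map(_.split(","))
      .keyBy(_(0).toInt)

    val asMap: scala.collection.Map[Int, Array[String]] = byId.collectAsMap()
    asMap.foreach { case (id, fields) => println(s"$id -> ${fields.mkString("|")}") }
    sc.stop()
  }
}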

It depends on how you read documents_input. If you read it through the SparkContext (for example with sc.textFile), you have an RDD, and with the import above collectAsMap should work. But if you read documents_input with scala.io.Source or any other plain Scala API, you have an ordinary Scala collection, and collectAsMap won't do the trick. In that case you can use toMap.
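
To illustrate the toMap route for the non-Spark case, a short sketch; the file name documents.csv and the comma-separated layout are hypothetical stand-ins for the question's input:

import scala.io.Source

val source = Source.fromFile("documents.csv") // hypothetical input file
try {
  val docsById: Map[Int, Array[String]] =
    source.getLines()
      .map(_.split(","))
      .map(fields => fields(0).toInt -> fields) // build (key, value) pairs
      .toMap                                    // plain-Scala counterpart of collectAsMap
  println(docsById.keySet)
} finally {
  source.close() // Source holds a file handle; close it explicitly
}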
