Scala compilation error on nested pattern matching

I am using Scala 2.10.5 and Spark 1.6.0.

My code is:

  def computeContribs(tuples : RDD[(Int,List[Int])], ranks : RDD[(Int, Double)] ) : RDD[(Int, Double)] = {
    val x = tuples.map{case(numpage,list)=>(numpage,(list,1.0/list.size))}
    val joined = x.join(ranks)
    val contribs = joined.flatMap{case(numPage,(pageList,size), rank) => pageList.map( y:Int=>(y,size) )}
    contribs.reduceByKey(_+_)
  }

The compilation error I get is:

[error] C:\Users\vital\Documents\spazi_lavoro\spark-examples\src\main\scala\pagerank\PageRankSpark.scala:11: constructor cannot be instantiated to expected type;
[error]  found   : (T1, T2, T3)
[error]  required: (Int, ((List[Int], Double), Double))
[error]     val contribs = joined.flatMap{case(numPage,(pageList,size), rank) => pageList.map( y:Int=>(y,size) )}
[error]                                       ^

How can I make my code compile?

Your joined RDD has the type RDD[(Int, ((List[Int], Double), Double))], so the pattern in your flatMap must have the same nested shape: (numPage, ((pageList, size), rank)), not (numPage, (pageList, size), rank). The latter is a three-element tuple, which is why the compiler reports "found: (T1, T2, T3)" against the required nested pair. Also, you don't need the Int type annotation on y in the inner map (and if you do annotate a lambda parameter, it must be parenthesized, as in (y: Int) => ...). Change your contribs definition to the following:

val contribs: RDD[(Int, Double)] =
  joined.flatMap { case (numPage, ((pageList, size), rank)) =>
    pageList.map(y => (y, size))
  }
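To see why the nested pattern is the right shape without spinning up Spark, here is a minimal sketch of the same join-and-flatMap pipeline using plain Scala collections in place of RDDs (the join is simulated with a Map lookup; names mirror the question's code, but this is an illustration, not the original Spark job):

```scala
object ContribsSketch {
  def computeContribs(
      tuples: List[(Int, List[Int])],
      ranks: Map[Int, Double]): Map[Int, Double] = {
    // Same element type as the RDD: (Int, (List[Int], Double))
    val x = tuples.map { case (numPage, list) => (numPage, (list, 1.0 / list.size)) }

    // A join pairs the existing value with the rank, producing the
    // nested type (Int, ((List[Int], Double), Double)) -- hence the
    // extra parentheses in the pattern below.
    val joined = x.map { case (numPage, v) => (numPage, (v, ranks(numPage))) }

    val contribs = joined.flatMap { case (numPage, ((pageList, size), rank)) =>
      pageList.map(y => (y, size))
    }

    // reduceByKey(_ + _) equivalent: group by key and sum the values
    contribs.groupBy(_._1).map { case (k, vs) => (k, vs.map(_._2).sum) }
  }

  def main(args: Array[String]): Unit =
    println(computeContribs(List(1 -> List(2, 3), 2 -> List(1)),
                            Map(1 -> 1.0, 2 -> 0.5)))
}
```

Flattening the pattern to (numPage, (pageList, size), rank) would be matching a 3-tuple against a 2-tuple whose second component is itself a pair, which is exactly the mismatch the compiler reports.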
