I am trying to implement a PageRank algorithm on the Reddit May 2015 dataset, but I can't manage to extract the subreddits referenced in the comments. One column contains the name of the subreddit, and the other contains a comment posted in that subreddit that references another subreddit.
subreddit|body
videos|"Tagged you as ""...
Quebec|Ok, c'est quoi le...
pokemon|Sorry to hear abo...
videos|Not sure what the...
ClashOfClans|Your submission, ...
realtech|Original /r/techn...
guns|Welp, those basta...
IAmA|If you are very i...
WTF|If you go on /r/w...
Fitness|Your submission h...
gifs|Hi! Take a look a...
Coachella|Yeah. If you go /...
What I did is this:
val df = spark.read
  .format("csv")
  .option("header", "true")
  .load("path\\May2015.csv")

val df1 = df.filter(df("body").contains("/r/")).select("subreddit", "body")

val lines = df1.rdd
val links = lines.map { s =>
  val x = s(1).toString.split(" ")
  val b = x.filter(_.startsWith("/r/")).toList
  val t = b(0)
  (s(0), t)
}.distinct().groupByKey().cache()
var ranks = links.mapValues(v => 0.25)

for (i <- 1 to iters) {
  val contribs = links.join(ranks).values.flatMap { case (urls, rank) =>
    val size = urls.size
    urls.map(url => (url, rank / size))
  }
  ranks = contribs.reduceByKey(_ + _).mapValues(0.15 + 0.85 * _)
}
The problem is that the output is always:
(subreddit, CompactBuffer())
While what I want is:
(subreddit, anothersubreddit)
I managed to solve this but now I am getting another error:
> type mismatch; found : org.apache.spark.rdd.RDD[(String, Double)]
> required: org.apache.spark.rdd.RDD[(Any, Double)] Note: (String,
> Double) <: (Any, Double), but class RDD is invariant in type T. You
> may wish to define T as +T instead. (SLS 4.5)
> ranks = contribs.reduceByKey(_ + _).mapValues(0.15 + 0.85 * _)
The problem probably lies here:
val links = lines.map { s =>
  val x = s(1).toString.split(" ")
  val b = x.filter(_.startsWith("/r/")).toList
  val t = b(0)
  (s(0), t)
  ...
You need to avoid typing the first element of the tuple as `Any` here. `s(0)` on a `Row` returns `Any`, so when you expect a `String` you should use an explicit cast such as `s(0).asInstanceOf[String]`, or one of the typed accessors `s.getAs[String](0)` or `s.getString(0)`.
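For context on the error message itself: `RDD[T]` is invariant in `T`, so `RDD[(String, Double)]` is not a subtype of `RDD[(Any, Double)]` even though `(String, Double) <: (Any, Double)`. A minimal sketch with a plain invariant class (names are illustrative, not from Spark) reproduces the same compile error: `ranks` was inferred with `Any` keys because of `s(0)`, while `contribs` ended up with `String` keys from the extracted tokens.

```scala
// T is invariant here, just like RDD's type parameter
class Container[T](val value: T)

// Inferred key type Any, mirroring ranks built from s(0): Any
var ranks: Container[(Any, Double)] = new Container(("seed": Any, 0.25))

// Key type String, mirroring contribs built from the /r/ tokens
val contribs: Container[(String, Double)] = new Container(("subreddit", 0.5))

// ranks = contribs   // does not compile:
// type mismatch; found: Container[(String, Double)],
//                required: Container[(Any, Double)]
```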
So, the version that solves the compile error may be as follows:
val links = lines.map { s =>
  val x = s.getString(1).split(" ")
  val b = x.filter(_.startsWith("/r/")).toList
  val t = b(0)
  (s.getString(0), t)
  ...
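Putting it together, an end-to-end sketch might look like the following. The path and iteration count are placeholders from the question, and `headOption` is swapped in for the bare `b(0)` so that rows where a token contains but does not start with `/r/` are skipped instead of throwing an exception:

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("SubredditPageRank").getOrCreate()

val df = spark.read
  .format("csv")
  .option("header", "true")
  .load("path\\May2015.csv")   // placeholder path from the question

val df1 = df.filter(df("body").contains("/r/")).select("subreddit", "body")

// getString keeps the keys typed as String, so ranks is RDD[(String, Double)]
val links = df1.rdd.flatMap { s =>
  s.getString(1).split(" ")
    .filter(_.startsWith("/r/"))
    .headOption                          // None when no token starts with /r/
    .map(t => (s.getString(0), t))
}.distinct().groupByKey().cache()

var ranks = links.mapValues(_ => 0.25)

val iters = 10   // placeholder iteration count
for (_ <- 1 to iters) {
  val contribs = links.join(ranks).values.flatMap { case (urls, rank) =>
    val size = urls.size
    urls.map(url => (url, rank / size))
  }
  // both sides are now RDD[(String, Double)], so the reassignment compiles
  ranks = contribs.reduceByKey(_ + _).mapValues(0.15 + 0.85 * _)
}
```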