
Spark RDD filter after groupByKey

//create RDD
val rdd = sc.makeRDD(List(("a", (1, "m")), ("b", (1, "m")),
             ("a", (1, "n")), ("b", (2, "n")), ("c", (1, "m")), 
             ("c", (5, "m")), ("d", (1, "m")), ("d", (1, "n"))))
val groupRDD = rdd.groupByKey()

After groupByKey I want to filter out the groups whose numbers are all equal to 1, and get:

("b", (1, "m")),("b", (2, "n")), ("c", (1, "m")), ("c", (5, "m"))

groupByKey() is required. Can anyone help me? Many thanks.

Added: but what if the type of the second element is String, and the goal is to filter out the groups whose second elements are all equal to x? For example, given ("a",("x","m")), ("a",("x","n")), ("b",("x","m")), ("b",("y","n")), ("c",("x","m")), ("c",("z","m")), ("d",("x","m")), ("d",("x","n")),

the result should likewise be ("b",("x","m")), ("b",("y","n")), ("c",("x","m")), ("c",("z","m"))

You can do it like this:

val groupRDD = rdd
  .groupByKey()
  // keep the groups whose first components do not all equal 1:
  // if every number in a group is 1, the sum equals the group size
  .filter(value => value._2.map(tuple => tuple._1).sum != value._2.size)
  // flatten back to one (key, tuple) pair per element; before this step
  // the rows look like, e.g., (b, Seq((1, m), (2, n)))
  .flatMapValues(list => list)

What this does: first we group the pairs by key with groupByKey, then we filter by summing the first components of each group's values and checking whether the sum equals the number of elements in the group. For example:

(a, Seq((1, m), (1, n)))  -> grouped by key
sum of first components = 1 + 1 = 2, size of the sequence = 2
2 == 2, so this group is filtered out
(b, Seq((1, m), (2, n)))  -> sum = 1 + 2 = 3, size = 2, 3 != 2, so b is kept

Final result:

(c,(1,m))
(b,(1,m))
(c,(5,m))
(b,(2,n))
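
As an aside (a sketch, not part of the original answer): the sum-versus-size trick only works when the target value is the number 1. Keeping any group with more than one distinct first component handles both this case and the String variant from the question, and matches the behaviour of the countByKey version in the EDIT below; groupRDD2 is a hypothetical name:

// a sketch, assuming the same rdd as above
val groupRDD2 = rdd
  .groupByKey()
  // keep only the groups whose first components are not all identical
  .filter { case (_, values) => values.map(_._1).toSet.size > 1 }
  // flatten back to one (key, tuple) pair per element
  .flatMapValues(identity)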

Good luck!

EDIT

Assuming the key inside the tuple can be any string, and assuming rdd is your data containing:

(a,(x,m))
(c,(x,m))
(c,(z,m))
(d,(x,m))
(b,(x,m))
(a,(x,n))
(d,(x,n))
(b,(y,n))

then we can construct uniqueCount as:

val uniqueCount = rdd
  // swap the pair around: we want to count combinations such as
  // (a, x), (b, x), (b, y), (c, x), (c, z), (d, x)
  .map(entry => ((entry._1, entry._2._1), entry._2._2))
  // count each combination: (a, x) appears twice, (b, x) once, (b, y) once, etc.
  .countByKey()
  // keep only the combinations that appear exactly once, i.e. drop the duplicates
  .filter(a => a._2 == 1)
  // extract the original keys, so we can filter the RDD below
  .map(a => a._1._1)
  .toList

Then this:

val filteredRDD = rdd.filter(a => uniqueCount.contains(a._1))

gives this output:

(b,(y,n))
(c,(x,m))
(c,(z,m))
(b,(x,m))
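
Note that countByKey is an action: it returns a plain Scala Map to the driver, so the filter, map and toList steps above run locally and uniqueCount is an ordinary List. If there were many keys, a broadcast Set (a sketch; broadcastKeys is a hypothetical name and sc is the SparkContext from the question) would avoid shipping the List inside every task closure and make each lookup O(1):

// a sketch, assuming uniqueCount and rdd as defined above
val broadcastKeys = sc.broadcast(uniqueCount.toSet)
val filteredRDD = rdd.filter(a => broadcastKeys.value.contains(a._1))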
