Scala join different datasets to get value for one column
I am new to Scala. I have 3 tables:
A:

Marketplace | Level | Band |
---|---|---|
US | LEVEL_1 | |
CA | LEVEL_1 | BAND_1 |
B:

Marketplace | Level | Value |
---|---|---|
US | LEVEL_1 | 10 |
C:

Marketplace | Level | Band | Value |
---|---|---|---|
CA | LEVEL_1 | BAND_1 | 20 |
I would like to:

For rows with Marketplace = US in table A -> join table B on Seq(Marketplace, Level) to get the Value;
For rows with Marketplace = CA in table A -> join table C on Seq(Marketplace, Level, Band) to get the Value.
The output table should look like:

Marketplace | Level | Band | Value |
---|---|---|---|
US | LEVEL_1 | | 10 |
CA | LEVEL_1 | BAND_1 | 20 |
How should I write Scala code to achieve this? Thanks!
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.coalesce

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val A = Seq(("US", "LEVEL_1", ""), ("CA", "LEVEL_1", "BAND_1"))
  .toDF("Marketplace", "Level", "Band")
val B = Seq(("US", "LEVEL_1", 10)).toDF("Marketplace", "Level", "Value")
val C = Seq(("CA", "LEVEL_1", "BAND_1", 20))
  .toDF("Marketplace", "Level", "Band", "Value")

val res = A
  // US rows: left-join B on the composite key (Marketplace, Level)
  .join(B, A("Marketplace") === B("Marketplace") && A("Level") === B("Level"), "left")
  // CA rows: left-join C on the composite key (Marketplace, Level, Band)
  .join(C, A("Marketplace") === C("Marketplace") && A("Level") === C("Level") && A("Band") === C("Band"), "left")
  .select(
    A("Marketplace"),
    A("Level"),
    C("Band"),
    // each row matches at most one of B or C, so coalesce picks the non-null Value
    coalesce(B("Value"), C("Value")).alias("Value")
  )
res.show(false)
// +-----------+-------+------+-----+
// |Marketplace|Level |Band |Value|
// +-----------+-------+------+-----+
// |US |LEVEL_1|null |10 |
// |CA |LEVEL_1|BAND_1|20 |
// +-----------+-------+------+-----+
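An alternative that matches the `Seq(...)` join keys from the question more literally is Spark's `usingColumns` join overload, which also avoids duplicated key columns in the result. Since both B and C call their payload column `Value`, this sketch renames them first to sidestep ambiguity (the `ValueB`/`ValueC` names are my own choice, not from the original):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.coalesce

val spark = SparkSession.builder().master("local[*]").getOrCreate()
import spark.implicits._

val A = Seq(("US", "LEVEL_1", ""), ("CA", "LEVEL_1", "BAND_1"))
  .toDF("Marketplace", "Level", "Band")
// rename the payload columns so the two left joins don't collide on "Value"
val B = Seq(("US", "LEVEL_1", 10)).toDF("Marketplace", "Level", "ValueB")
val C = Seq(("CA", "LEVEL_1", "BAND_1", 20))
  .toDF("Marketplace", "Level", "Band", "ValueC")

val res = A
  .join(B, Seq("Marketplace", "Level"), "left")         // US rows pick up ValueB
  .join(C, Seq("Marketplace", "Level", "Band"), "left") // CA rows pick up ValueC
  .withColumn("Value", coalesce($"ValueB", $"ValueC"))
  .drop("ValueB", "ValueC")

res.show(false)
```

With the `usingColumns` form the join keys appear once in the output, so no aliasing of `Marketplace`/`Level`/`Band` is needed; note that here `Band` for the US row stays the empty string from table A rather than becoming null.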