
Spark SQL to join two dataframes with no primary keys

I have the following 2 dataframes, which I want to join to create new schema data:

df = sqlContext.createDataFrame([
    ("A011021", "15", "2020-01-01", "2020-12-31", "4"),
    ("A011021", "15", "2020-01-01", "2020-12-31", "4"),
    ("A011021", "15", "2020-01-01", "2020-12-31", "4"),
    ("A011021", "15", "2020-01-01", "2020-12-31", "3"),
    ("A011022", "6",  "2020-01-01", "2020-12-31", "3"),
    ("A011022", "6",  "2020-01-01", "2020-12-31", "3")],
    ["rep_id", "sales_target", "start_date", "end_date", "st_new"])
df.createOrReplaceTempView('df')
+--------------+------------+----------+----------+------+
|rep_id        |sales_target|start_date|end_date  |st_new|
+--------------+------------+----------+----------+------+
|A011021       |15          |2020-01-01|2020-12-31|4     |
|A011021       |15          |2020-01-01|2020-12-31|4     |
|A011021       |15          |2020-01-01|2020-12-31|4     |
|A011021       |15          |2020-01-01|2020-12-31|3     |
|A011022       |6           |2020-01-01|2020-12-31|3     |
|A011022       |6           |2020-01-01|2020-12-31|3     |
+--------------+------------+----------+----------+------+

df2 = sqlContext.createDataFrame([
    ("A011021", "15", "2020-01-01", "2020-12-31", "2020-01-01", "2020-03-31"),
    ("A011021", "15", "2020-01-01", "2020-12-31", "2020-04-01", "2020-06-30"),
    ("A011021", "15", "2020-01-01", "2020-12-31", "2020-07-01", "2020-09-30"),
    ("A011021", "15", "2020-01-01", "2020-12-31", "2020-10-01", "2020-12-31"),
    ("A011022", "6",  "2020-01-01", "2020-06-30", "2020-01-01", "2020-03-31"),
    ("A011022", "6",  "2020-01-01", "2020-06-30", "2020-04-01", "2020-06-30")],
    ["rep_id", "sales_target", "start_date", "end_date", "new_sdt", "new_edt"])
df2.createOrReplaceTempView('df2')
+--------------+------------+----------+----------+-----------+----------+
|rep_id        |sales_target|start_date|end_date  |new_sdt    |new_edt   |
+--------------+------------+----------+----------+-----------+----------+
|A011021       |15          |2020-01-01|2020-12-31|2020-01-01 |2020-03-31|
|A011021       |15          |2020-01-01|2020-12-31|2020-04-01 |2020-06-30|
|A011021       |15          |2020-01-01|2020-12-31|2020-07-01 |2020-09-30|
|A011021       |15          |2020-01-01|2020-12-31|2020-10-01 |2020-12-31|
|A011022       |6           |2020-01-01|2020-06-30|2020-01-01 |2020-03-31|
|A011022       |6           |2020-01-01|2020-06-30|2020-04-01 |2020-06-30|
+--------------+------------+----------+----------+-----------+----------+

When I run the query to join the two tables, I get duplicate results as below:

select ds1.*, ds2.new_sdt, ds2.new_edt from df2 ds2
inner join df ds1
on ds2.rep_id = ds1.rep_id
where ds2.rep_id = 'A011021'

+--------------+------------+----------+----------+------+-----------+----------+
|rep_id        |sales_target|start_date|end_date  |st_new|new_sdt    |new_edt   |
+--------------+------------+----------+----------+------+-----------+----------+
|A011021       |15          |2020-01-01|2020-12-31|4     |2020-01-01 |2019-12-31|
|A011021       |15          |2020-01-01|2020-12-31|4     |2020-01-01 |2019-12-31|
|A011021       |15          |2020-01-01|2020-12-31|4     |2020-01-01 |2019-12-31|
|A011021       |15          |2020-01-01|2020-12-31|3     |2020-01-01 |2020-03-31|
|A011021       |15          |2020-01-01|2020-12-31|4     |2020-04-01 |2020-03-31|
|A011021       |15          |2020-01-01|2020-12-31|4     |2020-04-01 |2020-03-31|
|A011021       |15          |2020-01-01|2020-12-31|4     |2020-04-01 |2020-03-31|
|A011021       |15          |2020-01-01|2020-12-31|3     |2020-04-01 |2020-06-30|
|A011021       |15          |2020-01-01|2020-12-31|4     |2020-07-01 |2020-06-30|
|A011021       |15          |2020-01-01|2020-12-31|4     |2020-07-01 |2020-06-30|
|A011021       |15          |2020-01-01|2020-12-31|4     |2020-07-01 |2020-06-30|
|A011021       |15          |2020-01-01|2020-12-31|3     |2020-07-01 |2020-09-30|
|A011021       |15          |2020-01-01|2020-12-31|4     |2020-10-01 |2020-09-30|
|A011021       |15          |2020-01-01|2020-12-31|4     |2020-10-01 |2020-09-30|
|A011021       |15          |2020-01-01|2020-12-31|4     |2020-10-01 |2020-09-30|
|A011021       |15          |2020-01-01|2020-12-31|3     |2020-10-01 |2020-12-30|
+--------------+------------+----------+----------+------+-----------+----------+

Is there a way to fetch only the distinct new_sdt, new_edt (quarterly) rows for a given rep_id using Spark SQL or PySpark functions? Please help.

Expected result is:

select ds1.*, ds2.new_sdt, ds2.new_edt from df2 ds2
inner join df ds1
on ds2.rep_id = ds1.rep_id

+--------------+------------+----------+----------+------+-----------+----------+
|rep_id        |sales_target|start_date|end_date  |st_new|new_sdt    |new_edt   |
+--------------+------------+----------+----------+------+-----------+----------+
|A011021       |15          |2020-01-01|2020-12-31|4     |2020-01-01 |2020-03-31|
|A011021       |15          |2020-01-01|2020-12-31|4     |2020-04-01 |2020-06-30|
|A011021       |15          |2020-01-01|2020-12-31|4     |2020-07-01 |2020-09-30|
|A011021       |15          |2020-01-01|2020-12-31|3     |2020-10-01 |2020-12-31|
|A011022       |6           |2020-01-01|2020-12-31|3     |2020-01-01 |2020-03-31|
|A011022       |6           |2020-01-01|2020-12-31|3     |2020-04-01 |2020-06-30|
+--------------+------------+----------+----------+------+-----------+----------+
  1. Allot a unique id
  2. Do an inner join
  3. Drop the extra columns
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object InnerJoin {

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("InnerJoin")
      .master("local[*]")
      .getOrCreate()

    import spark.implicits._

    // First dataframe: sales targets, each row tagged with a generated id
    val df1 = List(
      ("A011021", "15", "2020-01-01", "2020-12-31", "4"),
      ("A011021", "15", "2020-01-01", "2020-12-31", "4"),
      ("A011021", "15", "2020-01-01", "2020-12-31", "4"),
      ("A011021", "15", "2020-01-01", "2020-12-31", "3"),
      ("A011022", "6" , "2020-01-01", "2020-12-31", "3"),
      ("A011022", "6" , "2020-01-01", "2020-12-31", "3"))
      .toDF("rep_id", "sales_target", "start_date", "end_date", "st_new")
      .withColumn("rowid", monotonically_increasing_id())

    // Second dataframe: quarterly date ranges, tagged the same way
    val df2 = List(
      ("A011021", "15", "2020-01-01", "2020-12-31", "2020-01-01", "2020-03-31"),
      ("A011021", "15", "2020-01-01", "2020-12-31", "2020-04-01", "2020-06-30"),
      ("A011021", "15", "2020-01-01", "2020-12-31", "2020-07-01", "2020-09-30"),
      ("A011021", "15", "2020-01-01", "2020-12-31", "2020-10-01", "2020-12-31"),
      ("A011022", "6" , "2020-01-01", "2020-06-30", "2020-01-01", "2020-03-31"),
      ("A011022", "6" , "2020-01-01", "2020-06-30", "2020-04-01", "2020-06-30"))
      .toDF("rep_id", "sales_target", "start_date", "end_date", "new_sdt", "new_edt")
      .withColumn("rowid", monotonically_increasing_id())

    // Join row-for-row on the generated id instead of the non-unique rep_id,
    // then drop the helper column
    df1.as("ds1").join(df2.as("ds2"),
      col("ds1.rowid") === col("ds2.rowid"),
      "inner")
      .orderBy(col("ds1.rep_id"), col("ds1.sales_target"), col("st_new").desc)
      .drop("rowid")
      .show()
  }

}

Desired output:

+-------+------------+----------+----------+------+-------+------------+----------+----------+----------+----------+
| rep_id|sales_target|start_date|  end_date|st_new| rep_id|sales_target|start_date|  end_date|   new_sdt|   new_edt|
+-------+------------+----------+----------+------+-------+------------+----------+----------+----------+----------+
|A011021|          15|2020-01-01|2020-12-31|     4|A011021|          15|2020-01-01|2020-12-31|2020-04-01|2020-06-30|
|A011021|          15|2020-01-01|2020-12-31|     4|A011021|          15|2020-01-01|2020-12-31|2020-01-01|2020-03-31|
|A011021|          15|2020-01-01|2020-12-31|     4|A011021|          15|2020-01-01|2020-12-31|2020-07-01|2020-09-30|
|A011021|          15|2020-01-01|2020-12-31|     3|A011021|          15|2020-01-01|2020-12-31|2020-10-01|2020-12-31|
|A011022|           6|2020-01-01|2020-12-31|     3|A011022|           6|2020-01-01|2020-06-30|2020-04-01|2020-06-30|
|A011022|           6|2020-01-01|2020-12-31|     3|A011022|           6|2020-01-01|2020-06-30|2020-01-01|2020-03-31|
+-------+------------+----------+----------+------+-------+------------+----------+----------+----------+----------+
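Since the question asked for Spark SQL or PySpark, the following is a minimal PySpark sketch of the same row-id trick (the app name InnerJoinByRowId and the result variable are placeholders). It assumes both dataframes are small local datasets built in matching row order, because monotonically_increasing_id only lines up across two dataframes when their partitioning and row order agree.

from pyspark.sql import SparkSession
from pyspark.sql.functions import monotonically_increasing_id

spark = SparkSession.builder.appName("InnerJoinByRowId").getOrCreate()

# Same sample data as above, each row tagged with a generated row id
df1 = spark.createDataFrame([
    ("A011021", "15", "2020-01-01", "2020-12-31", "4"),
    ("A011021", "15", "2020-01-01", "2020-12-31", "4"),
    ("A011021", "15", "2020-01-01", "2020-12-31", "4"),
    ("A011021", "15", "2020-01-01", "2020-12-31", "3"),
    ("A011022", "6",  "2020-01-01", "2020-12-31", "3"),
    ("A011022", "6",  "2020-01-01", "2020-12-31", "3")],
    ["rep_id", "sales_target", "start_date", "end_date", "st_new"]
).withColumn("rowid", monotonically_increasing_id())

df2 = spark.createDataFrame([
    ("A011021", "15", "2020-01-01", "2020-12-31", "2020-01-01", "2020-03-31"),
    ("A011021", "15", "2020-01-01", "2020-12-31", "2020-04-01", "2020-06-30"),
    ("A011021", "15", "2020-01-01", "2020-12-31", "2020-07-01", "2020-09-30"),
    ("A011021", "15", "2020-01-01", "2020-12-31", "2020-10-01", "2020-12-31"),
    ("A011022", "6",  "2020-01-01", "2020-06-30", "2020-01-01", "2020-03-31"),
    ("A011022", "6",  "2020-01-01", "2020-06-30", "2020-04-01", "2020-06-30")],
    ["rep_id", "sales_target", "start_date", "end_date", "new_sdt", "new_edt"]
).withColumn("rowid", monotonically_increasing_id())

# Join row-for-row on the generated id (assumes the ids line up, see note above),
# keep one copy of the shared columns, and drop the helper id by not selecting it
result = (
    df1.alias("ds1")
    .join(df2.alias("ds2"), "rowid", "inner")
    .select("ds1.rep_id", "ds1.sales_target", "ds1.start_date",
            "ds1.end_date", "ds1.st_new", "ds2.new_sdt", "ds2.new_edt")
    .orderBy("rep_id", "new_sdt")
)
result.show(truncate=False)

This keeps exactly one (new_sdt, new_edt) quarter per st_new row for each rep_id, which matches the expected result table in the question.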
