
Pyspark - How to insert records in dataframe 1, based on a column value in dataframe2

I need to insert records into table1 using PySpark's spark.sql(), based on the number of matching records in another table, say table2. Currently I can get one record through a join, but I need to insert into table1 as many records as the second table supplies.

Here are sample dataframes:

df1= sqlContext.createDataFrame([("xxx1","81A01","TERR NAME 01"),("xxx1","81A01","TERR NAME 02"), ("xxx1","81A01","TERR NAME 03")], ["zip_code","zone_code","territory_name"])
df2= sqlContext.createDataFrame([("xxx1","81A01","","NY")], ["zip_code","zone_code","territory_name","state"])

df1.show()
+--------+--------------+--------------+
|zip_code|zone_code     |territory_name|
+--------+--------------+--------------+
|    xxx1|         81A01|  TERR NAME 01|
|    xxx1|         81A01|  TERR NAME 02|
|    xxx1|         81A01|  TERR NAME 03|
+--------+--------------+--------------+

# Print out information about this data
df2.show()
+--------+--------------+--------------+-----+
|zip_code|zone_code     |territory_name|state|
+--------+--------------+--------------+-----+     
|    xxx1|         81A01|          null|   NY|
+--------+--------------+--------------+-----+

In the example above, I need to join df2 to df1 on zip_code and get as many records as there are territory names in df1.

The expected result in df2 is:

+--------+--------------+--------------+-----+
|zip_code|zone_code     |territory_name|state|
+--------+--------------+--------------+-----+     
|    xxx1|         81A01|  TERR NAME 01|   NY|
|    xxx1|         81A01|  TERR NAME 02|   NY|
|    xxx1|         81A01|  TERR NAME 03|   NY|
+--------+--------------+--------------+-----+
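
For illustration, this is roughly the expansion I am after in DataFrame-API terms (only a sketch, assuming df2's territory_name is always null as in the sample; I specifically need a spark.sql() version):

# Drop df2's empty territory_name and pick up one row per territory in df1.
expanded = (
    df2.drop("territory_name")
       .join(df1.select("zip_code", "territory_name"), on="zip_code", how="left")
)
expanded.show()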

I need help with this; currently the join only gets me one record.

spark.sql() query sample that gets only one record:
    df1.createOrReplaceTempView('df1')
    df2.createOrReplaceTempView('df2')
    spark.sql("""select a.zip_code, a.zone_code, b.territory_name, a.state
                 from df2 a
                 left join df1 b on a.zip_code = b.zip_code
                 where a.territory_name is null""").createOrReplaceTempView('df2')

Thanks

I wanted to share the code snippet, so maybe it will be useful to someone.

df1= sqlContext.createDataFrame([("xxx1","81A01","TERR NAME 01"),("xxx1","81A01","TERR NAME 02"), ("xxx1","81A01","TERR NAME 03")], ["zip_code","zone_code","territory_name"])
df2= sqlContext.createDataFrame([("xxx1","","","NY"), ("xxx1","","TERR NAME 99","NY")], ["zip_code","zone_code","territory_name","state"])

df1.createOrReplaceTempView('df1')
df2.createOrReplaceTempView('df2')

spark.sql("select * from df1").show()
+--------+---------+--------------+ 
|zip_code|zone_code|territory_name| 
+--------+---------+--------------+ 
| xxx1   | 81A01   | TERR NAME 01 | 
| xxx1   | 81A01   | TERR NAME 02 | 
| xxx1   | 81A01   | TERR NAME 03 | 
+--------+---------+--------------+ 

spark.sql("select * from df2").show()
+--------+---------+--------------+-----+ 
|zip_code|zone_code|territory_name|state| 
+--------+---------+--------------+-----+ 
| xxx1   |         |              | NY  | 
| xxx1   |         | TERR NAME 99 | NY  | 
+--------+---------+--------------+-----+

spark.sql("""select a.zip_code, b.zone_code, b.territory_name, a.state from df2 a 
            left join df1 b 
            on a.zip_code = b.zip_code 
            where a.territory_name = ''
            UNION
            select a.zip_code, b.zone_code, a.territory_name, a.state from df2 a 
            left join df1 b 
            on a.zip_code = b.zip_code 
            where a.territory_name != ''
            """).createOrReplaceTempView('df3')


spark.sql("select * from df3").show()
+--------+---------+--------------+-----+ 
|zip_code|zone_code|territory_name|state| 
+--------+---------+--------------+-----+ 
| xxx1   | 81A01   | TERR NAME 03 | NY  | 
| xxx1   | 81A01   | TERR NAME 99 | NY  |  
| xxx1   | 81A01   | TERR NAME 01 | NY  | 
| xxx1   | 81A01   | TERR NAME 02 | NY  | 
+--------+---------+--------------+-----+
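
Since the original goal was to insert these rows into table1, the final step could look like this (a sketch only, assuming table1 already exists in the metastore with the same four columns):

# Hypothetical final step: append the expanded rows to the target table.
spark.sql("insert into table1 select zip_code, zone_code, territory_name, state from df3")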

Thanks to those who helped.
