Rolling average and sum by days over timestamp in Pyspark

I have a PySpark dataframe where the timestamps are at day granularity. Here is a sample of the dataframe (call it df):

+-----+-----+----------+-----+
| name| type| timestamp|score|
+-----+-----+----------+-----+
|name1|type1|2012-01-10|   11|
|name1|type1|2012-01-11|   14|
|name1|type1|2012-01-12|    2|
|name1|type3|2012-01-12|    3|
|name1|type3|2012-01-11|   55|
|name1|type1|2012-01-13|   10|
|name1|type2|2012-01-14|   11|
|name1|type2|2012-01-15|   14|
|name2|type2|2012-01-10|    2|
|name2|type2|2012-01-11|    3|
|name2|type2|2012-01-12|   55|
|name2|type1|2012-01-10|   10|
|name2|type1|2012-01-13|   55|
|name2|type1|2012-01-14|   10|
+-----+-----+----------+-----+

In this dataframe, I want to compute the rolling average and rolling sum of the scores for each name over a three-day window. That is, for any given day in the dataframe, take the sum of name1's scores over that day, the day before, and the day before that; do this for every day for name1, and then repeat the exercise for the other names (name2, and so on). How can I do this?

I had a look at this post and tried the following:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window

days = lambda i: i*1

w_rolling = Window.orderBy(F.col("timestamp").cast("long")).rangeBetween(-days(3), 0)
df_agg = df.withColumn("rolling_average", F.avg("score").over(w_rolling)).withColumn(
    "rolling_sum", F.sum("score").over(w_rolling)
)
df_agg.show()

+-----+-----+----------+-----+------------------+-----------+
| name| type| timestamp|score|   rolling_average|rolling_sum|
+-----+-----+----------+-----+------------------+-----------+
|name1|type1|2012-01-10|   11|18.214285714285715|        255|
|name1|type1|2012-01-11|   14|18.214285714285715|        255|
|name1|type1|2012-01-12|    2|18.214285714285715|        255|
|name1|type3|2012-01-12|    3|18.214285714285715|        255|
|name1|type3|2012-01-11|   55|18.214285714285715|        255|
|name1|type1|2012-01-13|   10|18.214285714285715|        255|
|name1|type2|2012-01-14|   11|18.214285714285715|        255|
|name1|type2|2012-01-15|   14|18.214285714285715|        255|
|name2|type2|2012-01-10|    2|18.214285714285715|        255|
|name2|type2|2012-01-11|    3|18.214285714285715|        255|
|name2|type2|2012-01-12|   55|18.214285714285715|        255|
|name2|type1|2012-01-10|   10|18.214285714285715|        255|
|name2|type1|2012-01-13|   55|18.214285714285715|        255|
|name2|type1|2012-01-14|   10|18.214285714285715|        255|
+-----+-----+----------+-----+------------------+-----------+

As you can see, I always get the same rolling average and rolling sum, which is nothing but the average and sum of the score column over all days. This is not what I want.
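The likely culprit, for what it's worth: timestamp here is a plain string, and casting a 'yyyy-MM-dd' string to long yields null in Spark, so every row is ordered by null and ends up in a single frame; the window is also never partitioned by name. A minimal check, run against the df built below:

from pyspark.sql import functions as F

# Casting the date string straight to long gives null for every row,
# so rangeBetween has no numeric range to work with.
df.select(
    F.col("timestamp"),
    F.col("timestamp").cast("long").alias("as_long"),              # null
    F.unix_timestamp("timestamp", "yyyy-MM-dd").alias("as_unix"),  # seconds since epoch
).show(3)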

You can create the above dataframe using the following code snippet:


from pyspark.sql import Row, SparkSession

spark = SparkSession.builder.getOrCreate()

df_Stats = Row("name", "type", "timestamp", "score")

df_stat_lst = [
    df_Stats("name1", "type1", "2012-01-10", 11),
    df_Stats("name1", "type1", "2012-01-11", 14),
    df_Stats("name1", "type1", "2012-01-12", 2),
    df_Stats("name1", "type3", "2012-01-12", 3),
    df_Stats("name1", "type3", "2012-01-11", 55),
    df_Stats("name1", "type1", "2012-01-13", 10),
    df_Stats("name1", "type2", "2012-01-14", 11),
    df_Stats("name1", "type2", "2012-01-15", 14),
    df_Stats("name2", "type2", "2012-01-10", 2),
    df_Stats("name2", "type2", "2012-01-11", 3),
    df_Stats("name2", "type2", "2012-01-12", 55),
    df_Stats("name2", "type1", "2012-01-10", 10),
    df_Stats("name2", "type1", "2012-01-13", 55),
    df_Stats("name2", "type1", "2012-01-14", 10),
]

df = spark.createDataFrame(df_stat_lst)

You can use the code below to calculate the sum and average of the scores over the last 3 days (including the current day):

# Considering the dataframe already created using the code provided in the question
from pyspark.sql import functions as F
from pyspark.sql.window import Window

# Convert the date string to seconds since the epoch so that
# rangeBetween can work over a numeric range.
df = df.withColumn('unix_time', F.unix_timestamp('timestamp', 'yyyy-MM-dd'))

# One window per name, ordered by time; -2*86400 seconds reaches back
# two days, so each frame covers the current day plus the two days before it.
winSpec = Window.partitionBy('name').orderBy('unix_time').rangeBetween(-2*86400, 0)

df = df.withColumn('rolling_sum', F.sum('score').over(winSpec))
df = df.withColumn('rolling_avg', F.avg('score').over(winSpec))

df.orderBy('name', 'timestamp').show(20, False)

+-----+-----+----------+-----+----------+-----------+------------------+
|name |type |timestamp |score|unix_time |rolling_sum|rolling_avg       |
+-----+-----+----------+-----+----------+-----------+------------------+
|name1|type1|2012-01-10|11   |1326153600|11         |11.0              |
|name1|type3|2012-01-11|55   |1326240000|80         |26.666666666666668|
|name1|type1|2012-01-11|14   |1326240000|80         |26.666666666666668|
|name1|type1|2012-01-12|2    |1326326400|85         |17.0              |
|name1|type3|2012-01-12|3    |1326326400|85         |17.0              |
|name1|type1|2012-01-13|10   |1326412800|84         |16.8              |
|name1|type2|2012-01-14|11   |1326499200|26         |6.5               |
|name1|type2|2012-01-15|14   |1326585600|35         |11.666666666666666|
|name2|type1|2012-01-10|10   |1326153600|12         |6.0               |
|name2|type2|2012-01-10|2    |1326153600|12         |6.0               |
+-----+-----+----------+-----+----------+-----------+------------------+
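As a side note, if the window length should be configurable, the days helper from the question can be fixed to convert days to seconds instead of leaving the offset in days. Below is a minimal sketch along the lines of the answer above; the rolling_over_days helper name and its defaults are just for illustration:

from pyspark.sql import functions as F
from pyspark.sql.window import Window

def rolling_over_days(df, n_days, value_col="score", partition_col="name"):
    # An n-day window is the current day plus the (n - 1) days before it,
    # expressed in seconds for rangeBetween.
    seconds_back = (n_days - 1) * 86400
    w = (
        Window.partitionBy(partition_col)
        .orderBy(F.unix_timestamp("timestamp", "yyyy-MM-dd"))
        .rangeBetween(-seconds_back, 0)
    )
    return (
        df.withColumn("rolling_sum", F.sum(value_col).over(w))
          .withColumn("rolling_avg", F.avg(value_col).over(w))
    )

df_agg = rolling_over_days(df, n_days=3)
df_agg.orderBy("name", "timestamp").show(truncate=False)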
