
Calculate days from last deal date: a time-series implementation ported from pandas to PySpark using window functions

Code in pandas:

# Calculate days since last deal for customer / master customer
import pandas as pd

df['booked_date_day'] = pd.to_datetime(df['booked_date_day'])

# Diff over de-duplicated (customer, date) pairs; index alignment leaves NaT
# on the duplicate rows, which the grouped forward-fill below picks up
df['customer_whole_days_from_last_deal'] = df[['customer_nbr', 'booked_date_day']].sort_values(['customer_nbr', 'booked_date_day']).drop_duplicates().groupby('customer_nbr')['booked_date_day'].diff()
df['customer_whole_days_from_last_deal'] = df.sort_values(['customer_nbr', 'booked_date_day']).groupby('customer_nbr')['customer_whole_days_from_last_deal'].ffill()
df['customer_whole_days_from_last_deal'] = df['customer_whole_days_from_last_deal'].dt.days

# Same computation at the master-customer level
df['master_customer_whole_days_from_last_deal'] = df[['master_cust', 'booked_date_day']].sort_values(['master_cust', 'booked_date_day']).drop_duplicates().groupby('master_cust')['booked_date_day'].diff()
df['master_customer_whole_days_from_last_deal'] = df.sort_values(['master_cust', 'booked_date_day']).groupby('master_cust')['master_customer_whole_days_from_last_deal'].ffill()
df['master_customer_whole_days_from_last_deal'] = df['master_customer_whole_days_from_last_deal'].dt.days
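The pandas chain can be checked on a tiny made-up frame (the customer numbers, dates, and the shortened column name `days_from_last_deal` are all hypothetical, chosen only to show the duplicate-date case):

```python
import pandas as pd

# Toy data (made-up): customer 1 books two deals on the same day (2020-01-05)
df = pd.DataFrame({
    'customer_nbr': [1, 1, 1, 2],
    'booked_date_day': ['2020-01-01', '2020-01-05', '2020-01-05', '2020-01-03'],
})
df['booked_date_day'] = pd.to_datetime(df['booked_date_day'])

# Step 1: diff over de-duplicated (customer, date) pairs -- index alignment
# leaves NaT on the duplicate row
df['days_from_last_deal'] = (
    df[['customer_nbr', 'booked_date_day']]
    .sort_values(['customer_nbr', 'booked_date_day'])
    .drop_duplicates()
    .groupby('customer_nbr')['booked_date_day']
    .diff()
)

# Step 2: forward-fill the duplicate row within each customer
df['days_from_last_deal'] = (
    df.sort_values(['customer_nbr', 'booked_date_day'])
    .groupby('customer_nbr')['days_from_last_deal']
    .ffill()
)

# Step 3: convert the timedelta to whole days
df['days_from_last_deal'] = df['days_from_last_deal'].dt.days
print(df['days_from_last_deal'].tolist())  # [nan, 4.0, 4.0, nan]
```

Note that the duplicate 2020-01-05 row inherits the 4-day gap via the forward-fill rather than getting a 0-day gap.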

The code I developed in PySpark (the pandas DataFrame above is df; the Spark DataFrame below is orders):

from pyspark.sql import Window
from pyspark.sql import functions as F
from pyspark.sql.functions import last

# Gap (in days) to the previous deal, per customer
window1 = Window.partitionBy(orders.customer_nbr).orderBy(orders.booked_date_day)
orders = orders.withColumn("customer_whole_days_from_last_deal", F.datediff(orders.booked_date_day, F.lag(orders.booked_date_day, 1).over(window1)))

# Same gap, per master customer
window2 = Window.partitionBy(orders.master_cust).orderBy(orders.booked_date_day)
orders = orders.withColumn("master_whole_days_from_last_deal", F.datediff(orders.booked_date_day, F.lag(orders.booked_date_day, 1).over(window2)))

# Forward-fill the nulls left by lag(); Window.unboundedPreceding is the
# idiomatic lower bound (the original -sys.maxsize also works)
window_ff1 = Window.partitionBy(orders.customer_nbr).orderBy(orders.booked_date_day).rowsBetween(Window.unboundedPreceding, 0)
filled_column1 = last(orders['customer_whole_days_from_last_deal'], ignorenulls=True).over(window_ff1)
orders = orders.withColumn('customer_whole_days_from_last_deal', filled_column1)

window_ff2 = Window.partitionBy(orders.master_cust).orderBy(orders.booked_date_day).rowsBetween(Window.unboundedPreceding, 0)
filled_column2 = last(orders['master_whole_days_from_last_deal'], ignorenulls=True).over(window_ff2)
orders = orders.withColumn('master_whole_days_from_last_deal', filled_column2)

I am trying to calculate the days since the last deal for each customer / master customer. Please let me know what I am doing wrong in my PySpark code: for the same input I get one row (value) in pandas but more rows in PySpark.
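The discrepancy comes from the duplicate rows: the pandas version computes each gap once per distinct (customer, date) pair via drop_duplicates(), while the PySpark lag window runs over every row, so a repeated date produces a 0-day gap instead of inheriting the real gap through the forward-fill. A small pure-pandas emulation of the two strategies (toy data, single customer, illustrative only):

```python
import pandas as pd

deals = pd.DataFrame({
    'customer_nbr': [1, 1, 1],
    'booked_date_day': pd.to_datetime(['2020-01-01', '2020-01-05', '2020-01-05']),
}).sort_values(['customer_nbr', 'booked_date_day'])

# What the PySpark lag window does: diff over *every* row
per_row = deals.groupby('customer_nbr')['booked_date_day'].diff().dt.days
print(per_row.tolist())  # [nan, 4.0, 0.0]  <- duplicate date yields a 0-day gap

# What the pandas version does: diff over de-duplicated pairs, then forward-fill
# (only one customer here, so a plain ffill suffices for the illustration)
dedup = deals.drop_duplicates()
gap = dedup.groupby('customer_nbr')['booked_date_day'].diff().dt.days
filled = gap.reindex(deals.index).ffill()
print(filled.tolist())   # [nan, 4.0, 4.0]  <- duplicate inherits the real gap
```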

The fix: compute the day gaps on de-duplicated (key, date) pairs first, then join the results back to orders:

from pyspark.sql import Window
from pyspark.sql import functions as F
from pyspark.sql.functions import last

# Per-customer gaps, computed on distinct (customer_nbr, booked_date_day)
# pairs -- this mirrors pandas' drop_duplicates()
df1 = orders.select('customer_nbr', 'booked_date_day').dropDuplicates()
window1 = Window.partitionBy(df1.customer_nbr).orderBy(df1.booked_date_day)
df1 = df1.withColumn("customer_whole_days_from_last_deal", F.datediff(df1.booked_date_day, F.lag(df1.booked_date_day, 1).over(window1)))
window_ff1 = Window.partitionBy(df1.customer_nbr).orderBy(df1.booked_date_day).rowsBetween(Window.unboundedPreceding, 0)
filled_column1 = last(df1['customer_whole_days_from_last_deal'], ignorenulls=True).over(window_ff1)
df1 = df1.withColumn('customer_whole_days_from_last_deal', filled_column1)

# Per-master-customer gaps, same pattern
df2 = orders.select('master_cust', 'booked_date_day').dropDuplicates()
window2 = Window.partitionBy(df2.master_cust).orderBy(df2.booked_date_day)
df2 = df2.withColumn("master_whole_days_from_last_deal", F.datediff(df2.booked_date_day, F.lag(df2.booked_date_day, 1).over(window2)))
window_ff2 = Window.partitionBy(df2.master_cust).orderBy(df2.booked_date_day).rowsBetween(Window.unboundedPreceding, 0)
filled_column2 = last(df2['master_whole_days_from_last_deal'], ignorenulls=True).over(window_ff2)
df2 = df2.withColumn('master_whole_days_from_last_deal', filled_column2)

Then, using a broadcast join, I joined df1 and df2 back to orders on the corresponding key and date columns, and this fixed my issue.
