How to use a function in a filter condition in PySpark
df = basedf.select("date", "year", "week", "day").distinct()\
    .orderBy("date")\
    .withColumn("ISOWeek", concat("year", "week"))\
    .filter("date" <= min_date())  # failing line: compares a plain string to a DataFrame

def min_date():
    min_df = basedf.select("date")\
        .orderBy("date")\
        .agg(min("date").alias("date_min"))\
        .select(date_add(col("date_min"), SelectedWeeks * 7).alias("end_date"))
    return min_df

# SelectedWeeks = 10 (number of weeks selected by the user)
Both parts of the code work on their own, but I want to pass the computed end date into the filter condition.
from pyspark.sql.functions import col, concat, date_add, min

SelectedWeeks = 10  # number of weeks selected by the user

def min_date():
    min_df = basedf.select("date")\
        .orderBy("date")\
        .agg(min("date").alias("date_min"))\
        .select(date_add(col("date_min"), SelectedWeeks * 7))
    # collect() brings the single-row result back to the driver;
    # [0][0] extracts the computed end date as a plain Python value
    return min_df.collect()[0][0]

end_date = min_date()

df = basedf.select("date", "year", "week", "day").distinct()\
    .orderBy("date")\
    .withColumn("ISOWeek", concat("year", "week"))\
    .filter(col("date") < end_date)
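The key idea is that the aggregation must be collected to a driver-side scalar before it can be used in a filter expression. The same logic can be sketched in plain Python (a minimal illustration with hypothetical sample dates, using `datetime` instead of Spark):

```python
from datetime import date, timedelta

def compute_end_date(dates, selected_weeks):
    # Mirrors min_date(): earliest date plus selected_weeks * 7 days
    return min(dates) + timedelta(days=selected_weeks * 7)

# Hypothetical sample data standing in for basedf's "date" column
dates = [date(2023, 1, 2), date(2023, 1, 9), date(2023, 3, 6)]

end_date = compute_end_date(dates, 10)  # SelectedWeeks = 10
filtered = [d for d in dates if d < end_date]  # mirrors .filter(col("date") < end_date)
```

Here `end_date` is an ordinary value, so the comparison in the filter is well-defined; in Spark, `collect()[0][0]` plays the same role of turning the one-row aggregate into such a value.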