
Extract data from a dataframe in pyspark

I am trying to write a script in pyspark that fetches the minimum and maximum dates from a table, stores them in a dataframe, splits those two values into two variables, and then uses the variables as a time range in another query. My problem is that the dates come back as a dataframe like this:

+--------+--------+
| maxDate| minDate|
+--------+--------+
|20210701|20210629|
+--------+--------+

I only want the values of maxDate and minDate.

I tried dates.iloc[0] and var1 = dates['maxDate'].values[0], but neither worked (those are pandas idioms; a Spark DataFrame has no iloc or .values).

from pyspark.sql import SparkSession
from pyspark.sql.functions import when
from pyspark.sql import functions as F
from pyspark.sql.functions import lit
from pyspark.sql.functions import trim
from datetime import datetime


current_timestamp = datetime.strftime(datetime.now(), "%Y%m%d%H%M")

spark = SparkSession.builder.appName("testing") \
.config("hive.exec.dynamic.partition", "true") \
.config("hive.exec.dynamic.partition.mode", "nonstrict") \
.config("hive.exec.compress.output=false", "false") \
.config("spark.unsafe.sorter.spill.read.ahead.enabled", "false") \
.config("spark.debug.maxToStringFields", 1000)\
.enableHiveSupport() \
.getOrCreate()

spark.sql("set max_row_size = 6mb")
dates = spark.sql("SELECT MAX(date) as maxDate, MIN(date) as minDate FROM db.table")
 
# dates must be split here into two separate vars

result = spark.sql("select * from db.table_2 where date between {} and {}".format(var1, var2))

You can do it like below:

row = dates.collect()[0]  # collect once; the result is a single Row

max_date = row["maxDate"]
min_date = row["minDate"]
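To illustrate how those values plug back into the second query, here is a minimal sketch. The collected row is simulated with a plain dict so the value extraction and query building can be shown without a running Spark session; a real pyspark.sql.Row supports the same lookup by column name. The dict values mirror the sample output in the question.

```python
# Stand-in for dates.collect()[0]; a pyspark.sql.Row also supports
# name-based access like row["maxDate"].
first_row = {"maxDate": "20210701", "minDate": "20210629"}

max_date = first_row["maxDate"]
min_date = first_row["minDate"]

# BETWEEN expects the lower bound first, so minDate goes before maxDate.
query = "select * from db.table_2 where date between {} and {}".format(
    min_date, max_date
)
print(query)
```

With a live session you would then run result = spark.sql(query).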

