
Cannot pass variables to a Spark SQL query in pyspark

I have a Python variable of the date data type (I'm using pyspark); the variable value is 2016-10-31:

print type(load_dt)

 >> <type 'datetime.date'>

I'm having trouble passing it to a Spark SQL query:

    hive_context.sql("select * from  tbl t1 where cast (substring(t1.dt,1,10) as date) ={0}".format(load_dt));

    Error:

    u"cannot resolve '(cast(substring(dt,1,10) as date) = ((2016 - 10) - 31))' due to data type mismatch: differing types in '(cast(substring(period_dt,1,10) as date) = ((2016 - 10) - 31))' (date and int)

Add quotes:

"select * from  tbl t1 where cast (substring(t1.dt,1,10) as date) = '{0}'"

Otherwise, the date is converted to the string 2016-10-31 and interpreted as an arithmetic expression:

2016 - 10 - 31 
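To illustrate why the quotes matter, here is a minimal sketch of the two interpolations (assuming a table `tbl` with a string column `dt`, as in the question). `str()` of a `datetime.date` produces `YYYY-MM-DD`, so without surrounding quotes Spark parses `2016-10-31` as integer subtraction, producing the `(date and int)` mismatch above.

```python
from datetime import date

load_dt = date(2016, 10, 31)

# Unquoted: the predicate becomes ... = 2016-10-31, which Spark evaluates
# as the arithmetic expression (2016 - 10) - 31, an int -> type mismatch.
bad_query = ("select * from tbl t1 "
             "where cast(substring(t1.dt,1,10) as date) = {0}".format(load_dt))

# Quoted: the predicate becomes ... = '2016-10-31', a string literal that
# Spark implicitly casts to DATE for the comparison.
good_query = ("select * from tbl t1 "
              "where cast(substring(t1.dt,1,10) as date) = '{0}'".format(load_dt))

print(good_query)
```

The corrected query string can then be passed to `hive_context.sql(good_query)` as in the original snippet.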

