
Update statement in spark-sql

Is there a way to execute an update statement on a SQL Server table using spark-sql (with Scala)?

I need to execute the following query:

update  MyLog_table
set     Log_FileQueue = xx,
        Log_TotalLine = xx
where   Log_ID = xxx

I tried the following syntax:

val jdbcUrl = s"jdbc:sqlserver://${jdbcHostname}:${jdbcPort};database=${jdbcDatabase}"
val Log_FileIn = spark.read.jdbc(jdbcUrl, s"(select Log_FileIn from log where Log_ID = '${Process1Log_ID}') as sq", connectionProperties)
val newLog_FileIn = Log_FileIn.collectAsList().toString().replace("[", "").replace("]", "")

spark.sql(s"(select '${newLog_FileIn}' as Log_FileQueue, ${NbLine} as Log_TotalLine where Log_ID = '${newLog_id}')")
  .write
  .mode(SaveMode.Append)
  .jdbc(jdbcUrl, "Log", connectionProperties)

But it produces the following error:

org.apache.spark.sql.AnalysisException: cannot resolve '`Log_ID`' given input columns: []; line 1 pos 115;
'Project [test_141001.csv AS Log_FileQueue#290, 5 AS Log_TotalLine#29

I also tried using the `where` method:

spark.sql(s"(select '${newLog_FileIn}' as Log_FileQueue, ${NbLine} as Log_TotalLine where Log_ID = '${newLog_id}')")
  .where(s"Log_ID = '${newLog_id}'")
  .write
  .mode(SaveMode.Append)
  .jdbc(jdbcUrl, "Log", connectionProperties)

But that does not work either. I get the following error:

org.apache.spark.sql.AnalysisException: cannot resolve '`Log_ID`' given input columns: [Log_FileQueue, Log_TotalLine]; line 1 pos 0;
'Filter ('Log_ID = 157456)
+- AnalysisBarrier
      +- Project [ANNONCE-FNAC-VIGICOLIS-GRX-BIZ-2018hfgr071eyzdtrf2_141001.csv AS Log_FileQueue#290, 5 AS Log_TotalLine#291]

Any help would be appreciated.

This approach won't work: Spark's JDBC `DataFrameWriter` can only insert or overwrite rows, it cannot run an `UPDATE` against an existing table (`SaveMode.Append` just inserts new rows, which is why your query fails). Run the update through plain JDBC instead, e.g. with `executeBatch`.
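A minimal sketch of the JDBC route, using a `PreparedStatement` with `addBatch`/`executeBatch`. The parameter names (`newLog_FileIn`, `nbLine`, `logId`) mirror the variables in the question; the connection URL and credentials in the usage comment are placeholders you must fill in:

```scala
import java.sql.{Connection, DriverManager}

// Parameterized UPDATE; '?' placeholders avoid string interpolation
// and the SQL-injection risk that comes with it.
val updateSql =
  "update MyLog_table set Log_FileQueue = ?, Log_TotalLine = ? where Log_ID = ?"

// Runs one update through JDBC. For many rows, call addBatch() in a loop
// before a single executeBatch().
def updateLog(conn: Connection, newLog_FileIn: String, nbLine: Int, logId: String): Unit = {
  val stmt = conn.prepareStatement(updateSql)
  try {
    stmt.setString(1, newLog_FileIn) // Log_FileQueue
    stmt.setInt(2, nbLine)           // Log_TotalLine
    stmt.setString(3, logId)         // Log_ID
    stmt.addBatch()                  // queue this update
    stmt.executeBatch()              // send all queued updates in one round trip
  } finally {
    stmt.close()
  }
}

// Usage (placeholders for your actual host/credentials):
// val conn = DriverManager.getConnection(
//   s"jdbc:sqlserver://${jdbcHostname}:${jdbcPort};database=${jdbcDatabase}",
//   connectionProperties)
// try updateLog(conn, newLog_FileIn, NbLine, newLog_id) finally conn.close()
```

You can still use Spark to *read* the values (as you already do), then hand them to this plain-JDBC update; the two coexist fine in the same job.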

