
Update statement in spark-sql


Is there a way to execute an update statement on a SQL Server table from spark-sql (using Scala)?

I need to run the following query:

update  MyLog_table
set     Log_FileQueue = xx,
        Log_TotalLine = xx
where   Log_ID = xxx

I tried the following syntax:

val jdbcUrl = s"jdbc:sqlserver://${jdbcHostname}:${jdbcPort};database=${jdbcDatabase}"
val Log_FileIn = spark.read.jdbc(jdbcUrl, s"(select Log_FileIn from log where Log_ID = '${Process1Log_ID}') as sq", connectionProperties)
val newLog_FileIn = Log_FileIn.collectAsList().toString().replace("[", "").replace("]", "")

 spark.sql(s"(select '${newLog_FileIn}' as Log_FileQueue, ${NbLine} as Log_TotalLine where Log_ID = '${newLog_id}')")
  .write
  .mode(SaveMode.Append)
  .jdbc(jdbcUrl, "Log", connectionProperties)

But it produces the following error:

org.apache.spark.sql.AnalysisException: cannot resolve '`Log_ID`' given input columns: []; line 1 pos 115;
'Project [test_141001.csv AS Log_FileQueue#290, 5 AS Log_TotalLine#29

I also tried using the "where" method:

spark.sql(s"(select '${newLog_FileIn}' as Log_FileQueue, ${NbLine} as Log_TotalLine where Log_ID = '${newLog_id}')")
  .where(s"Log_ID = '${newLog_id}'")
  .write
  .mode(SaveMode.Append)
  .jdbc(jdbcUrl, "Log", connectionProperties)

But that doesn't work either; I get the following error:

org.apache.spark.sql.AnalysisException: cannot resolve '`Log_ID`' given input columns: [Log_FileQueue, Log_TotalLine]; line 1 pos 0;
'Filter ('Log_ID = 157456)
+- AnalysisBarrier
      +- Project [ANNONCE-FNAC-VIGICOLIS-GRX-BIZ-2018hfgr071eyzdtrf2_141001.csv AS Log_FileQueue#290, 5 AS Log_TotalLine#291]

Any help would be greatly appreciated.

That approach cannot work: Spark's JDBC data source can only insert rows (SaveMode.Append) or replace the table (SaveMode.Overwrite); it has no support for row-level UPDATE statements. Open a plain JDBC connection on the driver and use executeUpdate, or executeBatch for many rows.
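
A minimal sketch of that approach, reusing the jdbcUrl and connectionProperties from the question. Here newLog_FileIn, NbLine, and newLog_id stand for the values computed earlier in the driver program, and their types (String, Int, String) are assumptions; adjust them to your schema:

import java.sql.{Connection, DriverManager, PreparedStatement}

// Spark's JDBC writer can only insert rows, so the UPDATE is issued
// over a plain JDBC connection on the driver instead.
var conn: Connection = null
var stmt: PreparedStatement = null
try {
  conn = DriverManager.getConnection(jdbcUrl, connectionProperties)
  stmt = conn.prepareStatement(
    "update MyLog_table set Log_FileQueue = ?, Log_TotalLine = ? where Log_ID = ?")
  stmt.setString(1, newLog_FileIn)  // assumed String column
  stmt.setInt(2, NbLine)            // assumed Int column
  stmt.setString(3, newLog_id)      // Log_ID is quoted in the question, so String
  stmt.executeUpdate()
  // For many updates, call stmt.addBatch() once per row and finish
  // with stmt.executeBatch() to send them in a single round trip.
} finally {
  if (stmt != null) stmt.close()
  if (conn != null) conn.close()
}

Spark is only used to compute the values; the update itself runs on the driver, so the prepared statement also protects you from the string-interpolation issues in the original attempts.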
