
NoSuchMethodError: org.apache.spark.sql.catalyst.plans.logical.DeleteFromTable in IntelliJ


I am trying to delete a record from a Delta table with the .delete() method, as follows:

val my_dt: DeltaTable = DeltaTable.forPath(ss, my_delta_path)
my_dt.delete("pk= '123456'")

When I run my code in IntelliJ, I get the following exception:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.sql.catalyst.plans.logical.DeleteFromTable.<init>(Lorg/apache/spark/sql/catalyst/plans/logical/LogicalPlan;Lorg/apache/spark/sql/catalyst/expressions/Expression;)V
    at io.delta.tables.execution.DeltaTableOperations.$anonfun$executeDelete$1(DeltaTableOperations.scala:44)
    at org.apache.spark.sql.delta.util.AnalysisHelper.improveUnsupportedOpError(AnalysisHelper.scala:104)
    at org.apache.spark.sql.delta.util.AnalysisHelper.improveUnsupportedOpError$(AnalysisHelper.scala:90)
    at io.delta.tables.DeltaTable.improveUnsupportedOpError(DeltaTable.scala:42)
    at io.delta.tables.execution.DeltaTableOperations.executeDelete(DeltaTableOperations.scala:41)
    at io.delta.tables.execution.DeltaTableOperations.executeDelete$(DeltaTableOperations.scala:41)
    at io.delta.tables.DeltaTable.executeDelete(DeltaTable.scala:42)
    at io.delta.tables.DeltaTable.delete(DeltaTable.scala:183)
    at io.delta.tables.DeltaTable.delete(DeltaTable.scala:172)

I have read that the problem is that I need to pass the following two parameters:

--conf "spark.sql.extensions=io.delta.sql.DeltaSparkSessionExtension" --conf "spark.sql.catalog.spark_catalog=org.apache.spark.sql.delta.catalog.DeltaCatalog"

So I tried passing them in a few different ways, without success.

First, when building the Spark session:

val ss = SparkSession.builder
      .master("local[*]")
      .config("spark.master", "local")
      .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
      .config("spark.sql.catalog.spark_catalog", "org.apache.spark.sql.delta.catalog.DeltaCatalog")
      .appName(appName)
      .getOrCreate()

And also as program arguments in IntelliJ itself:

[screenshot of the IntelliJ run configuration with the program arguments]

Also, in the pom I added the following dependencies:

    <dependency>
        <groupId>io.delta</groupId>
        <artifactId>delta-core_2.12</artifactId>
        <version>2.1.0</version>
    </dependency>

    <dependency>
        <groupId>io.delta</groupId>
        <artifactId>delta-storage</artifactId>
        <version>2.1.0</version>
    </dependency>

But the exception persists. What am I missing?

As additional information, I can write and read Delta tables on my local system without any problem:

my_df.write
    .format("delta")
    .option("overwriteSchema", "true")
    .mode("overwrite")
    .save(my_path)

val my_dt : DeltaTable = DeltaTable.forPath(ss, my_path)

For Spark 3.2.x you need to use Delta 2.0.x; to use Delta 2.1.x you need to upgrade to Spark 3.3.x. Check the releases page for the Spark/Delta compatibility matrix.
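
In practice that means the Spark and Delta artifacts in the pom have to come from the same compatibility line. A minimal sketch of the two consistent pairings, assuming spark-sql_2.12 is the Spark dependency in use (the exact patch versions are illustrative, and delta-storage should stay at the same version as delta-core):

    <!-- Option 1 (sketch): keep Delta 2.1.0 and move Spark up to the 3.3.x line -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.12</artifactId>
        <version>3.3.0</version>
    </dependency>
    <dependency>
        <groupId>io.delta</groupId>
        <artifactId>delta-core_2.12</artifactId>
        <version>2.1.0</version>
    </dependency>

    <!-- Option 2 (sketch): stay on the Spark 3.2.x line and move Delta down to 2.0.x -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.12</artifactId>
        <version>3.2.2</version>
    </dependency>
    <dependency>
        <groupId>io.delta</groupId>
        <artifactId>delta-core_2.12</artifactId>
        <version>2.0.2</version>
    </dependency>

Either combination makes the error go away, because the DeleteFromTable constructor that delta-core 2.1.0 calls (taking a plain Expression, as shown in the stack trace) only exists in Spark 3.3.x.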
