
java.lang.AbstractMethodError: org.apache.phoenix.spark.DefaultSource.createRelation using phoenix in pyspark

I am trying to write a Spark dataframe to HBase using phoenix and I see the following error. Any idea what is going on here?
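For context, a write through the Phoenix Spark connector along these lines is what produces the error below (a minimal sketch; the DataFrame, table name, and ZooKeeper URL are placeholders, not from the original post):

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("phoenix-write").getOrCreate()
    # Placeholder DataFrame standing in for the application's real data.
    df = spark.createDataFrame([(1, "a")], ["ID", "COL1"])

    # The Phoenix connector requires overwrite mode for saves.
    # "MY_TABLE" and the ZooKeeper quorum are placeholder values.
    df.write \
        .format("org.apache.phoenix.spark") \
        .mode("overwrite") \
        .option("table", "MY_TABLE") \
        .option("zkUrl", "zkhost:2181") \
        .save()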

    An error occurred while calling o102.save.
    : java.lang.AbstractMethodError: org.apache.phoenix.spark.DefaultSource.createRelation(Lorg/apache/spark/sql/SQLContext;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/Dataset;)Lorg/apache/spark/sql/sources/BaseRelation;
        at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:471)
        at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:50)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
        at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:117)
        at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:138)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
        at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:135)
        at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:116)
        at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:92)
        at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:92)
        at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:609)
        at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:233)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
        at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
        at py4j.Gateway.invoke(Gateway.java:280)
        at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
        at py4j.commands.CallCommand.execute(CallCommand.java:79)
        at py4j.GatewayConnection.run(GatewayConnection.java:214)
        at java.lang.Thread.run(Thread.java:748)

    Traceback (most recent call last):
      File "/grid/1/hadoop/yarn/local/usercache/sifsuser/appcache/application_1569479196412_0065/container_e06_1569479196412_0065_01_000001/pyspark.zip/pyspark/sql/readwriter.py", line 593, in save
        self._jwrite.save()
      File "/grid/1/hadoop/yarn/local/usercache/sifsuser/appcache/application_1569479196412_0065/container_e06_1569479196412_0065_01_000001/py4j-0.10.4-src.zip/py4j/java_gateway.py", line 1133, in __call__
        answer, self.gateway_client, self.target_id, self.name)
      File "/grid/1/hadoop/yarn/local/usercache/sifsuser/appcache/application_1569479196412_0065/container_e06_1569479196412_0065_01_000001/pyspark.zip/pyspark/sql/utils.py", line 63, in deco
        return f(*a, **kw)
      File "/grid/1/hadoop/yarn/local/usercache/sifsuser/appcache/application_1569479196412_0065/container_e06_1569479196412_0065_01_000001/py4j-0.10.4-src.zip/py4j/protocol.py", line 319, in get_return_value
        format(target_id, ".", name), value)
    py4j.protocol.Py4JJavaError: An error occurred while calling o102.save.
    : java.lang.AbstractMethodError: org.apache.phoenix.spark.DefaultSource.createRelation(Lorg/apache/spark/sql/SQLContext;Lorg/apache/spark/sql/SaveMode;Lscala/collection/immutable/Map;Lorg/apache/spark/sql/Dataset;)Lorg/apache/spark/sql/sources/BaseRelation;
    (same stack trace as above)

The question is quite old, but I faced the same issue, so maybe it's worth answering.

In my case the issue was that my spark-submit configuration was missing entries for

spark.driver.extraClassPath

and

spark.executor.extraClassPath

When I add

    --conf "spark.driver.extraClassPath=phoenix-spark2.jar:phoenix-client.jar:/etc/hbase/conf"
    --conf "spark.executor.extraClassPath=phoenix-spark2.jar:phoenix-client.jar:/etc/hbase/conf"

it works for me (versions: Phoenix 4.7, HBase 1.1 and Hortonworks HDP 2.6.5). The AbstractMethodError generally means the DefaultSource class that Spark found was compiled against a different version of the DataSource API than the running Spark, so putting the Spark 2 connector jar (phoenix-spark2.jar) on both the driver and executor classpath resolves the mismatch.
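If the classpath fix alone does not settle it, one way to check what actually got loaded is to ask the driver JVM where it found the Phoenix DefaultSource class. This is a rough diagnostic sketch that relies on PySpark's internal _jvm gateway handle, so treat it as best-effort rather than a supported API:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()
    jvm = spark.sparkContext._jvm  # internal py4j gateway to the driver JVM
    cls = jvm.java.lang.Class.forName("org.apache.phoenix.spark.DefaultSource")
    # The code source location is the jar the class was loaded from; if it
    # points at a phoenix-spark jar built for Spark 1.x rather than
    # phoenix-spark2.jar, that mismatch explains the AbstractMethodError.
    print(cls.getProtectionDomain().getCodeSource().getLocation())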


