NoSuchMethodError: com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer
sc.newAPIHadoopRDD is continuously giving me the error below.
val hBaseRDD = sc.newAPIHadoopRDD(
  hbase_conf,
  classOf[TableInputFormat],
  classOf[org.apache.hadoop.hbase.io.ImmutableBytesWritable],
  classOf[org.apache.hadoop.hbase.client.Result])
java.lang.NoSuchMethodError: com.fasterxml.jackson.module.scala.deser.BigDecimalDeserializer$.handledType()Ljava/lang/Class;
at com.fasterxml.jackson.module.scala.deser.NumberDeserializers$.<init>(ScalaNumberDeserializersModule.scala:49)
at com.fasterxml.jackson.module.scala.deser.NumberDeserializers$.<clinit>(ScalaNumberDeserializersModule.scala)
at com.fasterxml.jackson.module.scala.deser.ScalaNumberDeserializersModule$class.$init$(ScalaNumberDeserializersModule.scala:61)
at com.fasterxml.jackson.module.scala.DefaultScalaModule.<init>(DefaultScalaModule.scala:20)
at com.fasterxml.jackson.module.scala.DefaultScalaModule$.<init>(DefaultScalaModule.scala:37)
at com.fasterxml.jackson.module.scala.DefaultScalaModule$.<clinit>(DefaultScalaModule.scala)
at org.apache.spark.rdd.RDDOperationScope$.<init>(RDDOperationScope.scala:82)
at org.apache.spark.rdd.RDDOperationScope$.<clinit>(RDDOperationScope.scala)
at org.apache.spark.SparkContext.withScope(SparkContext.scala:701)
at org.apache.spark.SparkContext.newAPIHadoopRDD(SparkContext.scala:1132)
I am trying to fetch values from HBase. It works perfectly fine on my local system, and I have already gone through many other answers related to this topic, but nothing has helped me out yet.
But whenever I try to run it on my cluster, it gives me the error mentioned above.
I have already done all these imports:
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.mapreduce.TableInputFormat
import org.apache.hadoop.hbase.client.HBaseAdmin
import org.apache.hadoop.hbase.{HTableDescriptor, HColumnDescriptor}
import org.apache.hadoop.hbase.util.Bytes
import org.apache.hadoop.hbase.client.{Put, HTable, Result}
import org.apache.hadoop.hbase.io.ImmutableBytesWritable
import org.apache.hadoop.fs.Path
import org.apache.hadoop.hbase._
import org.apache.hadoop.hbase.client._
import org.apache.hadoop.hbase.util._
import org.apache.spark._
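For context, hbase_conf is built the standard way before the call above; a minimal sketch, where the table name "my_table" is a placeholder for the real one:

val hbase_conf = HBaseConfiguration.create()
// INPUT_TABLE tells TableInputFormat which HBase table to scan.
hbase_conf.set(TableInputFormat.INPUT_TABLE, "my_table")

Per the stack trace, the call dies inside SparkContext.withScope while initializing Jackson's DefaultScalaModule, so it fails before any HBase I/O even starts.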
I followed all the installation steps, as well as the classpath setup, from
https://acadgild.com/blog/apache-spark-hbase/
Please help me.
I figured out my problem after searching and exploring other JARs.
My Hadoop version: 2.7.3
My HBase version: 1.4.2
The libraries I was using were version 1.4.2 as well, but I was putting them on the classpath wholesale with
--driver-class-path $HBASE_HOME
as mentioned in the link I referred to. But the issue was that this gave me JAR incompatibilities, as well as multiple occurrences of JARs with the same name. Actually, the only dependencies needed to run HBase successfully are:
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>1.3.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase</artifactId>
    <version>1.3.1</version>
    <type>pom</type>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-server</artifactId>
    <version>1.3.1</version>
</dependency>
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-common</artifactId>
    <version>1.3.1</version>
</dependency>
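For anyone on sbt rather than Maven, the equivalent coordinates should look roughly like this (a sketch; pomOnly() marks the pom-typed artifact):

libraryDependencies ++= Seq(
  "org.apache.hbase" % "hbase-client" % "1.3.1",
  "org.apache.hbase" % "hbase"        % "1.3.1" pomOnly(),
  "org.apache.hbase" % "hbase-server" % "1.3.1",
  "org.apache.hbase" % "hbase-common" % "1.3.1"
)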
The updated HBase libraries don't have the classes my code required. After using these versions of the libraries, it worked perfectly fine for me.
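Once the dependencies are straightened out, actually fetching the values from the RDD looks roughly like this; a minimal sketch, assuming column family "cf" and qualifier "col1" as placeholders for the real schema:

import org.apache.hadoop.hbase.util.Bytes

// Map each (rowkey, Result) pair to readable strings.
val values = hBaseRDD.map { case (_, result) =>
  val rowKey = Bytes.toString(result.getRow)
  val value  = Bytes.toString(result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("col1")))
  (rowKey, value)
}
values.take(10).foreach(println)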