HBase 0.96 with Spark v1.0+

This combination of HBase and Spark versions seems to be quite toxic. I have spent many hours trying to find a workable MergeStrategy, but to no avail.
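
For reference, the kind of sbt-assembly merge strategy being tried might look like the sketch below. This is illustrative only: it assumes the sbt-assembly plugin (recent versions use the assembly / assemblyMergeStrategy key; older ones used mergeStrategy in assembly), and the match patterns shown are not from the original build.

// Illustrative sbt-assembly merge strategy (assumed, not from the question).
// Resolves duplicate files when building a fat jar with the assembly task.
assembly / assemblyMergeStrategy := {
  case PathList("javax", "servlet", xs @ _*) => MergeStrategy.first
  case PathList("META-INF", xs @ _*)         => MergeStrategy.discard
  case x =>
    // Fall back to the plugin's default strategy for everything else.
    val oldStrategy = (assembly / assemblyMergeStrategy).value
    oldStrategy(x)
}

Note that a merge strategy only affects the jar produced by the assembly task; the exception below is thrown under sbt run, where both the signed and unsigned servlet jars are still on the classpath, which is one reason no MergeStrategy helps here.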

Here is the core of the current build.sbt:

val sparkVersion = "1.0.0"
// val sparkVersion = "1.1.0-SNAPSHOT"

val hbaseVersion = "0.96.1.1-cdh5.0.2"

libraryDependencies ++= Seq(
  "org.apache.hbase" % "hbase-client" % hbaseVersion,
  "org.apache.hbase" % "hbase-common" % hbaseVersion,
  "org.apache.hbase" % "hbase-server" % hbaseVersion,
  "org.apache.hbase" % "hbase-protocol" % hbaseVersion,
  "org.apache.hbase" % "hbase-examples" % hbaseVersion,
  ("org.apache.spark" % "spark-core_2.10" % sparkVersion withSources()).excludeAll(ExclusionRule("org.mortbay.jetty")),
  "org.apache.spark" % "spark-sql_2.10" % sparkVersion withSources()
)

Here is the error message that inevitably results:

14/06/27 19:49:24 INFO HttpServer: Starting HTTP Server
[error] (run-main-0) java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
java.lang.SecurityException: class "javax.servlet.FilterRegistration"'s signer information does not match signer information of other classes in the same package
        at java.lang.ClassLoader.checkCerts(ClassLoader.java:952)
        at java.lang.ClassLoader.preDefineClass(ClassLoader.java:666)
        at java.lang.ClassLoader.defineClass(ClassLoader.java:794)
        at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
        at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
        at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
        at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:136)
        at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129)
        at org.eclipse.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:98)
        at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:98)
        at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:89)
        at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:65)
        at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:58)
        at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:58)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
        at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:58)
        at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:66)
        at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:60)
        at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:42)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:222)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:117)
        at com.huawei.swlab.sparkpoc.hbase.HBasePop$.main(HBasePop.scala:31)
        at com.huawei.swlab.sparkpoc.hbase.HBasePop.main(HBasePop.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
[trace] Stack trace suppressed: run last *:runMain for the full output.
14/06/27 19:49:44 INFO ConnectionManager: Selector thread was interrupted!
java.lang.RuntimeException: Nonzero exit code: 1

My Spark/HBase application hit exactly the same exception. (The SecurityException means that signed and unsigned copies of the javax.servlet classes are colliding in the same package, so the offending jar has to be kept off the classpath rather than merged.) I fixed it by moving the org.mortbay.jetty exclusion rule to my hbase-server dependency:

libraryDependencies += "org.apache.hbase" % "hbase-server" % "0.98.6-cdh5.2.0" excludeAll ExclusionRule(organization = "org.mortbay.jetty")

If you have a direct dependency on hadoop-common, I also found it necessary to create an exclusion rule for the javax.servlet dependencies:

libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.5.0-cdh5.2.0" excludeAll ExclusionRule(organization = "javax.servlet")

I did not have to touch my Spark dependencies:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.1.0-cdh5.2.0"

libraryDependencies += "org.apache.spark" %% "spark-streaming" % "1.1.0-cdh5.2.0"

libraryDependencies += "org.apache.spark" %% "spark-streaming-kafka" % "1.1.0-cdh5.2.0"
