Detected Guava issue #1635 when using Spark and Cassandra Java Driver
I am using spring-data-cassandra 1.5.1 (which uses the Cassandra Java driver 3.x) in our Spark application. When running the spark-submit command, I get the error below.
Caused by: java.lang.IllegalStateException: Detected Guava issue #1635 which indicates that a version of Guava less than 16.01 is in use. This introduces codec resolution issues and potentially other incompatibility issues in the driver. Please upgrade to Guava 16.01 or later.
at com.datastax.driver.core.SanityChecks.checkGuava(SanityChecks.java:62)
at com.datastax.driver.core.SanityChecks.check(SanityChecks.java:36)
at com.datastax.driver.core.Cluster.<clinit>(Cluster.java:68)
... 71 more
It seems the Cassandra driver requires Guava 16.0.1 or later and fails because it found an older version. I made sure that the Spark uber jar we build contains only Guava 19.0, but I still get the same error when I execute spark-submit.
After further analysis, I found that spark-2.0.1-bin-hadoop2.7/jars contains Guava v14.0.1, and that is what gets loaded when I execute spark-submit, ignoring the Guava v19.0 in the Spark application jar.
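To see which jar a given class is actually loaded from at runtime (which makes exactly this kind of classpath shadowing visible), a small diagnostic can be dropped into the application. `JarLocator` is a hypothetical helper, not part of Spark or the driver; in the Spark app you would probe a Guava class such as com.google.common.reflect.TypeToken instead of the helper itself.

```java
import java.security.CodeSource;

// Hypothetical diagnostic helper: reports which jar (or directory)
// a class was loaded from, so you can tell whether Spark's bundled
// Guava or your application's Guava won.
public class JarLocator {
    public static String locate(Class<?> cls) {
        CodeSource src = cls.getProtectionDomain().getCodeSource();
        // Bootstrap-loaded classes (e.g. java.lang.String) have no code source.
        return src == null ? "(bootstrap classpath)" : src.getLocation().toString();
    }

    public static void main(String[] args) {
        // In the Spark app, probe a Guava class instead, e.g.:
        //   JarLocator.locate(Class.forName("com.google.common.reflect.TypeToken"))
        System.out.println(JarLocator.locate(JarLocator.class));
    }
}
```

Printing this from both the driver and an executor task shows whether the jar under spark-2.0.1-bin-hadoop2.7/jars or the application jar is being used.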
Then I replaced v14.0.1 with v19.0 in spark-2.0.1-bin-hadoop2.7/jars, and now I do not get any error and the application runs fine. But I think this is not a good approach and do not want to do that in prod.
If I run the same Spark job in Eclipse (by setting conf master=local in code and running it as a Java program), it works fine.
I found similar issues on SO but did not find any resolution. Let me know if anyone has faced the same issue and has a resolution for it.
Using DataStax Enterprise Cassandra 5.x.
Thank you!
It is because spring-data-cassandra uses the Cassandra Java driver. The Cassandra Java driver should not be included, as explained here .
Like @RussS said:
Look at https://github.com/datastax/spark-cassandra-connector/blob/master/doc/FAQ.md#how-do-i-fix-guava-classpath-errors
Spark 2.0.1 ships with a Guava 14.x jar, while the cassandra-java-driver requires Guava 16.0.1 or later. When we submit the Spark job using spark-submit, the Guava version in Spark overrides the one in our Spark application jar, which results in the error in question. The issue is resolved by overriding the Spark Guava 14.x jar with guava-19.0.jar.
Override the Spark Guava 14.x jar by passing the config below in the spark-submit command: --conf spark.driver.extraClassPath=/path/to/guava-19.0.jar --conf spark.executor.extraClassPath=/path/to/guava-19.0.jar
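Written out as a full invocation, the override might look like the sketch below. Everything except the two --conf flags (the class name, the jar names, and the paths) is a placeholder to adapt to your own deployment:

```shell
spark-submit \
  --conf spark.driver.extraClassPath=/path/to/guava-19.0.jar \
  --conf spark.executor.extraClassPath=/path/to/guava-19.0.jar \
  --class com.example.MySparkApp \
  my-spark-app.jar
```

extraClassPath entries are prepended to the JVM classpath, so the newer Guava wins over the one bundled in spark-2.0.1-bin-hadoop2.7/jars without modifying the Spark installation.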
Make sure your Spark application jar does not contain any Guava dependency with a version below 16.0.1 (exclude transitive dependencies as well), or include the latest version in pom.xml so that that version ends up in the final jar/war.
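One way to keep an older Guava out of the uber jar is to exclude it from the dependency that pulls it in and pin the version the driver needs. A sketch in pom.xml, assuming spring-data-cassandra is the source of the transitive Guava (check your own tree with mvn dependency:tree; the version strings are examples):

```xml
<dependency>
  <groupId>org.springframework.data</groupId>
  <artifactId>spring-data-cassandra</artifactId>
  <version>1.5.1.RELEASE</version>
  <exclusions>
    <exclusion>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
    </exclusion>
  </exclusions>
</dependency>
<!-- Pin the Guava version the Cassandra driver is compatible with -->
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>19.0</version>
</dependency>
```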