
Unable to connect to Cassandra 3.0 using Spark Cassandra connector 1.5.0

Issue - Unable to connect to Cassandra 3.0 using Spark Cassandra connector 1.5.0

Background - I tried to connect to Cassandra 3.0 from Spark 1.5.0 using the provided Spark Cassandra connector 1.5.0, but I am getting the error below.

As per the DataStax Spark Cassandra Connector documentation, connector 1.5 can be used with Cassandra 3.0 from Spark 1.5.0/1.6.0.

Could you please suggest whether I am missing any step here?

Tried approaches

  1. I tried to exclude the Guava dependency from the Spark streaming and core jars.

  2. Added a separate Guava dependency in pom.xml (a sketch of both changes follows this list).
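For illustration only, the two changes above might look roughly like this in a pom.xml; the artifact IDs and version numbers here are assumptions, not taken from the actual build file:

<!-- Exclude Guava from the Spark artifacts (repeat for spark-core_2.10; IDs/versions are illustrative) -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-streaming_2.10</artifactId>
  <version>1.5.0</version>
  <exclusions>
    <exclusion>
      <groupId>com.google.guava</groupId>
      <artifactId>guava</artifactId>
    </exclusion>
  </exclusions>
</dependency>

<!-- Pull in Guava 16.0.1 or later, as the Cassandra driver requires -->
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
  <version>16.0.1</version>
</dependency>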

Thanks in advance.


16/04/26 09:45:07 WARN TaskSetManager: Lost task 4.0 in stage 1.0 (TID 16, ip-172-31-23-23.ec2.internal): java.lang.ExceptionInInitializerError
    at  com.datastax.spark.connector.cql.DefaultConnectionFactory$.clusterBuilder(CassandraConnectionFactory.scala:35)
    at com.datastax.spark.connector.cql.DefaultConnectionFactory$.createCluster(CassandraConnectionFactory.scala:87)
    at 
---
--
Caused by: java.lang.IllegalStateException: Detected Guava issue #1635 which indicates that a version of Guava less than 16.01 is in use.  This introduces codec resolution issues and potentially other incompatibility issues in the driver.  Please upgrade to Guava 16.01 or later.
    at com.datastax.driver.core.SanityChecks.checkGuava(SanityChecks.java:62)
    at com.datastax.driver.core.SanityChecks.check(SanityChecks.java:36)
    at com.datastax.driver.core.Cluster.<clinit>(Cluster.java:67)
    ... 23 more
16/04/26 09:45:07 WARN TaskSetManager: Lost task 4.0 in stage 1.0 (TID 16, ip-172-31-23-23.ec2.internal): java.lang.ExceptionInInitializerError
    at com.datastax.spark.connector.cql.DefaultConnectionFactory$.clusterBuilder(CassandraConnectionFactory.scala:35)
    at com.datastax.spark.connector.cql.DefaultConnectionFactory$.createCluster(CassandraConnectionFactory.scala:87)
    at com.datastax.spark.connector.cql.CassandraConnector$.com$datastax$spark$connector$cql$CassandraConnector$$createSession(CassandraConnector.scala:153)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148)
    at com.datastax.spark.connector.cql.CassandraConnector$$anonfun$2.apply(CassandraConnector.scala:148)
    at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:31)

There is a known conflict between the Guava version shipped with the connector and the one shipped with Spark. How did you try to shade the Guava library?

Try adding this to your build.sbt file:

assemblyShadeRules in assembly := Seq(
  ShadeRule.rename("com.google.**" -> "shadeio.@1").inAll
)
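Since the question uses a pom.xml rather than sbt, a roughly equivalent relocation with the maven-shade-plugin might look like the sketch below; the plugin version is an assumption:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <version>2.4.3</version>
  <executions>
    <execution>
      <phase>package</phase>
      <goals>
        <goal>shade</goal>
      </goals>
      <configuration>
        <relocations>
          <!-- Rename com.google.* classes so the connector's Guava cannot clash with Spark's -->
          <relocation>
            <pattern>com.google</pattern>
            <shadedPattern>shadeio</shadedPattern>
          </relocation>
        </relocations>
      </configuration>
    </execution>
  </executions>
</plugin>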
