
sbt unresolved dependency for spark-cassandra-connector 2.0.2

build.sbt:

val sparkVersion = "2.1.1";

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided";
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided";
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided";

libraryDependencies += "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2";
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion;

Output:

[error] (myproject/*:update) sbt.ResolveException: unresolved dependency: com.datastax.spark#spark-cassandra-connector;2.0.2: not found

Any ideas? I'm new to sbt and Spark. Thanks.

This happens because "com.datastax.spark" % "spark-cassandra-connector" % "2.0.2" has no Scala version in the artifact ID; see the Maven repo:

http://search.maven.org/#artifactdetails%7Ccom.datastax.spark%7Cspark-cassandra-connector_2.11%7C2.0.2%7Cjar

There are two solutions:

  1. "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.2" — set the Scala version explicitly in the artifact ID.
  2. "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2" — use the %% operator with the artifact ID; sbt then automatically appends the project's Scala binary version, expanding to the same dependency as solution 1.
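To make the fix concrete, here is a minimal sketch of the corrected build.sbt, reusing the dependencies and versions from the question; the only change is %% on the connector line (and it assumes the project's scalaVersion is 2.11.x, since the other artifacts target Scala 2.11):

```scala
val sparkVersion = "2.1.1"

// Must be a 2.11.x Scala version so %% resolves _2.11 artifacts
scalaVersion := "2.11.11"

libraryDependencies += "org.apache.spark" %% "spark-core" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % sparkVersion % "provided"
libraryDependencies += "org.apache.spark" %% "spark-streaming" % sparkVersion % "provided"

// %% appends the Scala binary version to the artifact ID,
// so this resolves to spark-cassandra-connector_2.11:2.0.2
libraryDependencies += "com.datastax.spark" %% "spark-cassandra-connector" % "2.0.2"

// This artifact ID already carries the _2.11 suffix, so plain % is correct here
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % sparkVersion
```

Note that %% and a hard-coded _2.11 suffix must not be mixed on the same line, or sbt would look for an artifact named spark-cassandra-connector_2.11_2.11.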


