sbt unresolved dependency for Spark Streaming Kafka integration
I want to use Spark Streaming with the Kafka integration. I am using Spark version 2.0.0.
However, I get an unresolved dependency error: "unresolved dependency: org.apache.spark#spark-sql-kafka-0-10_2.11;2.0.0: not found".
How can I access this package? Or am I doing something wrong / missing something?
My build.sbt file:
name := "Spark Streaming"
version := "0.1"
scalaVersion := "2.11.11"
val sparkVersion = "2.0.0"
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  "org.apache.spark" %% "spark-sql-kafka-0-10" % sparkVersion
)
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "2.0.0-preview"
Thanks for your help.
The spark-sql-kafka-0-10 artifact (the Kafka source for Structured Streaming) was not published for Spark 2.0.0; it first appeared in a later 2.0.x release. For Spark 2.0.0, use the DStream-based connector instead (see https://mvnrepository.com/artifact/org.apache.spark/spark-streaming-kafka-0-10_2.11):
libraryDependencies += "org.apache.spark" % "spark-streaming-kafka-0-10_2.11" % "2.0.0"
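For reference, here is a sketch of a complete build.sbt that should resolve against Spark 2.0.0, using the DStream-based Kafka connector and dropping the conflicting 2.0.0-preview line (assumes the standard Maven Central resolvers; note the %% operator appends the Scala binary version, so the explicitly suffixed _2.11 form is not needed):

```scala
name := "Spark Streaming"

version := "0.1"

scalaVersion := "2.11.11"

val sparkVersion = "2.0.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion,
  // DStream-based Kafka 0.10 connector, published for Spark 2.0.0.
  // spark-sql-kafka-0-10 (Structured Streaming) is only available
  // from a later 2.0.x release, so it cannot be used here.
  "org.apache.spark" %% "spark-streaming-kafka-0-10" % sparkVersion
)
```

If you later upgrade Spark, you can switch the last entry back to "spark-sql-kafka-0-10" to use the Structured Streaming Kafka source.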