Spark 2.1.0 error - kafka.cluster.BrokerEndPoint cannot be cast to kafka.cluster.Broker
I have seen several other questions on SO indicating that this is a dependency/version issue, e.g. kafka.cluster.BrokerEndPoint cannot be cast to kafka.cluster.Broker. However, I can't tell what is wrong with my dependencies. I'm using Spark 2.1.0, with
org.apache.spark:spark-core_2.11
org.apache.spark:spark-sql_2.11
org.apache.spark:spark-streaming-kafka-0-8_2.11
org.apache.spark:spark-streaming_2.11
as dependencies in my job jar, which I am trying to run via spark-submit. I know that the broker version of the Kafka server is 0.8, and these dependencies are all set to 2.1.0. All of these dependencies seem compatible, so I am not sure where my error is.
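For reference, here is a sketch of how those four dependencies might be declared in a Maven pom.xml. This assumes Maven is the build tool (the question does not say), and uses the standard `org.apache.spark` groupId with the versions stated above:

```xml
<!-- Sketch only: assumes a Maven build; versions match the question (Spark 2.1.0, Scala 2.11) -->
<properties>
  <spark.version>2.1.0</spark.version>
</properties>
<dependencies>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming_2.11</artifactId>
    <version>${spark.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka-0-8_2.11</artifactId>
    <version>${spark.version}</version>
  </dependency>
</dependencies>
```

Note that `spark-core`, `spark-sql`, and `spark-streaming` are already on the cluster at runtime, so they are usually marked `<scope>provided</scope>` rather than bundled into the job jar.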
Edit: I have discovered that if I change spark-streaming-kafka-0-8_2.11 to be a provided dependency and pass it when I call spark-submit, it works. I am not sure why this is, or why I can't just bundle it in the main jar with everything else.
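For illustration, the working invocation described in the edit might look roughly like one of the following (the jar name `my-job.jar` and main class `com.example.MyJob` are placeholders, not from the question):

```shell
# Option A: let spark-submit resolve the Kafka connector from Maven Central.
spark-submit \
  --class com.example.MyJob \
  --packages org.apache.spark:spark-streaming-kafka-0-8_2.11:2.1.0 \
  my-job.jar

# Option B: pass a local copy of the connector jar explicitly.
spark-submit \
  --class com.example.MyJob \
  --jars spark-streaming-kafka-0-8_2.11-2.1.0.jar \
  my-job.jar
```

Either way, the connector (and its transitive Kafka client) is placed on the classpath by Spark itself instead of being shaded into the application jar, which can avoid clashes with another Kafka version already on the cluster.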
Check to make sure there aren't any Kafka jars somewhere on the runtime classpath that might be getting loaded.
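One way to check this (a sketch; `my-job.jar` is a placeholder, and `$SPARK_HOME` is assumed to point at the Spark installation) is to look for Kafka classes both inside the assembled job jar and in Spark's own jars directory:

```shell
# List any kafka.cluster classes bundled into the job jar.
jar tf my-job.jar | grep 'kafka/cluster'

# Check whether the Spark installation already ships a Kafka client,
# which could shadow the version bundled in the job jar.
ls "$SPARK_HOME"/jars | grep -i kafka
```

If both locations contain Kafka classes of different versions, the `ClassCastException` between `BrokerEndPoint` and `Broker` is the typical symptom, since those classes changed between Kafka releases.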