
Nats Spark connector: Error: Failed to load class

Good afternoon!

I'm a newbie to the NATS/Spark world and I've been stuck for a few days, so I'd be grateful for any tip. I'm using the https://github.com/Logimethods/nats-connector-spark-scala connector to read messages from a NATS server. Running it from IntelliJ with SBT works fine. However, when I try to build the jar file, it fails:

[screenshot: output showing "Error: Failed to load class"]

I've checked that the jar file contains a MANIFEST.MF:

[screenshot: MANIFEST.MF present inside the jar]

I suspect I'm missing a dependency or there is a version incompatibility, so I'll attach my build.sbt file:

name := "brokerNatsSparkSBT"
version := "0.1"
scalaVersion := "2.11.12"

resolvers += "Sonatype OSS Snapshots" at "https://oss.sonatype.org/content/repositories/snapshots"
resolvers += "Sonatype OSS Release" at "https://oss.sonatype.org/content/groups/public/"

libraryDependencies += "com.logimethods" % "nats-connector-spark-scala_2.11" % "1.0.0"

val sparkVersion = "2.3.1"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-streaming" % sparkVersion
)

I'm using JDK 1.8 and, according to build.properties, SBT 1.5.4.

Thanks in advance!

After a few days of struggling, I finally got it working thanks to this article. By including the sbt-assembly plugin and using it to build the jar file, I was able to build the jar correctly.
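
For anyone hitting the same issue, a minimal sbt-assembly setup looks roughly like the following. This is a sketch, not my exact configuration: the plugin version and the main-class name are assumptions on my part.

project/plugins.sbt:

// sbt-assembly builds a "fat" jar that bundles the connector and its transitive dependencies
// (the version here is an assumption; use a release compatible with your sbt version)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.15.0")

build.sbt (additions):

// Resolve duplicate files (mostly under META-INF) contributed by Spark and the NATS connector
assembly / assemblyMergeStrategy := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}

// Optional: record the main class in the manifest so it can be loaded by name.
// "brokerNatsSparkSBT.Main" is a placeholder, not the real object name.
assembly / mainClass := Some("brokerNatsSparkSBT.Main")

Running sbt assembly then produces the fat jar under target/scala-2.11/, and that is the jar to submit to Spark.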
