
build.sbt breaks when adding GraphFrames built with Scala 2.11

I'm trying to add GraphFrames to my Scala Spark application. This went fine when I used the version built against Scala 2.10, but as soon as I switched to the GraphFrames build for Scala 2.11, the build breaks.

The problem appears to be that conflicting Scala versions (2.10 and 2.11) end up on the classpath. I'm getting the following error:

[error] Modules were resolved with conflicting cross-version suffixes in {file:/E:/Documents/School/LSDE/hadoopcryptoledger/examples/scala-spark-graphx-bitcointransaction/}root:
[error]    org.apache.spark:spark-launcher _2.10, _2.11
[error]    org.json4s:json4s-ast _2.10, _2.11
[error]    org.apache.spark:spark-network-shuffle _2.10, _2.11
[error]    com.twitter:chill _2.10, _2.11
[error]    org.json4s:json4s-jackson _2.10, _2.11
[error]    com.fasterxml.jackson.module:jackson-module-scala _2.10, _2.11
[error]    org.json4s:json4s-core _2.10, _2.11
[error]    org.apache.spark:spark-unsafe _2.10, _2.11
[error]    org.apache.spark:spark-core _2.10, _2.11
[error]    org.apache.spark:spark-network-common _2.10, _2.11
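
For context, the _2.10 / _2.11 suffixes are sbt's cross-version suffixes: the %% operator appends the project's Scala binary version to the artifact name, and sbt refuses to build a classpath that mixes suffixes because Scala 2.10 and 2.11 binaries are incompatible. A minimal sketch of the equivalence, assuming scalaVersion := "2.11.8" (the coordinates are only illustrative):

// With scalaVersion := "2.11.8", these two lines resolve to the same artifact:
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"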

However, I can't figure out what is causing this. This is my full build.sbt:

import sbt._
import Keys._
import scala._


lazy val root = (project in file("."))
  .settings(
    name := "example-hcl-spark-scala-graphx-bitcointransaction",
    version := "0.1"
  )
  .configs(IntegrationTest)
  .settings(Defaults.itSettings: _*)

scalacOptions += "-target:jvm-1.7"

crossScalaVersions := Seq("2.11.8")

resolvers += Resolver.mavenLocal

fork := true

jacoco.settings

itJacoco.settings



assemblyJarName in assembly := "example-hcl-spark-scala-graphx-bitcointransaction.jar"

libraryDependencies += "com.github.zuinnote" % "hadoopcryptoledger-fileformat" % "1.0.7" % "compile"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"

libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.5.0" % "provided"

libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "2.7.0" % "provided"

libraryDependencies += "javax.servlet" % "javax.servlet-api" % "3.0.1" % "it"


libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.7.0" % "it" classifier "" classifier "tests"

libraryDependencies += "org.apache.hadoop" % "hadoop-hdfs" % "2.7.0" % "it" classifier "" classifier "tests"

libraryDependencies += "org.apache.hadoop" % "hadoop-minicluster" % "2.7.0" % "it"

libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.2.0" % "provided"

libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test,it"

libraryDependencies += "graphframes" % "graphframes" % "0.5.0-spark2.1-s_2.11"

Can anyone pinpoint which dependency is based on Scala 2.10 and is causing the build to fail?
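
To narrow this down, sbt can print the resolved dependency tree, which shows exactly which module pulls in the _2.10 artifacts. A sketch, assuming sbt 0.13.x and the sbt-dependency-graph plugin (the plugin version below is an assumption; check the plugin's documentation for the release matching your sbt):

// project/plugins.sbt -- version is an assumption, verify against your sbt release
addSbtPlugin("net.virtual-void" % "sbt-dependency-graph" % "0.9.2")

Then run "sbt dependencyTree" and look for the branches whose artifact names end in a _2.10 suffix.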

I found out what the problem was. Apparently, if you use:

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.5.0" % "provided"

sbt resolves %% against the project's scalaVersion, and since this build only sets crossScalaVersions (never scalaVersion), sbt fell back to its default Scala version, 2.10.x, and pulled in the _2.10 artifacts. It all worked once I changed the spark-core and spark-graphx dependencies to:

libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "2.2.0"

libraryDependencies += "org.apache.spark" % "spark-graphx_2.11" % "2.2.0" % "provided"
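
An alternative fix that keeps the %% convenience is to pin scalaVersion explicitly, so the suffix is computed correctly for every cross-built dependency. A minimal sketch of the relevant lines, assuming Scala 2.11.8 and Spark 2.2.0:

scalaVersion := "2.11.8"

// %% now appends _2.11 automatically for every cross-built dependency
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-graphx" % "2.2.0" % "provided"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0" % "provided"

// GraphFrames from spark-packages encodes the Scala version in the version
// string (-s_2.11) rather than in the artifact name, so it stays a plain % dependency
libraryDependencies += "graphframes" % "graphframes" % "0.5.0-spark2.1-s_2.11"

With scalaVersion set, every %% dependency resolves to the matching _2.11 artifact and the cross-suffix conflict disappears.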
