Cannot build a Scala program: "sbt package" failed with Exception in thread "main" java.sql.SQLException: No suitable driver

I created the following test Scala program using IntelliJ (sbt project).

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf
import java.sql._

object ConnTest extends App {
  val conf = new SparkConf()
  val sc = new SparkContext(conf.setAppName("Test").setMaster("local[*]"))
  val sqlContext = new org.apache.spark.sql.SQLContext(sc)
  val jdbcSqlConn = "jdbc:sqlserver://...;databaseName=...;user=...;password=...;"
  val jdbcDf = sqlContext.read.format("jdbc").options(Map(
    "url" -> jdbcSqlConn,
    "dbtable" -> "table1"
  )).load()
  jdbcDf.show(10)

  sc.stop()
}

However, sbt package failed with the following error. I've downloaded the MS SQL Server driver (C:\sqljdbc_6.0\enu\jre8\sqljdbc42.jar) from the Microsoft website. How do I set the jar reference in the sbt project?

Exception in thread "main" java.sql.SQLException: No suitable driver
        at java.sql.DriverManager.getDriver(Unknown Source)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:84)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:84)
        at scala.Option.getOrElse(Option.scala:121)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:83)
        at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:34)
        at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:32)
        at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:330)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:152)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:125)
        at ConnTest$.delayedEndpoint$ConnTest$1(main.scala:14)
        at ConnTest$delayedInit$body.apply(main.scala:6)
        at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
        at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
        at scala.App$$anonfun$main$1.apply(App.scala:76)
        at scala.App$$anonfun$main$1.apply(App.scala:76)
        at scala.collection.immutable.List.foreach(List.scala:381)
        at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
        at scala.App$class.main(App.scala:76)
        at ConnTest$.main(main.scala:6)
        at ConnTest.main(main.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:743)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Put any external jars in the lib folder at the project root; create lib if it does not exist (mkdir -p lib). The project layout should then look like this:

build.sbt
lib/
  sqljdbc42.jar
project/
src/
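
For reference, here is a minimal build.sbt sketch (the scalaVersion and Spark version below are assumptions; match them to your cluster). No setting is needed for the driver itself, because sbt treats every jar under lib/ as an unmanaged dependency automatically:

// build.sbt: minimal sketch; versions are assumptions, adjust to your setup.
// Jars under lib/ (e.g. lib/sqljdbc42.jar) are picked up automatically as
// unmanaged dependencies, so the driver needs no explicit entry here.
name := "ConnTest"

scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.1.0" % "provided",
  "org.apache.spark" %% "spark-sql"  % "2.1.0" % "provided"
)

// Only needed if you keep jars somewhere other than lib/:
// unmanagedBase := baseDirectory.value / "custom_lib"

Keep in mind that lib/ only covers compilation and sbt run; when the application is launched with spark-submit (as in the stack trace above), the driver jar still has to reach the runtime classpath, for example via --jars or --driver-class-path.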

Another way could be publishing the jar to your local Ivy repository (~/.ivy2).
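
If the driver is published to a repository you can reach, declaring it as a managed dependency avoids copying the jar around entirely. A sketch, assuming the newer mssql-jdbc artifact on Maven Central (the sqljdbc42.jar downloaded from Microsoft's site predates this artifact, so check which version matches your server):

// build.sbt: managed-dependency alternative. The coordinates below are an
// assumption; pick the version that matches your SQL Server and JRE.
libraryDependencies += "com.microsoft.sqlserver" % "mssql-jdbc" % "6.2.2.jre8"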

Then you can open the sbt console and verify that the jar is actually on the classpath.
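
For example, loading the driver class by name (com.microsoft.sqlserver.jdbc.SQLServerDriver is the class shipped in sqljdbc42.jar) throws ClassNotFoundException when the jar is missing from the classpath:

scala> Class.forName("com.microsoft.sqlserver.jdbc.SQLServerDriver")
res0: Class[_] = class com.microsoft.sqlserver.jdbc.SQLServerDriver

If the classpath fix alone does not clear the "No suitable driver" error, it may also help to name the driver explicitly in the Spark JDBC options in the code above, i.e. add "driver" -> "com.microsoft.sqlserver.jdbc.SQLServerDriver" to the Map passed to options(...).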
