[英]No suitable driver found for jdbc in Spark
I am using

df.write.mode("append").jdbc("jdbc:mysql://ip:port/database", "table_name", properties)

to insert into a table in MySQL. I have also added Class.forName("com.mysql.jdbc.Driver") in my code.

When I submit my Spark application:
spark-submit --class MY_MAIN_CLASS \
  --master yarn-client \
  --jars /path/to/mysql-connector-java-5.0.8-bin.jar \
  --driver-class-path /path/to/mysql-connector-java-5.0.8-bin.jar \
  MY_APPLICATION.jar
This yarn-client mode works for me. But when I use yarn-cluster mode:
spark-submit --class MY_MAIN_CLASS \
  --master yarn-cluster \
  --jars /path/to/mysql-connector-java-5.0.8-bin.jar \
  --driver-class-path /path/to/mysql-connector-java-5.0.8-bin.jar \
  MY_APPLICATION.jar
It doesn't work. I also tried setting "--conf":
spark-submit --class MY_MAIN_CLASS \
  --master yarn-cluster \
  --jars /path/to/mysql-connector-java-5.0.8-bin.jar \
  --driver-class-path /path/to/mysql-connector-java-5.0.8-bin.jar \
  --conf spark.executor.extraClassPath=/path/to/mysql-connector-java-5.0.8-bin.jar \
  MY_APPLICATION.jar
but still get the "No suitable driver found for jdbc" error.
I had to add the driver option when using the sparkSession's read function:
.option("driver", "org.postgresql.Driver")
val jdbcDF = sparkSession.read
  .format("jdbc")
  .option("driver", "org.postgresql.Driver")
  .option("url", "jdbc:postgresql://<host>:<port>/<DBName>")
  .option("dbtable", "<tableName>")
  .option("user", "<user>")
  .option("password", "<password>")
  .load()
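The same idea applies to the write path in the question: name the driver class explicitly in the connection properties passed to df.write.jdbc. A minimal sketch in Java, using the class name from the question's jar ("driver" is the documented Spark JDBC option key; the commented line shows where the write call would go):

```java
import java.util.Properties;

public class WriteProps {
    public static void main(String[] args) {
        Properties properties = new Properties();
        properties.put("user", "<user>");
        properties.put("password", "<password>");
        // Naming the driver class explicitly avoids "No suitable driver found"
        // when DriverManager cannot discover it on its own.
        properties.put("driver", "com.mysql.jdbc.Driver");
        System.out.println(properties.getProperty("driver"));
        // df.write().mode("append").jdbc("jdbc:mysql://ip:port/database", "table_name", properties);
    }
}
```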
Depending on how your dependencies are set up, you'll notice that when you include something like compile group: 'org.postgresql', name: 'postgresql', version: '42.2.8' in Gradle, for example, this will include the driver class at org/postgresql/Driver.class, and that's the one you want to instruct Spark to load.
There are 3 possible solutions:

Add the dependencies in the spark-submit cli. You can use the following option in your spark-submit cli:
--jars $(echo ./lib/*.jar | tr ' ' ',')
Explanation: Supposing that you have all your jars in a lib directory in your project root, this will read all the libraries and add them to the application submit.
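To see what that substitution expands to, here is a quick check with a throwaway directory and hypothetical jar names:

```shell
# Create a throwaway lib directory with two dummy jars (hypothetical names)
mkdir -p /tmp/demo_lib
touch /tmp/demo_lib/a.jar /tmp/demo_lib/b.jar
# Same expansion as in --jars: space-separated glob -> comma-separated list
echo /tmp/demo_lib/*.jar | tr ' ' ','
# prints /tmp/demo_lib/a.jar,/tmp/demo_lib/b.jar
```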
You can also try to configure these 2 variables: spark.driver.extraClassPath and spark.executor.extraClassPath in the SPARK_HOME/conf/spark-defaults.conf file, and specify the value of these variables as the path of the jar file. Ensure that the same path exists on worker nodes.
I tried the suggestions shown here, which didn't work for me (with MySQL). While debugging through the DriverManager code, I realized that I needed to register my driver, since this was not happening automatically with spark-submit. I therefore added

Driver driver = new Driver();

The constructor registers the driver with the DriverManager, which solved the SQLException problem for me.
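To illustrate the mechanism in a self-contained way, here is a sketch with a hypothetical dummy java.sql.Driver. In the answer above, Driver is the MySQL driver class, and loading or constructing it triggers the same DriverManager.registerDriver call (in the MySQL driver this happens in a static initializer):

```java
import java.sql.Connection;
import java.sql.Driver;
import java.sql.DriverManager;
import java.sql.DriverPropertyInfo;
import java.sql.SQLException;
import java.sql.SQLFeatureNotSupportedException;
import java.util.Properties;
import java.util.logging.Logger;

// Stand-in driver, only to show the registration mechanism.
class DummyDriver implements Driver {
    public DummyDriver() throws SQLException {
        // What a real JDBC driver does when it is loaded
        DriverManager.registerDriver(this);
    }
    public Connection connect(String url, Properties info) { return null; }
    public boolean acceptsURL(String url) { return url != null && url.startsWith("jdbc:dummy:"); }
    public DriverPropertyInfo[] getPropertyInfo(String url, Properties info) { return new DriverPropertyInfo[0]; }
    public int getMajorVersion() { return 1; }
    public int getMinorVersion() { return 0; }
    public boolean jdbcCompliant() { return false; }
    public Logger getParentLogger() throws SQLFeatureNotSupportedException { throw new SQLFeatureNotSupportedException(); }
}

public class RegisterDemo {
    public static void main(String[] args) throws SQLException {
        new DummyDriver(); // constructing it registers it, as in the answer
        // DriverManager can now resolve URLs this driver accepts
        Driver d = DriverManager.getDriver("jdbc:dummy://host/db");
        System.out.println(d.getClass().getSimpleName()); // prints DummyDriver
    }
}
```

Once registered, DriverManager.getDriver (which Spark's JDBC path relies on) can resolve the connection URL instead of failing with "No suitable driver".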