
Passing a JDBC connection to Spark read

I have a SQL script that creates temp tables valid only for that session. After running the script, I am trying to read data from one of those tables through Spark and then process it. Below is the code I have for the Spark read.

sparkSession.read().format("jdbc")
        .option("url", jdbcURL)
        .option("dbtable", tableOrQuery)
        .option("user", userName)
        .option("password", password)
        .option("driver", driverName)
        .load();

Now I need to pass the JDBC connection I created, so that Spark can read the data in the same session. Is this possible?

No, you cannot pass a JDBC connection to Spark; Spark opens and manages its own JDBC connections internally. Each task may open its own connection to the database, so a single externally created connection (and the session-scoped temp tables that live in it) cannot be shared with Spark's readers.

You can see where the connections are created in Spark's own source:

JdbcRelationProvider (createRelation)

JdbcUtils (createConnectionFactory)
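Since the temp tables only exist inside the session that created them, one workaround worth trying (a sketch, not from the original answer) is the `sessionInitStatement` JDBC option, available since Spark 2.3. Spark executes that SQL on each connection it opens, after opening it and before reading, so the temp table is created in the very session Spark reads from. The URL, table names, and credentials below are hypothetical placeholders:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class JdbcSessionInitExample {
    public static void main(String[] args) {
        SparkSession sparkSession = SparkSession.builder()
                .appName("jdbc-session-init")
                .master("local[*]")
                .getOrCreate();

        // Hypothetical connection details -- replace with your own.
        String jdbcURL = "jdbc:postgresql://localhost:5432/mydb";

        Dataset<Row> df = sparkSession.read().format("jdbc")
                .option("url", jdbcURL)
                .option("dbtable", "my_temp_table")
                .option("user", "userName")
                .option("password", "password")
                .option("driver", "org.postgresql.Driver")
                // Runs once per connection Spark opens, before any data is
                // read, so the session-scoped temp table exists in that
                // same session. "source_table" is a placeholder here.
                .option("sessionInitStatement",
                        "CREATE TEMP TABLE my_temp_table AS SELECT * FROM source_table")
                .load();

        df.show();
    }
}
```

Note that Spark may open several connections (for example, one per partition when partitioned reads are configured), and the init statement runs on each of them, so the setup SQL must be safe to execute more than once.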

The technical post webpages of this site follow the CC BY-SA 4.0 protocol. If you need to reprint, please indicate the site URL or the original address.Any question please contact:yoyou2525@163.com.

 
粤ICP备18138465号  © 2020-2024 STACKOOM.COM