
Connection pooling in Spark Java framework

The Java Spark framework uses embedded Jetty as its web server. Jetty supports connection pooling with tools such as HikariCP and provides configuration options in XML files. However, according to these posts, Spark does not allow configuring Jetty directly. There are plenty of examples using Spark, but they either do not use a database at all or use DriverManager to connect to it.

Is it possible to configure connection pooling via datasource and JNDI in Spark? If so, how?

I configured connection pooling in Spark Java with HikariCP for MariaDB. I did not use the embedded Jetty; I deployed to Apache Tomcat instead. Here are the relevant snippets:

src/main/resources/mysql-connection.properties

dataSourceClassName=com.mysql.jdbc.jdbc2.optional.MysqlDataSource
dataSource.url=<url>
dataSource.user=<user>
dataSource.password=<password>
dataSource.cachePrepStmts=true
dataSource.prepStmtCacheSize=100
dataSource.prepStmtCacheSqlLimit=2048
dataSource.useServerPrepStmts=true
maximumPoolSize=10

com/example/app/DataSourceFactory.java

import java.io.IOException;
import java.io.InputStream;
import java.util.Properties;

import javax.sql.DataSource;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

public final class DataSourceFactory {

    static final Logger LOG = LoggerFactory.getLogger(DataSourceFactory.class);

    // volatile is required for double-checked locking to be safe
    private static volatile DataSource mySQLDataSource;

    private DataSourceFactory() {
    }

    //returns javax.sql.DataSource
    public static DataSource getMySQLDataSource() {
        if (mySQLDataSource == null) {
            synchronized (DataSourceFactory.class) {
                if (mySQLDataSource == null) {
                    mySQLDataSource = getDataSource("mysql-connection.properties");
                }
            }
        }
        return mySQLDataSource;
    }

    // method to create the DataSource based on configuration
    private static DataSource getDataSource(String configurationProperties) {
        Properties conf = new Properties();
        // fail fast if the properties file is missing from the classpath,
        // since getResourceAsStream returns null (not an IOException) in that case
        try (InputStream in = DataSourceFactory.class.getClassLoader()
                .getResourceAsStream(configurationProperties)) {
            if (in == null) {
                throw new IllegalStateException("Missing configuration: " + configurationProperties);
            }
            conf.load(in);
        } catch (IOException e) {
            LOG.error("Can't load database configuration", e);
        }
        HikariConfig config = new HikariConfig(conf);
        HikariDataSource dataSource = new HikariDataSource(config);
        LOG.info("DataSource[{}] created: {}", configurationProperties, dataSource);
        return dataSource;
    }

}

WebContent/WEB-INF/web.xml

<?xml version="1.0" encoding="UTF-8"?>
<web-app xmlns="http://xmlns.jcp.org/xml/ns/javaee" 
    xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" 
    xsi:schemaLocation="http://xmlns.jcp.org/xml/ns/javaee                              
    http://xmlns.jcp.org/xml/ns/javaee/web-app_3_1.xsd" version="3.1">

    <display-name>My Spark App</display-name>
    <filter>
        <filter-name>SparkFilter</filter-name>
        <filter-class>spark.servlet.SparkFilter</filter-class>
        <init-param>
            <param-name>applicationClass</param-name>
            <param-value>com.example.app.MySparkApp</param-value>
            <!-- MySparkApp implements spark.servlet.SparkApplication -->
        </init-param>
    </filter>
    <filter-mapping>
        <filter-name>SparkFilter</filter-name>
        <url-pattern>/*</url-pattern>
    </filter-mapping>
    <listener>
        <listener-class>com.example.app.AppServletContextListener</listener-class>
    </listener>
</web-app>

com/example/app/AppServletContextListener.java

import java.sql.SQLException;

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

import com.zaxxer.hikari.HikariDataSource;

public class AppServletContextListener implements ServletContextListener {

    static final Logger LOG = LoggerFactory.getLogger(AppServletContextListener.class);

    @Override
    public void contextInitialized(ServletContextEvent arg0) {
        LOG.info("contextInitialized...");
    }

    @Override
    public void contextDestroyed(ServletContextEvent arg0) {
        LOG.info("contextDestroyed...");
        try {
            if (DataSourceFactory.getMySQLDataSource() != null) {
                DataSourceFactory.getMySQLDataSource().unwrap(HikariDataSource.class).close();
            }

        } catch (SQLException e) {
            LOG.error("Problem closing HikariCP pool", e);
        }

    }

}

And finally, you can obtain a pooled java.sql.Connection by calling DataSourceFactory.getMySQLDataSource().getConnection().
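As a sketch of what a caller might look like, here is a small DAO-style helper. The "users" table and "name" column are hypothetical, invented purely for illustration; the method takes the DataSource as a parameter so it works with the factory above or any other pool:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

import javax.sql.DataSource;

public final class UserDao {

    // Hypothetical query; the "users" table and "name" column are illustrative only.
    public static String findUserName(DataSource ds, int id) throws SQLException {
        String sql = "SELECT name FROM users WHERE id = ?";
        // With a pooled DataSource, close() (via try-with-resources) returns the
        // Connection to the pool rather than physically closing it.
        try (Connection con = ds.getConnection();
             PreparedStatement ps = con.prepareStatement(sql)) {
            ps.setInt(1, id);
            try (ResultSet rs = ps.executeQuery()) {
                return rs.next() ? rs.getString("name") : null;
            }
        }
    }
}
```

With the factory above, you would invoke it as UserDao.findUserName(DataSourceFactory.getMySQLDataSource(), 42).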

Should be possible, but you must:

  1. configure the pool
  2. keep a static reference to the datasource
  3. find where to start/connect to the database and, more importantly, find a way to close/release the pool

Points 1) and 2) could be as simple as:

static final HikariDataSource datasource = new HikariDataSource(new HikariConfig());

I don't know whether Spark Java supports start/stop events, so I can't help you there.
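If you run Spark embedded (a plain main method rather than the SparkFilter/WAR setup from the other answer), one way to cover point 3 without start/stop events is a JVM shutdown hook. This is a sketch, assuming HikariCP on the classpath and the mysql-connection.properties file shown earlier; note that HikariConfig(String) is a real constructor that loads the named property file from the classpath or filesystem:

```java
import com.zaxxer.hikari.HikariConfig;
import com.zaxxer.hikari.HikariDataSource;

import static spark.Spark.get;

public class App {

    // Points 1) and 2): configure the pool and keep a static reference to it.
    static final HikariDataSource DATASOURCE =
            new HikariDataSource(new HikariConfig("/mysql-connection.properties"));

    public static void main(String[] args) {
        // Point 3): release the pool when the JVM exits
        // (Ctrl+C, SIGTERM, or after spark.Spark.stop()).
        Runtime.getRuntime().addShutdownHook(new Thread(DATASOURCE::close));

        get("/ping", (req, res) -> "pong");
    }
}
```

Be aware that HikariDataSource eagerly checks out a test connection when constructed, so the app fails fast at startup if the database is unreachable.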

Alternatively, you could try http://jooby.org (I'm the author) with the jdbc module:

{
  use(new Jdbc());

  get("/db", () -> {
    DataSource ds = require(DataSource.class);
    return ds.toString();
  });
}

The Jdbc module gives you a HikariDataSource with sensible default options, but at the same time you can customize the pool via the application.conf file.
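For illustration, a minimal application.conf might look like the following. The exact property names are an assumption on my part; treat them as a sketch and check the jdbc module documentation for the authoritative keys:

```
# Connection settings picked up by the Jdbc module (names assumed, verify against the docs)
db.url = "jdbc:mysql://localhost/mydb"
db.user = myuser
db.password = mypass

# HikariCP options can be overridden under the hikari.* namespace
hikari.maximumPoolSize = 10
```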

There are a couple of higher-level JDBC modules too, such as Hibernate, Ebean, jOOQ, and Jdbi.
