
Handling large records in a Java EE application

There is a table phonenumbers with two columns: id and number. There are about half a million entries in the table. The database is MySQL.

The requirement is to develop a simple Java EE application, connected to that database, that allows a user to download all number values in comma separated style by following a specific URL.

If we fetch all the values into a huge String array, concatenate them (with commas in between) into a single String, and then send that down to the user, does that sound like a proper solution?

The application is not public and will be used by a limited number of people.

Your best bet is to not store the data in Java's memory at all, but to write it to the response immediately as it comes in. You also need to configure the MySQL JDBC driver to serve the result set row by row via Statement#setFetchSize(), as described in the MySQL JDBC driver documentation; otherwise it will cache the whole result set in memory.

Assuming you're familiar with Servlets, here's a kickoff example that takes all of that into account:

protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
    response.setContentType("text/plain");
    response.setHeader("Content-Disposition", "attachment;filename=numbers.txt"); // Force download popup.

    Connection connection = null;
    Statement statement = null;
    ResultSet resultSet = null;
    Writer writer = response.getWriter();

    try {
        connection = database.getConnection();
        statement = connection.createStatement(ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
        statement.setFetchSize(Integer.MIN_VALUE); // Tell MySQL Connector/J to stream rows instead of caching the whole result set.
        resultSet = statement.executeQuery("SELECT number FROM phonenumbers");

        // ResultSet#isLast() is not supported on a streaming result set, so track the first row instead.
        boolean first = true;

        while (resultSet.next()) {
            if (!first) {
                writer.write(",");
            }
            writer.write(resultSet.getString("number"));
            first = false;
        }
    } catch (SQLException e) {
        throw new ServletException("Query failed!", e);
    } finally { 
        if (resultSet != null) try { resultSet.close(); } catch (SQLException logOrIgnore) {}
        if (statement != null) try { statement.close(); } catch (SQLException logOrIgnore) {}
        if (connection != null) try { connection.close(); } catch (SQLException logOrIgnore) {}
    }
}

There's a bit more to properly formatting CSV output. It would be easiest to use an existing CSV library to generate the output file.
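For illustration, here is a minimal sketch of delegating the quoting and escaping to a CSV library. OpenCSV (com.opencsv.CSVWriter) is used purely as an example of such a library; it is an assumption, not necessarily the library the answer had in mind:

// Inside the try block of the servlet above, instead of writing raw strings:
CSVWriter csv = new CSVWriter(writer); // wraps response.getWriter(); requires OpenCSV on the classpath
while (resultSet.next()) {
    // One record per line; CSVWriter takes care of quoting and escaping.
    csv.writeNext(new String[] { resultSet.getString("number") });
}
csv.flush();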

You can generate the output to a file on disk (on the web server) and then redirect the browser to that file (with a cron job or similar to clean up old files), or just stream the result directly back to the user.

If you are streaming directly, be sure to set the MIME type to something that will trigger a download in the user's browser (e.g. text/csv or text/comma-separated-values).
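For the streaming case, the response headers would look something like this (a minimal sketch; the filename is only an example):

response.setContentType("text/csv");
response.setCharacterEncoding("UTF-8");
// Suggest a filename so the browser offers a download instead of rendering the text.
response.setHeader("Content-Disposition", "attachment; filename=\"phonenumbers.csv\"");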

If using MySQL 5.1+, I would simply use the proprietary SELECT ... INTO OUTFILE syntax to dump the data somewhere and then stream that file back in a servlet response.

SELECT a,b,a+b INTO OUTFILE '/tmp/result.txt'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
  FROM test_table;

http://dev.mysql.com/doc/refman/5.1/en/select.html
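Streaming the dumped file back could then be a matter of copying it to the response. This is only a sketch, and it assumes the servlet container can read /tmp/result.txt (i.e. it runs on the same host or a shared filesystem, since INTO OUTFILE writes on the MySQL server's filesystem):

protected void doGet(HttpServletRequest request, HttpServletResponse response) throws ServletException, IOException {
    response.setContentType("text/csv");
    response.setHeader("Content-Disposition", "attachment; filename=\"result.txt\"");
    // Copy the dumped file straight to the client without buffering it in memory.
    java.nio.file.Files.copy(java.nio.file.Paths.get("/tmp/result.txt"), response.getOutputStream());
}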

For so many records, if you still want to use JDBC, you may try the following (see the sketch after this list):

  • fetch the total number of records
  • fetch a few records at a time (using a query LIMIT) and write them
  • when you reach the end of a chunk, fetch the next one, until you have covered the total number of records
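A minimal sketch of that chunked approach, assuming a plain JDBC Connection and an arbitrary chunk size of 10,000 rows (the method name writeNumbersInChunks is made up for illustration):

private static final int CHUNK_SIZE = 10_000;

void writeNumbersInChunks(Connection connection, Writer writer) throws SQLException, IOException {
    int offset = 0;
    boolean first = true;
    int rowsInChunk;
    do {
        rowsInChunk = 0;
        try (PreparedStatement statement = connection.prepareStatement(
                "SELECT number FROM phonenumbers ORDER BY id LIMIT ? OFFSET ?")) {
            statement.setInt(1, CHUNK_SIZE);
            statement.setInt(2, offset);
            try (ResultSet resultSet = statement.executeQuery()) {
                while (resultSet.next()) {
                    if (!first) {
                        writer.write(",");
                    }
                    writer.write(resultSet.getString("number"));
                    first = false;
                    rowsInChunk++;
                }
            }
        }
        offset += CHUNK_SIZE;
    } while (rowsInChunk == CHUNK_SIZE); // a short chunk means the last rows have been written
}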
