
How to read huge data from a database and write it to a file in Java?

I have a Derby database with at least 2 GB of data, and I have to read this data and write it to a text file. Some of the columns may be of BLOB data type.

My present approach is to read data from one table at a time in batches of, say, 10 rows, and put each batch (a list of strings) into an ArrayBlockingQueue with a maximum capacity of 10. From the queue, a thread picks elements one by one and writes them to the file.

I am facing the following problems:

  1. How to fetch 10 rows from a table in a single hit, the next 10 rows in the
  second hit, and so on?
  2. How to convert BLOB and binary data into a string?
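The producer/consumer pipeline described above can be sketched as follows. This is a minimal, self-contained version: the batches are hard-coded in place of real database reads, the class and method names are made up for illustration, and an empty sentinel batch signals the writer thread to stop.

```java
import java.io.StringWriter;
import java.io.Writer;
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class BatchWriter {
    // Sentinel batch telling the writer that no more data is coming.
    static final List<String> POISON = List.of();

    // Start a consumer thread that drains batches of rows from the queue
    // and writes them line by line to the given Writer.
    static Thread startWriter(BlockingQueue<List<String>> queue, Writer out) {
        Thread t = new Thread(() -> {
            try {
                while (true) {
                    List<String> batch = queue.take();  // blocks until a batch arrives
                    if (batch == POISON) break;         // shutdown signal
                    for (String row : batch) {
                        out.write(row);
                        out.write(System.lineSeparator());
                    }
                }
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        });
        t.start();
        return t;
    }

    public static void main(String[] args) throws Exception {
        // Bounded queue: the producer blocks when 10 batches are pending,
        // so memory use stays flat even for a 2 GB table.
        BlockingQueue<List<String>> queue = new ArrayBlockingQueue<>(10);
        StringWriter out = new StringWriter();  // stands in for a FileWriter
        Thread writer = startWriter(queue, out);

        // Producer side: in the real program these batches come from JDBC.
        queue.put(Arrays.asList("row1", "row2"));
        queue.put(Arrays.asList("row3"));
        queue.put(POISON);
        writer.join();
        System.out.print(out);
    }
}
```

In the real program the producer loop would read from a ResultSet and the StringWriter would be a buffered FileWriter; the queue structure is unchanged.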

There is no way for you to do this explicitly, but you can set the fetch size. This parameter is a hint to the JDBC driver about how many rows to fetch from the database in one go. The driver is free to ignore it and do what it sees fit: some drivers fetch rows in chunks, while others read the whole result set in one go.

Using the fetch size (behavior depends on the driver) buffers data for you, so there is no performance issue: the driver implicitly does the chunking. There is essentially no difference in how you read the data from the ResultSet.
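A sketch of both pieces together: setFetchSize as the batching hint, plus one common way to turn a BLOB column into a string (Base64; if the blob actually holds text, decode the bytes with the right charset instead). The table and column names here are hypothetical placeholders, and the Connection is assumed to be opened elsewhere with the Derby driver.

```java
import java.sql.Blob;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.Base64;

public class DerbyReader {
    // Convert a BLOB to a printable string via Base64 encoding.
    // For textual blobs, new String(bytes, charset) may be more appropriate.
    static String blobToString(Blob blob) throws SQLException {
        byte[] bytes = blob.getBytes(1, (int) blob.length());  // positions are 1-based
        return Base64.getEncoder().encodeToString(bytes);
    }

    static void dump(Connection conn) throws SQLException {
        try (Statement stmt = conn.createStatement()) {
            stmt.setFetchSize(10);  // hint: fetch about 10 rows per round trip
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT ID, PAYLOAD FROM MY_TABLE")) {  // hypothetical table
                while (rs.next()) {  // the driver refills its row buffer as needed
                    String id = rs.getString("ID");
                    String payload = blobToString(rs.getBlob("PAYLOAD"));
                    System.out.println(id + "," + payload);
                }
            }
        }
    }
}
```

Note that you iterate the ResultSet exactly as you would without setFetchSize; the hint only affects how many rows cross the wire per round trip, not the reading code.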

