We have a stored procedure that is used to select around a million records. The stored procedure looks something like this:
    CREATE PROCEDURE ABC (@CustID varchar(20))
    AS
    BEGIN
        WITH temporaryTable AS (
            SELECT ab, bc, ... FROM Table  -- a very large SELECT query
        )
        SELECT a, b, ..., l FROM temporaryTable LEFT OUTER JOIN some_table1 ON ...
        UNION ALL
        SELECT m, n, ..., z FROM temporaryTable LEFT OUTER JOIN some_table2 ON ...;
        -- two very large SELECTs combined with UNION ALL
    END
I have been tasked with fetching the result of that stored procedure and writing it to a CSV file using Java and Spring Boot.
I have tried spring-data-jpa's @NamedStoredProcedureQuery approach for fetching the result and writing it to CSV with opencsv, but it is too slow. I also used the setFirstResult() and setMaxResult() methods but couldn't see any difference (maybe this requires the stored procedure to be configured differently; I'm not sure).
Right now I am trying to use Spring Batch's StoredProcedureItemReader (still configuring it) to read the data and FlatFileItemWriter to write the CSV, but I am new to this and not sure whether it will help, though I think this link ( https://docs.spring.io/spring-batch/4.1.x/reference/html/scalability.html#scalability ) might.
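For reference, this is roughly the reader configuration I am experimenting with. It is only a sketch: MyRow is a hypothetical mapped class, and the parameter value is a placeholder.

```java
import java.sql.Types;

import javax.sql.DataSource;

import org.springframework.batch.item.database.StoredProcedureItemReader;
import org.springframework.jdbc.core.BeanPropertyRowMapper;
import org.springframework.jdbc.core.SqlParameter;

public class ReaderConfig {

    // Sketch of a StoredProcedureItemReader for the ABC procedure.
    public StoredProcedureItemReader<MyRow> reader(DataSource dataSource) {
        StoredProcedureItemReader<MyRow> reader = new StoredProcedureItemReader<>();
        reader.setDataSource(dataSource);
        reader.setProcedureName("ABC");
        // Declare the single varchar input parameter of the procedure.
        reader.setParameters(new SqlParameter[] {
                new SqlParameter("CustID", Types.VARCHAR)
        });
        // Bind the actual value; "SOME_CUST_ID" is a placeholder.
        reader.setPreparedStatementSetter(ps -> ps.setString(1, "SOME_CUST_ID"));
        // MyRow is a hypothetical POJO whose properties match the result columns.
        reader.setRowMapper(new BeanPropertyRowMapper<>(MyRow.class));
        // Stream rows from the cursor instead of loading everything at once.
        reader.setFetchSize(1000);
        return reader;
    }
}
```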
What I need is some direction toward getting the desired result.
Thanks and as always any helpful gesture from the community will be highly appreciated!!!
I would instead just use a Reader->Writer step with JdbcCursorItemReader for your Reader and FlatFileItemWriter for your Writer. The Reader's SQL would simply be the SELECT from your question, and the Writer would use a DelimitedLineAggregator to create your CSV output.
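A minimal sketch of that step, assuming Spring Batch 4.x and a hypothetical MyRow POJO; the SQL, column names, and output path are placeholders you would replace with the real ones from your procedure:

```java
import javax.sql.DataSource;

import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.database.JdbcCursorItemReader;
import org.springframework.batch.item.database.builder.JdbcCursorItemReaderBuilder;
import org.springframework.batch.item.file.FlatFileItemWriter;
import org.springframework.batch.item.file.builder.FlatFileItemWriterBuilder;
import org.springframework.core.io.FileSystemResource;
import org.springframework.jdbc.core.BeanPropertyRowMapper;

public class ExportStepConfig {

    // Cursor-based reader: streams rows instead of paginating them.
    public JdbcCursorItemReader<MyRow> reader(DataSource dataSource) {
        return new JdbcCursorItemReaderBuilder<MyRow>()
                .name("bigSelectReader")
                .dataSource(dataSource)
                // Inline the SELECT from the procedure body here; "..." is elided.
                .sql("WITH temporaryTable AS (...) SELECT a, b FROM ...")
                .rowMapper(new BeanPropertyRowMapper<>(MyRow.class))
                .fetchSize(1000)
                .build();
    }

    // CSV writer: delimited() wires up a DelimitedLineAggregator under the hood.
    public FlatFileItemWriter<MyRow> writer() {
        return new FlatFileItemWriterBuilder<MyRow>()
                .name("csvWriter")
                .resource(new FileSystemResource("output.csv"))
                .delimited()
                .delimiter(",")
                // Property names of MyRow, in output-column order (placeholders).
                .names(new String[] { "a", "b" })
                .build();
    }

    // Chunk-oriented step: read/write in batches of 1000 rows.
    public Step exportStep(StepBuilderFactory steps, DataSource dataSource) {
        return steps.get("exportStep")
                .<MyRow, MyRow>chunk(1000)
                .reader(reader(dataSource))
                .writer(writer())
                .build();
    }
}
```

The cursor reader keeps only a chunk of rows in memory at a time, which is why it tends to scale better for million-row exports than loading the whole result set through JPA.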