I need to retrieve 200,000 records from a table and do processing on each record. The database is Oracle. Currently I am using the fetchrow_arrayref method and processing one record at a time. For a huge number of records, would it be more efficient to fetch in batches of, say, 5000 and loop? MySQL has a LIMIT keyword, but Oracle doesn't. I'm not sure how to do this with DBI.
The idea: fetch 5000 records into an array, process them from the array, then fetch again until all the records have been handled.
Using pagination will not be more efficient than what you are doing. The point of pagination is to avoid running out of memory; if you are not running out of memory (and with DBD::Oracle you should not), then nothing is gained by it.
If this operation is too slow, you have several basic options.
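If the concern is simply bounding memory while still fetching in chunks, DBI already supports that without any change to the SQL: fetchall_arrayref takes an optional maximum-row count, so you can pull batches of 5000 from a single open statement handle. Below is a sketch; the DSN, credentials, table, and column names are placeholders you would replace with your own.

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Hypothetical connection details -- substitute your own DSN and credentials.
my $dbh = DBI->connect('dbi:Oracle:mydb', 'user', 'password',
                       { RaiseError => 1, AutoCommit => 0 });

my $sth = $dbh->prepare('SELECT id, payload FROM your_table');
$sth->execute;

# The second argument caps each fetchall_arrayref call at 5000 rows,
# so memory stays bounded without pagination in the SQL itself.
while (my $rows = $sth->fetchall_arrayref(undef, 5000)) {
    last unless @$rows;
    for my $row (@$rows) {
        my ($id, $payload) = @$row;
        # ... process one record ...
    }
}

$sth->finish;
$dbh->disconnect;
```

This keeps one cursor open for the whole scan, which is generally cheaper than re-running a paginated query per batch.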
Your task looks like pagination: fetch a page of data, process it, then fetch the next page. You can write a pagination-style query:
select *
  from ( select /*+ first_rows(25) */
                your_columns,
                row_number() over (order by something_unique) rn
           from your_tables )
 where rn between :n and :m
 order by rn;
Here :n is the starting row, :m is the ending row, and rn is the row number assigned by the ORDER BY inside the subquery.
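A driving loop for the query above might look like the following sketch in Perl DBI. The connection details, table, and column names are assumptions; the loop binds :n and :m for each page and stops when a short page comes back.

```perl
use strict;
use warnings;
use DBI;

# Hypothetical connection; adjust DSN and credentials for your environment.
my $dbh = DBI->connect('dbi:Oracle:mydb', 'user', 'password',
                       { RaiseError => 1 });

my $sql = q{
    select *
      from ( select /*+ first_rows(25) */
                    your_columns,
                    row_number() over (order by something_unique) rn
               from your_tables )
     where rn between :n and :m
     order by rn
};
my $sth = $dbh->prepare($sql);

my $page_size = 5000;
my $start     = 1;
while (1) {
    $sth->bind_param(':n', $start);
    $sth->bind_param(':m', $start + $page_size - 1);
    $sth->execute;
    my $rows = $sth->fetchall_arrayref;
    for my $row (@$rows) {
        # ... process one record ...
    }
    last if @$rows < $page_size;   # short (or empty) page: we are done
    $start += $page_size;
}
$dbh->disconnect;
```

Note that each page re-executes the query, so the ORDER BY column must be unique and stable between executions or rows can be skipped or repeated across pages.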