
How to read large .txt file in chunks of 1000 lines

I want to read and process 1000-line chunks from a file repeatedly until the end of the file.

Path pp = FileSystems.getDefault().getPath("logs", "access.log");
final int BUFFER_SIZE = 1024*1024; //this is actually bytes

FileInputStream fis = new FileInputStream(pp.toFile());
byte[] buffer = new byte[BUFFER_SIZE]; 
int read = 0;
while( ( read = fis.read( buffer ) ) > 0 ){
    // call your other methods here...
}

fis.close();
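Note that a fixed-size byte buffer does not line up with line boundaries, so the chunks read above are arbitrary byte slices rather than blocks of 1000 lines. Below is a minimal sketch of collecting 1000-line chunks with a BufferedReader instead; the UTF-8 charset, the CHUNK_SIZE constant, and the handleChunk callback are assumptions added for illustration, not part of the original code.

import java.io.BufferedReader;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.FileSystems;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class LineChunkReader {

    static final int CHUNK_SIZE = 1000; // assumed chunk size from the question

    public static void main(String[] args) throws IOException {
        Path pp = FileSystems.getDefault().getPath("logs", "access.log");

        // try-with-resources closes the reader and the underlying file automatically
        try (BufferedReader reader = Files.newBufferedReader(pp, StandardCharsets.UTF_8)) {
            List<String> chunk = new ArrayList<>(CHUNK_SIZE);
            String line;
            while ((line = reader.readLine()) != null) {
                chunk.add(line);
                if (chunk.size() == CHUNK_SIZE) {
                    handleChunk(chunk);  // process a full 1000-line chunk
                    chunk.clear();
                }
            }
            if (!chunk.isEmpty()) {
                handleChunk(chunk);      // last, possibly shorter, chunk
            }
        }
    }

    // hypothetical placeholder for whatever per-chunk processing is needed
    static void handleChunk(List<String> lines) {
        System.out.println("got a chunk of " + lines.size() + " lines");
    }
}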

For years I faced the same situation. My final solution was the subList() method from the List interface, which you can use like this:

First step: read all lines from the given file

 String textfileRow = null;
 List<String> fileLines = new ArrayList<String>();
 // try-with-resources closes the reader automatically
 try (BufferedReader fileContentBuffer = new BufferedReader(new FileReader(<your file>)))
 {
     while ((textfileRow = fileContentBuffer.readLine()) != null)
     {
        fileLines.add(textfileRow);
     }
 }
 catch (IOException e)
 {
     // handle the exception as needed
 }

Second step: create chunks of your desired size from the list built above

    final int CHUNKSIZE = <your needed chunk size>;
    int lineIndex = 0;
    while (lineIndex < fileLines.size())
    {
        int chunkEnd = lineIndex + CHUNKSIZE;
    
        if (chunkEnd >= fileLines.size())
        {
            chunkEnd = fileLines.size();
        }
        List<String> mySubList = fileLines.subList(lineIndex, chunkEnd);
                
        // Whatever you want to do with the chunk...
                
        lineIndex = chunkEnd;
    }

In my project, I use it with CSV files of up to 20k lines and it works well.
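For reference, here is a minimal end-to-end sketch that combines both steps above into one compilable program; the file name "input.txt", the chunk size of 1000, and the println placeholder are assumptions added for illustration.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class SubListChunker {

    public static void main(String[] args) throws IOException {
        final int CHUNKSIZE = 1000;             // assumed chunk size
        List<String> fileLines = new ArrayList<>();

        // Step 1: read all lines into memory
        try (BufferedReader reader = new BufferedReader(new FileReader("input.txt"))) {
            String row;
            while ((row = reader.readLine()) != null) {
                fileLines.add(row);
            }
        }

        // Step 2: walk the list in CHUNKSIZE slices via subList
        int lineIndex = 0;
        while (lineIndex < fileLines.size()) {
            int chunkEnd = Math.min(lineIndex + CHUNKSIZE, fileLines.size());
            List<String> chunk = fileLines.subList(lineIndex, chunkEnd);
            System.out.println("chunk of " + chunk.size() + " lines"); // placeholder processing
            lineIndex = chunkEnd;
        }
    }
}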

Edit: I saw in the headline that there is a request for text files, so I changed the way the text file is read.

Old method: use a BufferedReader's readLine method instead of a raw FileInputStream.

 Path path = // access your path...;
 List<String> buffer = new ArrayList<>();
 try (BufferedReader in = new BufferedReader(new FileReader(path.toFile()))) {
    String nextLine = null;
    do  {
        buffer.clear();
        for (int i=0; i < chunkSize; i++) {
            // note that in.readLine() returns null at end-of-file
            if ((nextLine = in.readLine()) == null) break;
            buffer.add(nextLine);
        }
        processChunk(buffer); // note: this chunk's size may be less than chunkSize
    } while (nextLine != null);
 } catch (IOException ioe) {
    // handle exceptions here
 }
 // all resources will be automatically closed before this line is reached

Newer method: use Files.lines to access a lazily-populated stream of lines:

 Path path = // access your path...;
 final AtomicInteger c = new AtomicInteger();
 // Files.lines throws IOException, so handle or declare it;
 // groupingBy builds a Map<Integer, List<String>> keyed by chunk index
 Files.lines(path)
      .collect(Collectors.groupingBy(e -> c.getAndIncrement()/chunkSize))
      .values()
      .forEach(chunk -> processChunk(chunk));
 // note: wrap the Files.lines stream in try-with-resources if the file handle should be closed promptly

Disclaimer: I have not tested either; but both approaches should work.
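For completeness, here is a self-contained sketch of the stream-based variant, assuming a hypothetical processChunk that only reports the chunk size. It also passes TreeMap::new to groupingBy so the chunks come back in file order, which a plain groupingBy does not guarantee, and wraps Files.lines in try-with-resources so the file handle is closed.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.List;
import java.util.TreeMap;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class StreamChunker {

    static final int CHUNK_SIZE = 1000; // assumed chunk size

    public static void main(String[] args) throws IOException {
        Path path = Paths.get("logs", "access.log"); // assumed path

        final AtomicInteger counter = new AtomicInteger();
        // try-with-resources closes the file handle behind Files.lines
        try (Stream<String> lines = Files.lines(path)) {
            lines.collect(Collectors.groupingBy(
                        line -> counter.getAndIncrement() / CHUNK_SIZE, // chunk index as key
                        TreeMap::new,                                   // keeps chunks in file order
                        Collectors.toList()))
                 .values()
                 .forEach(StreamChunker::processChunk);
        }
    }

    // hypothetical per-chunk processing
    static void processChunk(List<String> chunk) {
        System.out.println("processing " + chunk.size() + " lines");
    }
}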
