
Elegant way to read large data from a FilterInputStream object

I am getting a FilterInputStream object as the return type from a function. The file I receive as a stream is a log file, so it can be large, and I do not want to read all the data at once. But reading the data in a loop is a tedious job.

The data in the file is newline-separated, so I need to split at every newline. Using a constant-size byte array with public int read(byte[] b, int off, int len) gives rise to many edge cases. I do not want to read it all at once because it can be of large size.

Is there an elegant way to do this?

PS: I am in particular referring to S3ObjectInputStream, which extends FilterInputStream and has a read() function.

Wrap a BufferedReader around an InputStreamReader that wraps the FilterInputStream, and call readLine().
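A minimal sketch of that wrapping. A ByteArrayInputStream stands in here for the S3ObjectInputStream returned by your function (any InputStream, including a FilterInputStream, works the same way), and UTF-8 is an assumed charset for the log file:

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class LogLineReader {

    // Reads the stream line by line; only one buffer's worth of bytes
    // is held in memory at a time, so large files are safe to process.
    public static List<String> readLines(InputStream in) throws IOException {
        List<String> lines = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(in, StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                lines.add(line); // in real code, process each log line here
            }
        }
        return lines;
    }

    public static void main(String[] args) throws IOException {
        // Stand-in for the S3ObjectInputStream returned by your function
        InputStream in = new ByteArrayInputStream(
                "line1\nline2\nline3".getBytes(StandardCharsets.UTF_8));
        System.out.println(readLines(in)); // [line1, line2, line3]
    }
}
```

For a genuinely large log you would not accumulate the lines in a list as above; process each line inside the loop instead, so memory use stays constant regardless of file size.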

OK, the answer above is right: you can use the BufferedReader class. It has a method called readLine() that returns a String object instead of a byte array. Like this:

BufferedReader reader = new BufferedReader(new FileReader(new File("the file path")));
String line = reader.readLine();
while (line != null) {
    // process the line here
    line = reader.readLine();
}
reader.close();
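On Java 8 or later, BufferedReader.lines() offers a slightly more compact variant: it returns a lazy Stream&lt;String&gt;, so lines are still pulled from the underlying stream one at a time. A sketch, again using a ByteArrayInputStream as a stand-in for the real stream:

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class StreamLines {

    // Counts non-blank lines without loading the whole file into memory;
    // lines() is lazy, so each line is read from the stream on demand.
    public static long countNonBlank(InputStream in) {
        BufferedReader reader = new BufferedReader(
                new InputStreamReader(in, StandardCharsets.UTF_8));
        return reader.lines().filter(l -> !l.trim().isEmpty()).count();
    }

    public static void main(String[] args) {
        InputStream in = new ByteArrayInputStream(
                "error: x\n\nerror: y\n".getBytes(StandardCharsets.UTF_8));
        System.out.println(countNonBlank(in)); // 2
    }
}
```

The same pattern works for filtering, mapping, or forEach-style processing of each log line.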
