
Best way to read an input stream to a buffer

For reading any input stream into a buffer there are two common approaches. Can someone help me understand which method is better, and why? And in which situations should each be used?

  • Reading line by line and appending it to the buffer.

Eg:

public String fileToBuffer(InputStream is, StringBuffer strBuffer) throws IOException {
    StringBuffer buffer = strBuffer;
    BufferedReader br = null;

    try {
        br = new BufferedReader(new InputStreamReader(is));
        String line;

        while ((line = br.readLine()) != null) {
            buffer.append(line).append('\n');
        }
    } finally {
        if (br != null) {
            br.close(); // also closes the wrapped InputStreamReader and InputStream
        }
    }

    return buffer.toString();
}

  • Reading up to a buffer size, i.e. 1024 chars, into a char array.

Eg:

InputStreamReader isr = new InputStreamReader(is);
final int bufferSize = 1024;
char[] buffer = new char[bufferSize];
StringBuffer strBuffer = new StringBuffer();

/* read the base script into the string buffer */
try {
    int read;
    while ((read = isr.read(buffer, 0, bufferSize)) != -1) {
        strBuffer.append(buffer, 0, read);
    }
} catch (IOException e) {
    // don't swallow the exception silently; log it or rethrow
    throw new RuntimeException(e);
}

Consider

public String fileToBuffer(InputStream is, StringBuffer strBuffer) throws IOException {
    StringBuilder sb = new StringBuilder(strBuffer);
    try (BufferedReader rdr = new BufferedReader(new InputStreamReader(is))) {
        for (int c; (c = rdr.read()) != -1;) {
            sb.append((char) c);
        }
    }
    return sb.toString();
}
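A minimal, self-contained check of the approach above can look like this (the `ByteArrayInputStream` input and the class/method names are just for demonstration):

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

public class FileToBufferDemo {
    // Same technique as the answer above: try-with-resources closes the
    // reader (and the underlying stream) even if read() throws.
    public static String fileToBuffer(InputStream is, StringBuffer strBuffer) throws IOException {
        StringBuilder sb = new StringBuilder(strBuffer);
        try (BufferedReader rdr = new BufferedReader(
                new InputStreamReader(is, StandardCharsets.UTF_8))) {
            for (int c; (c = rdr.read()) != -1;) {
                sb.append((char) c);
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        InputStream in = new ByteArrayInputStream(
                "hello\nworld".getBytes(StandardCharsets.UTF_8));
        String out = fileToBuffer(in, new StringBuffer("prefix: "));
        System.out.println(out); // prints "prefix: hello" then "world"
    }
}
```

Note that reading char-by-char keeps the original line terminators, unlike `readLine`, which strips them.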

It depends on the purpose.

For working with text files, read lines (if you need them).

For working with raw binary data, use chunks of bytes.

In your examples, chunks of bytes are more robust. What if a line is extremely long and overflows some intermediate object?

If your file is binary, do you even know how long a "line" will be? It may be the size of the whole file.

Trying to swallow too big a String in one go may cause an OutOfMemoryError. With 1024-byte chunks that (almost) never happens.

Chunking by 1024 bytes may take longer, but it's more reliable.
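The chunked reading this answer recommends can be sketched for raw bytes roughly as follows (class and method names here are illustrative, not from the original posts):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class ChunkedRead {
    // Reads an arbitrary stream in fixed-size chunks; memory held per
    // iteration is bounded by the chunk buffer, not by the stream length.
    public static byte[] readAll(InputStream in) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        byte[] chunk = new byte[1024];
        int read;
        while ((read = in.read(chunk)) != -1) {
            out.write(chunk, 0, read); // append only the bytes actually read
        }
        return out.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[5000]; // deliberately larger than one chunk
        for (int i = 0; i < data.length; i++) {
            data[i] = (byte) i;
        }
        byte[] copy = readAll(new ByteArrayInputStream(data));
        System.out.println(copy.length); // 5000
    }
}
```

Because `read` may return fewer bytes than the buffer size, the loop always appends exactly `read` bytes rather than the whole array.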

Using readLine isn't so neat, since it strips the line terminators. The asker's second method is quite standard, but the method below is more concise (and arguably better):

// read the whole InputStream into a String
public String inputstream2str(InputStream stream) {
    // "\\A" matches the beginning of input, so next() returns the entire stream
    try (Scanner s = new Scanner(stream).useDelimiter("\\A")) {
        return s.hasNext() ? s.next() : "";
    }
}

From a String you can then convert to a byte array or whatever buffer you want.
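For that last conversion step, a small sketch: going from a String to bytes and back is safe as long as you name the charset explicitly (the platform default varies between JVMs).

```java
import java.nio.charset.StandardCharsets;

public class StringToBytes {
    public static void main(String[] args) {
        String s = "hello";
        // Always pass an explicit charset rather than relying on the default.
        byte[] bytes = s.getBytes(StandardCharsets.UTF_8);
        String back = new String(bytes, StandardCharsets.UTF_8);
        System.out.println(back.equals(s)); // true
    }
}
```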
