
Facing a performance issue converting raw data from the byte array of an HTTP request

We are using the code below to extract the raw data from an HTTP request, and it is taking quite a long time. CPU utilization also peaks while it runs. The request carries an XML payload of close to 4000-5000 characters. Is there any way we can rewrite this code to save time and CPU utilization?

private byte[] getRequestBytes(HttpServletRequest request) throws IOException {
    byte[] requestBytes = null;
    byte[] streamBytes = new byte[1];
    InputStream stream = request.getInputStream();
    int length = 0;
    ByteArrayOutputStream arrayOutputStream = new ByteArrayOutputStream();
    while((length = stream.read(streamBytes,0,1)) != -1) {
        arrayOutputStream.write(streamBytes);
    }
    requestBytes = arrayOutputStream.toByteArray();
    return requestBytes;
}

Java version is 1.7u45

Here are some issues with the code:

  1. byte[] streamBytes = new byte[1]; — this buffer is far too small; use something like 4096 bytes.

  2. You are not closing your stream, which may lead to a resource leak.

  3. stream.read(streamBytes, 0, 1) reads only one byte per loop iteration, which leads to poor performance: one read call (and one write call) per byte of the payload.

  4. As written, the length variable is redundant; you could just do while (stream.read(streamBytes, 0, 1) != -1). Note, however, that once you switch to a larger buffer (point 1), you need length again, to write only the bytes actually read: arrayOutputStream.write(streamBytes, 0, length).
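Putting those fixes together, a corrected version might look like the sketch below. It is a minimal, self-contained example: the helper name readAll is my own, and it is exercised here against a ByteArrayInputStream rather than a live HttpServletRequest (inside getRequestBytes you would pass request.getInputStream() instead). Java 7, which the question targets, already supports try-with-resources, so the stream is closed automatically.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class RequestReader {

    // Read the entire stream using a 4 KB buffer instead of one byte at a time.
    static byte[] readAll(InputStream stream) throws IOException {
        ByteArrayOutputStream arrayOutputStream = new ByteArrayOutputStream();
        byte[] buffer = new byte[4096];
        int length;
        while ((length = stream.read(buffer)) != -1) {
            // Write only the bytes actually read, not the whole buffer.
            arrayOutputStream.write(buffer, 0, length);
        }
        return arrayOutputStream.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        // Simulated request payload, standing in for request.getInputStream().
        byte[] payload = new byte[10000];
        for (int i = 0; i < payload.length; i++) {
            payload[i] = (byte) (i % 128);
        }
        // try-with-resources closes the stream when the block exits.
        try (InputStream stream = new ByteArrayInputStream(payload)) {
            byte[] result = readAll(stream);
            System.out.println(result.length); // prints 10000
        }
    }
}
```

With a 4096-byte buffer, a 5000-character payload is consumed in two read calls instead of roughly 5000, which removes the per-byte call overhead that was driving up CPU time.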
