I am trying to download a large gzipped data file from an S3 bucket:
S3ObjectInputStream inputStream = s3object.getObjectContent();
File newFile = new File(zipFileTempLocation + File.separator + CommonConstant.FILE_NAME);
LOGGER.info("starting to write {}", newFile.toPath());
// Decompress the S3 stream to a temp file in small chunks
try (GZIPInputStream gzipInputStream = new GZIPInputStream(inputStream);
     FileOutputStream fileOutputStream = new FileOutputStream(newFile)) {
    byte[] buffer = new byte[5000];
    int len;
    while ((len = gzipInputStream.read(buffer)) > 0) {
        fileOutputStream.write(buffer, 0, len);
    }
}
String newFileUrl = newFile.getAbsolutePath();
Path path = Paths.get(newFileUrl);
return Files.readAllBytes(path);
}
When I run my service it fails with an out-of-heap-memory error. Can you help me with this?
The out-of-memory error comes from Files.readAllBytes, which loads the entire decompressed file into a single byte array. You can have a look at this answer and stream the file instead, so it is never fully loaded into memory:
@GetMapping(value = "/downloadfile/**", produces = { MediaType.APPLICATION_OCTET_STREAM_VALUE })
public ResponseEntity<StreamingResponseBody> downloadFile(HttpServletRequest request) {
    // Reads the content from the S3 bucket as an S3ObjectInputStream
    S3Object object = publishAmazonS3.getObject("12345bucket", "/logs/file1.log");
    final StreamingResponseBody body = outputStream -> {
        // Copy in small chunks; only the buffer is held in memory at a time,
        // and try-with-resources closes the S3 stream even if the copy fails
        try (S3ObjectInputStream finalObject = object.getObjectContent()) {
            int numberOfBytesToWrite;
            byte[] data = new byte[1024];
            while ((numberOfBytesToWrite = finalObject.read(data, 0, data.length)) != -1) {
                outputStream.write(data, 0, numberOfBytesToWrite);
            }
        }
    };
    return new ResponseEntity<>(body, HttpStatus.OK);
}
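The core of the fix is the chunked-copy loop, which can be tested in isolation without S3 or Spring. Below is a minimal self-contained sketch (the class and helper names are hypothetical, not from the original post) that gzips a 1 MB payload in memory, decompresses it through the same fixed-size buffer loop, and verifies the round trip:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Arrays;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

public class StreamCopyDemo {
    // Copies the stream in fixed-size chunks, so memory use stays at the
    // buffer size regardless of how large the payload is.
    static long streamCopy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192];
        long total = 0;
        int len;
        while ((len = in.read(buffer)) != -1) {
            out.write(buffer, 0, len);
            total += len;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Build a gzipped payload in memory to stand in for the S3 object.
        byte[] original = new byte[1_000_000];
        Arrays.fill(original, (byte) 'x');
        ByteArrayOutputStream gzipped = new ByteArrayOutputStream();
        try (GZIPOutputStream gz = new GZIPOutputStream(gzipped)) {
            gz.write(original);
        }

        // Decompress with the chunked copy; only 8 KB is buffered at a time.
        ByteArrayOutputStream restored = new ByteArrayOutputStream();
        try (GZIPInputStream in = new GZIPInputStream(
                new ByteArrayInputStream(gzipped.toByteArray()))) {
            long copied = streamCopy(in, restored);
            System.out.println("copied " + copied + " bytes");
        }
        if (!Arrays.equals(original, restored.toByteArray())) {
            throw new AssertionError("round-trip mismatch");
        }
        System.out.println("round-trip ok");
    }
}
```

The same loop works whether the destination is a file, a servlet response stream, or the `outputStream` handed to a `StreamingResponseBody`; the heap footprint is the buffer, not the file.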