
Amazon SDK download large zip file from S3 bucket in Spring Boot Java

What I have done is try to download a large zip file from an S3 bucket:

        S3ObjectInputStream inputStream = s3object.getObjectContent();
        File newFile = new File(zipFileTempLocation + File.separator + CommonConstant.FILE_NAME);
        FileOutputStream fileOutputStream = new FileOutputStream(newFile);
        GZIPInputStream gzipInputStream = new GZIPInputStream(inputStream);
        LOGGER.info("starting to write {}", newFile.toPath());
        byte[] buffer = new byte[5000];
        int len;
        while ((len = gzipInputStream.read(buffer)) > 0) {
            fileOutputStream.write(buffer, 0, len);
        }
        gzipInputStream.close();
        fileOutputStream.close();
        String newFileUrl = newFile.getAbsolutePath();
        Path path = Paths.get(newFileUrl);
        return Files.readAllBytes(path);

}

When I try to run my service, it fails with an out-of-memory (heap) error. Can you help me fix this?
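For context, the heap pressure in the snippet above comes from the final `Files.readAllBytes` call, which reads the entire decompressed file back into memory as a single byte[]. Below is a minimal sketch, assuming the caller can work with a file on disk rather than a byte[]; it streams the decompressed object straight to the target file with `java.nio.file.Files` and returns its path (`s3object`, `zipFileTempLocation` and `CommonConstant.FILE_NAME` are carried over from the question's snippet):

        // stream the decompressed content to disk in small chunks instead of
        // buffering it; names below are taken from the question's snippet
        Path target = Paths.get(zipFileTempLocation, CommonConstant.FILE_NAME);
        try (GZIPInputStream gzipInputStream = new GZIPInputStream(s3object.getObjectContent())) {
            Files.copy(gzipInputStream, target, StandardCopyOption.REPLACE_EXISTING);
        }
        return target; // hand back the java.nio.file.Path instead of a byte[]

Whether this is enough depends on how the returned data is consumed; if the bytes ultimately have to go back to an HTTP client, the streaming approach in the answer below avoids the temp file entirely.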

You can stream it as shown below, so the content is not loaded into memory.

  @GetMapping(value = "/downloadfile/**", produces = { MediaType.APPLICATION_OCTET_STREAM_VALUE })
    public ResponseEntity<StreamingResponseBody> downloadFile(HttpServletRequest request) {
       //reads the content from S3 bucket and returns a S3ObjectInputStream
       S3Object object = publishAmazonS3.getObject("12345bucket", "/logs/file1.log");
       S3ObjectInputStream finalObject = object.getObjectContent();

        final StreamingResponseBody body = outputStream -> {
            int numberOfBytesToWrite = 0;
            byte[] data = new byte[1024];
            while ((numberOfBytesToWrite = finalObject.read(data, 0, data.length)) != -1) {
                System.out.println("Writing some bytes..");
                outputStream.write(data, 0, numberOfBytesToWrite);
            }
            finalObject.close();
        };
        return new ResponseEntity<>(body, HttpStatus.OK);
    }
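As a follow-up, here is a self-contained sketch (an assumption-laden variant, not the answer author's code) that combines the streaming pattern above with the GZIP decompression from the question and adds a Content-Disposition header. The bucket name, key, and `DownloadController` class are placeholders, and `InputStream.transferTo` requires Java 9+:

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.model.S3Object;
    import org.springframework.http.HttpHeaders;
    import org.springframework.http.MediaType;
    import org.springframework.http.ResponseEntity;
    import org.springframework.web.bind.annotation.GetMapping;
    import org.springframework.web.bind.annotation.RestController;
    import org.springframework.web.servlet.mvc.method.annotation.StreamingResponseBody;

    import java.util.zip.GZIPInputStream;

    @RestController
    public class DownloadController {

        private final AmazonS3 publishAmazonS3;

        public DownloadController(AmazonS3 publishAmazonS3) {
            this.publishAmazonS3 = publishAmazonS3;
        }

        @GetMapping(value = "/downloadfile", produces = MediaType.APPLICATION_OCTET_STREAM_VALUE)
        public ResponseEntity<StreamingResponseBody> downloadDecompressed() {
            // bucket and key are placeholders
            S3Object object = publishAmazonS3.getObject("my-bucket", "logs/file1.gz");

            StreamingResponseBody body = outputStream -> {
                // decompress while copying to the servlet output stream;
                // neither the compressed nor the decompressed content is fully buffered
                try (GZIPInputStream gzip = new GZIPInputStream(object.getObjectContent())) {
                    gzip.transferTo(outputStream); // Java 9+
                }
            };

            return ResponseEntity.ok()
                    .header(HttpHeaders.CONTENT_DISPOSITION, "attachment; filename=\"file1.log\"")
                    .body(body);
        }
    }

Closing the GZIPInputStream in the try-with-resources also closes the underlying S3ObjectInputStream, which releases the HTTP connection back to the SDK's connection pool.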
