I call a REST service that returns the ID of an object in the first call, along with the length in bytes of the object to read.
However, the CSV data returned can be up to 200,000 lines long. Reading all 200,000 lines from the REST call into memory in a Java program causes an out-of-memory exception.
The Map returned is the CSV structure that the Spring template produces. The problem is that when the equivalent CSV file is over 200,000 lines, the LinkedHashMap can't hold it, so I get an out-of-memory exception.
Is there any way to use the RestTemplate API to process the data as a stream that can be read gradually?
UriComponentsBuilder builder = UriComponentsBuilder.newInstance();
builder.scheme("https").host(hostName).pathSegment("file-object");

// Call the third-party REST API
ResponseEntity<?> responseEntity =
        restTemplate.exchange(
                builder.queryParam("fileId", "fjr666eH").build().toUri(),
                HttpMethod.GET,
                new HttpEntity<>(httpAuthorizationHeaders("UID", "XFCode:66w000")),
                Object.class
        );

// Spring deserializes the CSV response into a LinkedHashMap. Because the
// payload is large, this is where the OOM exception occurs.
LinkedHashMap<String, Object> map =
        (LinkedHashMap<String, Object>) responseEntity.getBody();
I was able to solve this with the streaming support in Spring's REST client: using SimpleClientHttpRequestFactory and a BufferedReader, I could set the chunk size and read the response a piece at a time instead of buffering the whole body.
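As a rough illustration of the approach, the key idea is to consume the response body one line at a time rather than letting RestTemplate deserialize the whole payload. With Spring you would typically get the raw InputStream inside a ResponseExtractor passed to restTemplate.execute(...) (shown only as a comment below, since the wiring depends on your setup); the line-by-line processing itself is plain java.io, so this sketch is runnable standalone. The method name, CSV layout, and row handling here are hypothetical.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;

// Sketch: stream a large CSV response instead of materializing it in a Map.
// With Spring, one way to obtain the InputStream without buffering is:
//   restTemplate.execute(uri, HttpMethod.GET, requestCallback,
//           response -> CsvStreamProcessor.processCsv(response.getBody()));
class CsvStreamProcessor {

    /**
     * Reads the body one line at a time, so memory use is bounded by the
     * longest line rather than the whole payload. Returns the number of
     * data rows seen (header excluded).
     */
    static long processCsv(InputStream body) throws IOException {
        long rows = 0;
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(body, StandardCharsets.UTF_8))) {
            reader.readLine();                          // consume header row
            String line;
            while ((line = reader.readLine()) != null) {
                String[] fields = line.split(",", -1);  // process one row here
                if (fields.length > 0) {
                    rows++;
                }
            }
        }
        return rows;
    }
}
```

Because each row is handled and then discarded, a 200,000-line file never has to fit in memory at once; combining this with a SimpleClientHttpRequestFactory configured for chunked reads keeps the client-side footprint small.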