
Java heap size error while using batchDelete() on PaginatedScanList - DynamoDB

I need to delete quite a few records from DynamoDB: around 1,500.

First, I pull the records from DynamoDB into a PaginatedScanList (because it's lazily loaded).

Then, I want to delete them in batches so I call batchDelete() on the list.

After about 10 minutes I get OutOfMemoryError: Java heap space or GC overhead limit exceeded. I thought it would work thanks to PaginatedScanList being lazily loaded.

I tried removing the records with list.forEach(record -> mapper.delete(record)) instead, and that works without errors, but it's too slow for my needs.

I can assign at most 1 GB of heap space, and I can also invoke the requests more frequently.

This is the code I have:

public void deleteOldRecords(PaginatedScanList<Records> recordsToDelete) {
    mapper.batchDelete(recordsToDelete);
}

where mapper is a DynamoDBMapper. The error looks like this:

00:08:29 [pool-3-thread-1] o.s.s.s.TaskUtils$LoggingErrorHandler : Unexpected error occurred in scheduled task.
java.lang.OutOfMemoryError: Java heap space

Happy to see any suggestions.
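For context, a chunked middle ground between one big batchDelete() and per-record delete() would look roughly like the sketch below (untested). The 25-item chunk matches DynamoDB's BatchWriteItem limit, and for this to actually cap memory the scan would also need the ITERATION_ONLY pagination loading strategy, because the default LAZY_LOADING strategy keeps every page it has already loaded in memory.

import com.amazonaws.services.dynamodbv2.datamodeling.PaginatedScanList;
import java.util.ArrayList;
import java.util.List;

// Sketch: walk the (ideally ITERATION_ONLY) scan result and delete in
// chunks of 25, so only one small batch of mapped objects is referenced
// at a time. "mapper" is the same DynamoDBMapper field as above.
public void deleteOldRecordsChunked(PaginatedScanList<Records> recordsToDelete) {
    List<Records> batch = new ArrayList<>(25);
    for (Records record : recordsToDelete) {   // pages are fetched lazily while iterating
        batch.add(record);
        if (batch.size() == 25) {              // 25 = BatchWriteItem item limit
            mapper.batchDelete(batch);
            batch.clear();
        }
    }
    if (!batch.isEmpty()) {
        mapper.batchDelete(batch);             // flush the remainder
    }
}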

Why not just use a new table, instead of deleting from the current one?

Because there are more items in the table than just the ones I need to delete.

I researched a bit and found that the DynamoDB TTL (Time-To-Live) feature fits my needs perfectly, without any additional handling.
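For reference, enabling TTL via the AWS SDK for Java v1 looks roughly like the sketch below; the table name "Records" and the "expiresAt" attribute name are placeholders, and each item still needs that attribute populated with an epoch-seconds expiry timestamp.

import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.model.TimeToLiveSpecification;
import com.amazonaws.services.dynamodbv2.model.UpdateTimeToLiveRequest;

public class EnableTtl {
    public static void main(String[] args) {
        AmazonDynamoDB client = AmazonDynamoDBClientBuilder.defaultClient();

        // Tell DynamoDB to expire items whose "expiresAt" attribute
        // (a Number holding epoch seconds) lies in the past.
        UpdateTimeToLiveRequest request = new UpdateTimeToLiveRequest()
                .withTableName("Records")                        // placeholder table name
                .withTimeToLiveSpecification(new TimeToLiveSpecification()
                        .withAttributeName("expiresAt")          // placeholder attribute
                        .withEnabled(true));

        client.updateTimeToLive(request);
    }
}

DynamoDB then removes expired items in the background (typically within a day or two of expiry), so no batch-deletion code is needed at all.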

Thank you for your responses though.
