Copy of huge data from one collection to another collection in MongoDB using Spring Boot
We are writing a scheduler to back up data from one collection to another collection in MongoDB using Spring Boot. The data can be 500K to 1 million documents. Once the copy is complete, we should delete the data from the old collection. Currently we are using Spring Data pagination to fetch the data in chunks, save each chunk to the new collection, and then delete it from the old one.

Is this approach fine, or is there a more optimal approach you would suggest?
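For reference, the chunked copy-then-delete loop the question describes can be sketched as below. This is a schematic illustration only: plain in-memory lists stand in for the Spring Data repositories, and the method name `copyInChunks` and the page size are made up for the example (in the real application the `target.addAll` / `source.subList(...).clear()` calls would be `repository.saveAll(page)` and `repository.deleteAll(page)` against MongoDB).

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedBackup {

    // Copies documents from source to target one page at a time,
    // deleting each page from the source only after it has been saved,
    // mirroring the paginated copy-then-delete approach in the question.
    static <T> void copyInChunks(List<T> source, List<T> target, int pageSize) {
        while (!source.isEmpty()) {
            int end = Math.min(pageSize, source.size());
            // Snapshot the current page (like fetching one Pageable slice).
            List<T> page = new ArrayList<>(source.subList(0, end));
            target.addAll(page);            // saveAll(page) in Spring Data
            source.subList(0, end).clear(); // deleteAll(page) in Spring Data
        }
    }

    public static void main(String[] args) {
        List<Integer> source = new ArrayList<>();
        for (int i = 0; i < 10; i++) source.add(i);
        List<Integer> target = new ArrayList<>();

        copyInChunks(source, target, 3);
        System.out.println("copied=" + target.size() + " remaining=" + source.size());
    }
}
```

Note that with 500K to 1M documents this loop pulls every page through the application's heap, which is the overhead the answer below is reacting to.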
Since you are using Spring Data with pagination for this task, every document is pulled through your application container just to be copied. Instead, you can have the scheduler trigger a set of system commands (mongoexport, then mongoimport, then delete the data from the source collection after the import), for example with Spring Batch's SystemCommandTasklet:
SystemCommandTasklet tasklet = new SystemCommandTasklet();
// Illustrative command only -- substitute your own mongoexport/mongoimport invocation.
tasklet.setCommand("mongoexport --db=mydb --collection=source --out=backup.json");
tasklet.setWorkingDirectory("/home/merlin");
tasklet.setTimeout(3600000); // required by SystemCommandTasklet; fails the step after 1 hour