How to protect Duplicity backups

I use Duplicity to back up my hosting account to a remote server over WebDAV. The schedule is daily incrementals with a monthly full backup.

I also want to protect the backups in case the hosting account is hacked, so I have to be sure that the server running Duplicity cannot destroy the backups on the remote server.

Is there a recommended solution for protecting the backups?

If not, I have thought about writing a script on the remote server that makes backups read-only after they have been uploaded. (Eventually it would also delete backups older than x months.)

I can write this script, but I am not sure which files can be safely protected. If I periodically chmod all files to remove write access, will the backups continue the next day? Or does Duplicity need to write to already-uploaded files? How can I determine which files Duplicity will need to change and which it won't?

How can I delete old backups without breaking something?

I use Duplicity to back up my hosting account to a remote server over WebDAV. The schedule is daily incrementals with a monthly full backup.

I also want to protect the backups in case the hosting account is hacked, so I have to be sure that the server running Duplicity cannot destroy the backups on the remote server.

That is not what Duplicity is designed for. Its key feature is encryption, which protects your backups on possibly insecure backends.

If your machine is hacked, your main problem is probably not backup destruction but silently backing up malicious code uploaded by the attacker.

Is there a recommended solution for protecting the backups?

Not to my knowledge. One option is a second repository that you rsync to, using --link-dest (or a tool like dirvish) to get snapshot-style backups of your backups. This way an attacker could modify or corrupt your current backups, but you would still have the intact files. The remaining issue would be finding out from which point in time your backups started to be tampered with.

If not, I have thought about writing a script on the remote server that makes backups read-only after they have been uploaded. (Eventually it would also delete backups older than x months.)

That should work as long as the last Duplicity run was successful. The only time Duplicity overwrites something on the backend is when it resumes an interrupted backup.
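A minimal sketch of such a read-only script, assuming the repository path, the file naming, and the one-day grace period (files newer than a day stay writable so a resumed, interrupted backup can still finish):

```shell
#!/bin/sh
# On the remote server: strip write permission from Duplicity backup
# files once they are older than one day. Recent files are left writable
# so an interrupted backup can still be resumed and completed.
REPO=/tmp/demo-duplicity-repo

mkdir -p "$REPO"
# Stand-in for an uploaded Duplicity volume; backdated so the demo
# actually triggers the age rule (-mtime +0 = modified >24h ago).
touch "$REPO/duplicity-full.20240101T000000Z.vol1.difftar.gpg"
chmod 644 "$REPO/duplicity-full.20240101T000000Z.vol1.difftar.gpg"
touch -t 202001010000 "$REPO/duplicity-full.20240101T000000Z.vol1.difftar.gpg"

find "$REPO" -type f -mtime +0 -exec chmod a-w {} +
```

Note this only protects against accidental overwrites or a low-privilege attacker; anyone who gains root (or the file owner's account) on the remote server can chmod the files back.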

I can write this script, but I am not sure which files can be safely protected. If I periodically chmod all files to remove write access, will the backups continue the next day? Or does Duplicity need to write to already-uploaded files? How can I determine which files Duplicity will need to change and which it won't?

See my previous answer.

How can I delete old backups without breaking something?

Use Duplicity's purge commands (e.g. remove-older-than or remove-all-but-n-full), which remove only complete backup chains and so won't orphan incrementals. You could run them on your WebDAV machine as a user that still has write access to the repository.

Have fun.. ede/duply.net
