Problem Description
I am currently using S3 to store large quantities of account-level data such as images, text files, and other forms of durable content that users upload in my application. I am looking to take an incremental snapshot of this data (once per week) and ship it off to another S3 bucket. I'd like to do this in order to protect against accidental data loss, i.e. one of our engineers accidentally deleting a chunk of data in the S3 browser.
Can anyone suggest a methodology for achieving this? Would we need to host our own backup application on an EC2 instance? Is there an application that will handle this out of the box? The data can go into S3 Glacier and doesn't need to be readily accessible; it's more of an insurance policy than anything else.
Edit 1
I believe switching on versioning may be the answer (continuing to research this): http://docs.amazonwebservices.com/AmazonS3/latest/dev/Versioning.html
Edit 2
For others looking for answers to this question, there's a good thread on ServerFault that I only came across later:
> https://serverfault.com/questions/9171/aws-s3-bucket-backups
Recommended Answer
Enabling versioning on your bucket is the right solution. It protects against both accidental deletes and accidental overwrites.
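As a concrete sketch, versioning can be enabled on an existing bucket with the AWS CLI. The bucket name `my-app-data` below is a placeholder; substitute your own. (This is a configuration fragment and requires valid AWS credentials to run.)

```shell
# Enable versioning on the bucket. Once enabled, deleting an object only
# adds a delete marker; prior versions remain recoverable.
aws s3api put-bucket-versioning \
  --bucket my-app-data \
  --versioning-configuration Status=Enabled

# Confirm the bucket's versioning state (should report "Enabled").
aws s3api get-bucket-versioning --bucket my-app-data
```

Note that versioning cannot be fully turned off once enabled, only suspended, so new writes stop creating versions but existing versions are retained.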
S3 FAQ的数据保护"下有一个问题,恰好讨论了此问题(意外删除/覆盖): http://aws.amazon.com/s3/faqs/#Why_should_I_use_Versioning
There's a question on the S3 FAQ, under "Data Protection", that discusses exactly this issue (accidental deletes/overwrites): http://aws.amazon.com/s3/faqs/#Why_should_I_use_Versioning
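Since the question mentions that the backup data can live in S3 Glacier, one way to combine the two is a lifecycle rule that transitions noncurrent (i.e. overwritten or deleted) object versions to Glacier. This is a hedged sketch: the bucket name `my-app-data`, the rule ID, and the 30-day window are placeholder assumptions, and the command needs valid AWS credentials.

```shell
# Move noncurrent object versions to Glacier 30 days after they are
# superseded, keeping the live data in standard storage.
aws s3api put-bucket-lifecycle-configuration \
  --bucket my-app-data \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "archive-old-versions",
      "Status": "Enabled",
      "Filter": {"Prefix": ""},
      "NoncurrentVersionTransitions": [
        {"NoncurrentDays": 30, "StorageClass": "GLACIER"}
      ]
    }]
  }'
```

This matches the "insurance policy" use case: old versions stay cheap and out of the way, but remain restorable if someone deletes a chunk of data by mistake.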