
How to make a backup of Azure table storage

I have a requirement where I need to migrate data in Azure table storage from one table to another (both tables could be either in the same subscription or in different subscriptions).

Is there any way in Azure table storage to do the above, like in SQL Server where a user can generate scripts or a backup of an entire database or an individual table?

Short answer: there's no built-in backup. Longer answer:

  • There's no built-in backup service for table storage, nor a "snapshot" feature (which blobs have): you'll need to do your own copy operation, from one table to another. There's a REST API available, along with SDKs, PowerShell cmdlets, CLI commands, and AzCopy (all built upon the API). There are also a bunch of 3rd-party tools you can search for. How you accomplish your table-copying is up to you.

  • Table storage is durable storage: triple-replicated within a region (and optionally geo-replicated to another region). Even if storage became unavailable in the primary region, you'd have the option of reading from the paired region (assuming you enabled geo-redundancy on your storage account). Note: this is not the same as a backup; if you delete an entity in a table, that deletion is replicated everywhere.

  • Storage-copying (e.g. copying entities out of one table into another) works the same regardless of subscription. Storage accounts are keyed on account namespace + access key (and optionally SAS).
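Whatever tool you pick, the core of a do-it-yourself copy is the same: enumerate entities from the source table and upsert them into the target in batches. Here is a minimal Python sketch of that loop. It is deliberately SDK-agnostic: `source_entities` and `upsert_batch` are illustrative names I introduce here, standing in for whatever your SDK of choice provides for reading and batch-writing entities.

```python
from itertools import islice

def copy_entities(source_entities, upsert_batch, batch_size=100):
    """Copy entities from an iterable into a target via a batched upsert callback.

    source_entities: iterable of dicts (each carrying PartitionKey/RowKey).
    upsert_batch:    callable that writes a list of entities to the target
                     table (e.g. via your SDK's batch/transaction API).
    Returns the number of entities copied.

    Note: real table-storage batch transactions require all entities in a
    batch to share a PartitionKey and allow at most 100 entities per batch.
    """
    copied = 0
    it = iter(source_entities)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            break
        upsert_batch(batch)
        copied += len(batch)
    return copied

# In-memory stand-in for a target table, keyed the way table storage is:
target = {}

def upsert(batch):
    for e in batch:
        target[(e["PartitionKey"], e["RowKey"])] = e

source = [{"PartitionKey": "p", "RowKey": str(i), "Value": i} for i in range(250)]
print(copy_entities(source, upsert, batch_size=100))  # prints 250
```

In a real job you would replace the in-memory `target` with calls into your SDK, and add retry handling around `upsert_batch`.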

I realize this is an old question, but I thought I'd leave this response for those who are still looking for ways to back up Azure table data in 2018.

I've seen a lot of suggestions for using AzCopy, which looks like a great way to do it.

However, if using C# works better for you, I wrote a tool (that my workplace allowed me to open source) which is on GitHub: https://github.com/Watts-Energy/Watts.Azure#azure-data-factory

The main objective of the project is not backups, but it can be used for exactly that, and we have backups running as Azure Web Jobs using the functionality therein. We open sourced it because we figured it could prove useful to others besides us, since it allows you to do 'incremental' backups, which I don't know whether you can accomplish with AzCopy. I'm not saying you can't, only that I haven't a clue whether that's possible.

The idea is that you create a small console application in .NET (to be hosted as an Azure Web Job, for example) and you can do something like this:

DataCopyBuilder
.InDataFactoryEnvironment(environment)
.UsingDataFactorySetup(environment.DataFactorySetup)
.UsingDefaultCopySetup()
.WithTimeoutInMinutes(numberOfMinutes)
.AuthenticateUsing(authentication)
.CopyFromTable(sourceTable)
.WithSourceQuery(null)
.ToTable(targetTable)
.ReportProgressToConsole()
.StartCopy();

If, when the job runs, you store the time (in UTC) at which you started your last copy operation, you can supply a 'source query' (example: ' Timestamp gt datetime'{lastBackupStarted.ToIso8601()}' ') rather than null as in the example above, and it will only copy data modified since that date. It's explained in greater detail in the project README.
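The incremental trick above boils down to building an OData filter against the service-maintained Timestamp property. Here is a small Python sketch of that string construction (the helper name is mine, not from the library); it formats a UTC datetime into the ISO 8601 literal form that table storage queries expect:

```python
from datetime import datetime, timezone

def incremental_filter(last_backup_started: datetime) -> str:
    """Build an OData filter selecting entities modified after the given time.

    Table storage compares each entity's service-maintained Timestamp
    property against an ISO 8601 datetime literal.
    """
    iso = last_backup_started.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    return f"Timestamp gt datetime'{iso}'"

print(incremental_filter(datetime(2018, 5, 1, 12, 0, tzinfo=timezone.utc)))
# Timestamp gt datetime'2018-05-01T12:00:00Z'
```

Each run then records its own start time, and the next run passes that time in as `last_backup_started`, so only entities modified in between are copied.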

Again, not sure if it's useful to anyone, but it does solve some challenges we have had.

