
Azure Table Storage Backup

In my Azure subscription I have a storage account with a lot of tables that contain important data. As far as I know, Azure offers point-in-time restore for blobs and geo-redundancy in the event of a failover, but I couldn't find anything about backing up table storage. The only way to do so seems to be AzCopy, which is fine and logical, but I couldn't make it work — I had permission issues even after assigning the Storage Blob Data Contributor role on my container.

So as an option, I was wondering whether there is a way to implement this in Python: loop through all the tables in a specific storage account and copy each one into another account.
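Something like this is what I had in mind — a minimal sketch using the `azure-data-tables` SDK. The connection strings are placeholders, and there is no batching or retry handling:

```python
# Minimal sketch: copy every table from one storage account to another.
# Assumes the azure-data-tables package (pip install azure-data-tables);
# the connection strings below are placeholders.

def copy_all_tables(src_service, dst_service):
    """Copy every table and its entities from src to dst.

    upsert_entity makes the copy idempotent, so the backup can be re-run.
    Returns the number of entities copied.
    """
    copied = 0
    for table in src_service.list_tables():
        dst_service.create_table_if_not_exists(table.name)
        src_table = src_service.get_table_client(table.name)
        dst_table = dst_service.get_table_client(table.name)
        for entity in src_table.list_entities():
            dst_table.upsert_entity(entity)
            copied += 1
    return copied

if __name__ == "__main__":
    from azure.data.tables import TableServiceClient

    src = TableServiceClient.from_connection_string("<source-connection-string>")
    dst = TableServiceClient.from_connection_string("<backup-connection-string>")
    print(f"Copied {copy_all_tables(src, dst)} entities")
```

For large tables this would be slow; batching writes per partition key (`submit_transaction`) would help, but I haven't got that far.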

Can anyone enlighten me on this matter please?

Did you set the Azure Storage firewall to allow access from all networks? (screenshot)

Python code is one way, but we can't design the code for you, and there isn't a ready-made example — such requests don't meet Stack Overflow's guidelines.

If you still can't figure it out with AzCopy, I would suggest using Data Factory to schedule a backup of the data from Table storage to another storage account.

  1. Create a pipeline with a Copy activity to copy the data from Table storage. See this tutorial: Copy data to and from Azure Table storage by using Azure Data Factory.
  2. Create a schedule trigger for the pipeline so the job runs automatically.
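For reference, a minimal sketch of what the Copy activity's JSON could look like — the dataset names are placeholders you would define yourself, and the sink properties follow the Azure Table connector's documented options:

```json
{
  "name": "CopyTableBackup",
  "type": "Copy",
  "inputs": [ { "referenceName": "SourceTableDataset", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "SinkTableDataset", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "AzureTableSource" },
    "sink": {
      "type": "AzureTableSink",
      "azureTableInsertType": "replace",
      "azureTablePartitionKeyName": "PartitionKey",
      "azureTableRowKeyName": "RowKey"
    }
  }
}
```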

If the Table storage has many tables, the easiest way is to use the Copy Data tool.

Update:

Copy Data tool source settings:

(screenshot)

Sink settings: auto-create the table in the sink Table storage:

(screenshot)

HTH.
