
How can I get the number (or names) of folders and files at each level inside an Azure Blob Storage container?

I have a storage account "STR_acc" and inside it a container "data_store". It contains multiple folders and subfolders with files. I need to count the files and folders present and, if possible, get the cumulative size of the files. Is there a function to do that? In Databricks I had dbutils for this.

I did:

from azure.storage.blob import BlobServiceClient

STORAGEACCOUNTNAME = "STR_acc"
STORAGEACCOUNTKEY = "some_key"
CONTAINERNAME = "data_store"

MY_CONNECTION_STRING2 = """DefaultEndpointsProtocol=https;AccountName= ---whatever"""
blob_service_client = BlobServiceClient.from_connection_string(MY_CONNECTION_STRING2)

Then I used:

container_client = blob_service_client.get_container_client(CONTAINERNAME)

but I couldn't find any useful function on it.

One idea is that if I could get the list of folders, I could count them with len(), but I can't find a way to get that list.
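That idea can be sketched with the v12 SDK's walk_blobs(), which lists one level at a time: given a delimiter, it yields BlobPrefix entries (whose names end in the delimiter) for virtual folders and ordinary blob entries for files. A minimal sketch, assuming the container_client from above; split_level is a hypothetical helper name of my own:

```python
def split_level(names, delimiter="/"):
    """Split one listing level into virtual-folder names and file names.

    walk_blobs() reports virtual folders as BlobPrefix items whose names
    end with the delimiter, so a trailing-delimiter check separates them.
    """
    folders = [n for n in names if n.endswith(delimiter)]
    files = [n for n in names if not n.endswith(delimiter)]
    return folders, files

# Usage with the container_client from the question (needs credentials):
# names = [item.name for item in container_client.walk_blobs(delimiter="/")]
# folders, files = split_level(names)
# print(len(folders), "folders and", len(files), "files at the top level")
```

Calling walk_blobs again with name_starts_with set to one of the folder names would descend one more level, so the same helper works recursively.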

We can count the files and folders present with the following logic. But I am not sure whether getting the size of a file or folder is an option Azure offers now.

# legacy azure-storage-blob (<= 2.x) SDK
from azure.storage.blob.blockblobservice import BlockBlobService

blob_service = BlockBlobService(account_name='storage-account-name', account_key='access-key')
containers = blob_service.list_containers()
for container in containers:
    blobs = blob_service.list_blobs(container.name)
    # apply your own counting logic here, e.g. len(list(blobs))
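On the size question: in the v12 SDK the questioner is already using, each item returned by list_blobs() carries a size attribute in bytes, so cumulative sizes per top-level folder can be aggregated from a flat listing. A sketch under that assumption; folder_sizes is a hypothetical helper name:

```python
from collections import defaultdict

def folder_sizes(blobs, delimiter="/"):
    """Aggregate (name, size_in_bytes) pairs into per-top-level-folder totals.

    Blobs with no delimiter in their name (files at the container root)
    are grouped under the empty-string key.
    """
    totals = defaultdict(int)
    for name, size in blobs:
        if delimiter in name:
            top = name.split(delimiter, 1)[0] + delimiter
        else:
            top = ""
        totals[top] += size
    return dict(totals)

# Usage with the v12 container_client from the question (needs credentials):
# pairs = ((b.name, b.size) for b in container_client.list_blobs())
# print(folder_sizes(pairs))  # e.g. {"folder1/": 123456, "": 789}
```

Splitting on the full remaining path instead of only the first segment would give per-subfolder totals at deeper levels.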

