
Use Python to process images in Azure blob storage

I have 1000s of images sitting in a container on my blob storage. I want to process these images one by one in Python and write the new images out into a new container (the process is basically detecting and redacting objects). Downloading the images locally is not an option because they take up way too much space.

So far, I have been able to connect to the blob and have created a new container to store the processed images in, but I have no idea how to run the code to process the pictures and save them to the new container. Can anyone help with this?

Code so far is:

from azure.storage.blob import BlockBlobService

# call the blob service for the storage account
block_blob_service = BlockBlobService(account_name='mycontainer', account_key='HJMEchn')

# create new container to store processed images
container_name = 'new_images'
block_blob_service.create_container(container_name)

Do I need to use get_blob_to_stream or get_blob_to_path from here: https://azure-storage.readthedocs.io/ref/azure.storage.blob.baseblobservice.html so I don't have to download the images?

Any help would be much appreciated!

As mentioned in the comment, you will need to download or stream your blobs, process them, and then upload the results to your new container.

You could refer to the samples below for downloading and uploading blobs.

Download the blobs:

import os

# Download the blob(s).
# Append '_DOWNLOADED' before '.txt' so you can see both files in Documents.
# (local_path and local_file_name are defined in the upload sample below.)
full_path_to_file2 = os.path.join(local_path, local_file_name.replace('.txt', '_DOWNLOADED.txt'))
print("\nDownloading blob to " + full_path_to_file2)
block_blob_service.get_blob_to_path(container_name, local_file_name, full_path_to_file2)

Upload blobs to the container:

import os
import uuid

# Create a file in Documents to test the upload and download.
local_path = os.path.expanduser("~\\Documents")
local_file_name = "QuickStart_" + str(uuid.uuid4()) + ".txt"
full_path_to_file = os.path.join(local_path, local_file_name)

# Write text to the file.
with open(full_path_to_file, 'w') as file:
    file.write("Hello, World!")

print("Temp file = " + full_path_to_file)
print("\nUploading to Blob storage as blob " + local_file_name)

# Upload the created file, use local_file_name for the blob name.
block_blob_service.create_blob_from_path(container_name, local_file_name, full_path_to_file)
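To process thousands of images one by one, you could combine the two samples above into a single loop: list the blobs in the source container, download each one, run your detection/redaction step, and upload the result to the new container. Below is a minimal sketch of that loop; the container names and the process_image function are placeholders for your own code.

import os
import tempfile

from azure.storage.blob import BlockBlobService

block_blob_service = BlockBlobService(account_name='my_account_name', account_key='my_account_key')

source_container = 'images'        # container holding the original images (assumption)
target_container = 'new_images'    # container created for the processed images

for blob in block_blob_service.list_blobs(source_container):
    # Download the blob to a temporary local file
    local_file = os.path.join(tempfile.gettempdir(), os.path.basename(blob.name))
    block_blob_service.get_blob_to_path(source_container, blob.name, local_file)

    # process_image is a placeholder for your detection/redaction step;
    # it should return the path of the processed image
    processed_file = process_image(local_file)

    # Upload the processed image to the new container under the same blob name
    block_blob_service.create_blob_from_path(target_container, blob.name, processed_file)

This still writes each image to disk briefly, but only one image at a time, so you never need space for the whole container at once.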

Update:

Try using streams as in the code below; for more details you could see the two links: link1 and link2 (they are related issues, so you could read them together).

from azure.storage.blob import BlockBlobService
from io import BytesIO
from shutil import copyfileobj

with BytesIO() as input_blob:
    with BytesIO() as output_blob:
        block_blob_service = BlockBlobService(account_name='my_account_name', account_key='my_account_key')
        # Download as a stream
        block_blob_service.get_blob_to_stream('mycontainer', 'myinputfilename', input_blob)
        input_blob.seek(0)  # rewind the stream before reading it back

        # Do whatever you want to do - here I am just copying the input stream to the output stream
        copyfileobj(input_blob, output_blob)
        ...

        # Rewind the output stream before uploading it
        output_blob.seek(0)

        # Create a new blob
        block_blob_service.create_blob_from_stream('mycontainer', 'myoutputfilename', output_blob)

        # Or overwrite the same blob
        block_blob_service.create_blob_from_stream('mycontainer', 'myinputfilename', output_blob)
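Applied to your image case, the same stream approach could look like the sketch below. It keeps everything in memory and uses Pillow with a Gaussian blur purely as a stand-in for your detection and redaction logic; the account name, container names, and the blur step are all assumptions you would replace with your own.

from io import BytesIO

from azure.storage.blob import BlockBlobService
from PIL import Image, ImageFilter

block_blob_service = BlockBlobService(account_name='my_account_name', account_key='my_account_key')

source_container = 'images'        # container holding the original images (assumption)
target_container = 'new_images'    # container for the processed images

for blob in block_blob_service.list_blobs(source_container):
    # Download the image into an in-memory stream
    input_blob = BytesIO()
    block_blob_service.get_blob_to_stream(source_container, blob.name, input_blob)
    input_blob.seek(0)  # rewind before reading

    # Placeholder processing step: blur the image instead of detecting/redacting objects
    image = Image.open(input_blob).convert("RGB")
    image = image.filter(ImageFilter.GaussianBlur(radius=10))

    # Save the processed image into another in-memory stream and upload it
    output_blob = BytesIO()
    image.save(output_blob, format='PNG')
    output_blob.seek(0)  # rewind before uploading
    block_blob_service.create_blob_from_stream(target_container, blob.name, output_blob)

This way nothing is written to local disk at all, which matches your space constraint; the trade-off is that each image must fit in memory while it is being processed.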
