
How to transfer files from local machine to virtual folders within an Azure container (using blobxfer)

I am using blobxfer to upload some files to Azure Storage. It's like AzCopy, but for Linux (AzCopy is for Windows).

Basically, I have two folders at /home/myuser/s3/. One is called /uploads/, the other /users/. Both folders contain a large number of image files. I need to upload both of these folders to an Azure container.

Keep in mind that I don't want the contents of the folders to mix in Azure Storage; they need to be transferred to separate virtual folders in my Azure container. For example, I have a container called pictures, and I want the contents of the local folder /s3/uploads/ to reside within the virtual folder /uploads/ in the pictures container. Likewise for /s3/users/.

These virtual folders already exist in my container. The command I tried from the command line was: blobxfer mystorageaccount pictures /home/myuser/s3 --upload --storageaccountkey=<primary access key from portal.azure.com> --no-container

This fails with Unknown error (The value for one of the HTTP headers is not in the correct format.). The full traceback is:

<?xml version="1.0" encoding="utf-8"?><Error><Code>InvalidHeaderValue</Code><Message>The value for one of the HTTP headers is not in the correct format.
RequestId:61a1486c-0101-00d6-13b5-408578134000
    Time:2015-12-27T12:56:03.5390180Z</Message><HeaderName>x-ms-blob-content-length</HeaderName><HeaderValue>0</HeaderValue></Error>
Exception in thread Thread-49 (most likely raised during interpreter shutdown):
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 810, in __bootstrap_inner
  File "/home/myuser/.virtualenvs/redditpk/local/lib/python2.7/site-packages/blobxfer.py", line 506, in run
  File "/home/myuser/.virtualenvs/redditpk/local/lib/python2.7/site-packages/blobxfer.py", line 597, in putblobdata
  File "/home/myuser/.virtualenvs/redditpk/local/lib/python2.7/site-packages/blobxfer.py", line 652, in azure_request
<type 'exceptions.AttributeError'>: 'NoneType' object has no attribute 'Timeout'

What am I doing wrong, and how can I fix this?

Looking at the error, it is clear that the script is trying to create a container (pictures in your case), but the container already exists.

I quickly looked up the code on GitHub, and it seems there's an argument called createcontainer whose default value is True. Try passing this argument in your script with its value set to False; then the script shouldn't try to create the container.
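For illustration only, a hedged sketch of how that might look from the shell. The exact command-line spelling is an assumption (this answer only names the underlying createcontainer argument), so check the tool's help output first:

# Confirm how (or whether) the createcontainer argument is exposed on the CLI:
blobxfer --help | grep -i container
# Hypothetical invocation, assuming the option surfaces as --no-createcontainer:
blobxfer mystorageaccount pictures /home/myuser/s3 --upload \
    --storageaccountkey=<primary access key from portal.azure.com> --no-createcontainer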

As stated in your other question, the immediate issue is with zero-byte files. A few of the bugs surrounding these files have been fixed in the recent release of blobxfer 0.9.9.6. Please use the project's GitHub issue page if the recent release does not address the error you are seeing.

The other issue is that you would like to upload subdirectories within a parent directory to the same container. If those two directories are the only directories in the parent directory, then you can achieve what you want with a single invocation of blobxfer; otherwise, you will need two separate invocations (one per subdirectory). In either case, because you are using an absolute path, you will need to specify the --strip-components parameter (similar to the tar parameter of the same name) so the files are nested under the container in the second-level directories you want. If you use a relative base path, you will not need --strip-components, e.g., invoke blobxfer from within /home/myuser and specify s3 as your local resource.
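A minimal sketch of both approaches, using only the flag names already mentioned in the question and this answer (--upload, --storageaccountkey, --strip-components). The strip-components count of 3 is an assumption about how many leading path components (home, myuser, s3) need to be dropped, so verify it against blobxfer's help output before relying on it:

# Approach 1: relative base path, single invocation (works if uploads/ and
# users/ are the only subdirectories of s3). Run from /home/myuser:
cd /home/myuser
blobxfer mystorageaccount pictures s3 --upload \
    --storageaccountkey=<primary access key from portal.azure.com>

# Approach 2: absolute paths, one invocation per subdirectory.
# --strip-components 3 assumes the leading home/myuser/s3 components are
# dropped so the blobs land under uploads/ and users/ in the container.
blobxfer mystorageaccount pictures /home/myuser/s3/uploads --upload \
    --storageaccountkey=<primary access key from portal.azure.com> --strip-components 3
blobxfer mystorageaccount pictures /home/myuser/s3/users --upload \
    --storageaccountkey=<primary access key from portal.azure.com> --strip-components 3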

Regarding the container-already-exists issue: blobxfer should not fail if the container already exists; it ignores that error.
