
Azure Blob Storage upload failing to upload multiple files

I am using blockBlobClient to upload multiple files. It works with a small array of files, but fails when I have many files (~50-100+). I assume I am hitting a connection limit.

if (!files.length) return [];
try {
  console.log('length of files: ', files.length);
  // Await all uploads; otherwise the promises are fired and forgotten
  return await Promise.all(files.map((file) => uploadToBlob(file)));
} catch (err) {
  console.error('upload failed: ', err);
  throw err;
}
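One likely cause is that files.map starts every upload at once with nothing capping concurrency, so 50-100+ requests hit the service simultaneously. A minimal sketch of batching the uploads (uploadInBatches and the batch size are illustrative names, not part of the Azure SDK):

```javascript
// Sketch: upload in fixed-size batches so at most `batchSize` requests
// are in flight at once. `uploadToBlob` is assumed to return a promise.
async function uploadInBatches(files, uploadToBlob, batchSize = 10) {
  const results = [];
  for (let i = 0; i < files.length; i += batchSize) {
    const batch = files.slice(i, i + batchSize);
    // Wait for the whole batch to finish before starting the next one
    results.push(...(await Promise.all(batch.map((f) => uploadToBlob(f)))));
  }
  return results;
}
```

Results come back in the original order, since Promise.all preserves the order of its input array.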

Then I have this uploadToBlob function, which uploads a file to Blob Storage:

const uploadToBlob = async (file) => {
  // blobName and content are derived from `file` (elided in the question)
  const containerClient = await createBlobContainer(
    blobServiceClient,
    containerName
  );

  const blockBlobClient = containerClient.getBlockBlobClient(
    blobName
  );

  const uploadBlobResponse = await blockBlobClient.upload(
    content, 
    Buffer.byteLength(content)
  );

  console.log(`Upload block blob ${blobName} successfully. Request id: `, uploadBlobResponse.requestId);
  return uploadBlobResponse;
};

The upload fails with a 500 error:

Server encountered an internal error. Please try again after some time

const error = new RestError(

RestError: Server encountered an internal error. Please try again after some time.
RequestId:e78657e8-6ad9-452e-a7b3-6361a6c4972d
Time:2022-11-02T13:06:02.2715094Z
    at handleErrorResponse (/usr/src/app/node_modules/@azure/core-http/src/policies/deserializationPolicy.ts:274:17)
    at /usr/src/app/node_modules/@azure/core-http/src/policies/deserializationPolicy.ts:179:47
    at runMicrotasks (<anonymous>)
    at processTicksAndRejections (node:internal/process/task_queues:96:5)
    at async StorageRetryPolicy.attemptSendRequest (/usr/src/app/node_modules/@azure/storage-blob/src/policies/StorageRetryPolicy.ts:169:18)
    at async StorageClientContext.sendOperationRequest (/usr/src/app/node_modules/@azure/core-http/src/serviceClient.ts:521:23)
    at async BlockBlobClient.upload (/usr/src/app/node_modules/@azure/storage-blob/src/Clients.ts:3824:14) {
  code: 'InternalError',
  statusCode: 500,

Any help would be highly appreciated.

I tried uploading in chunks, but the issue still occurs.
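Since the server's message explicitly says "Please try again", an application-level retry with backoff around each upload can also help. The SDK already retries internally (StorageRetryPolicy appears in the stack trace), but wrapping the call is a cheap extra guard. A sketch, where withRetry and its defaults are illustrative:

```javascript
// Sketch: retry a flaky async operation with exponential backoff.
// `fn` is any function returning a promise, e.g. () => uploadToBlob(file).
async function withRetry(fn, maxTries = 3, baseDelayMs = 100) {
  let lastError;
  for (let attempt = 0; attempt < maxTries; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      // Exponential backoff: baseDelayMs, 2x, 4x, ...
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
    }
  }
  throw lastError;
}
```

Used together with batching, each upload becomes withRetry(() => uploadToBlob(file)), so transient 500s are absorbed instead of failing the whole run.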

  • As an alternative to using the container client, you can use AzCopy.

  • AzCopy is a command-line tool provided by Azure for data transfer. You can download it here. It comes as a zip file; extract it to a folder and, if you are using Windows, add that folder to your PATH environment variable.

  • Then you can run the following command

azcopy login

This will prompt you to log in. Make sure your account has the Storage Blob Data Contributor role.

  • Then you can run the following command (add --recursive when copying a folder):
azcopy copy "<path to your folder or file>" "<url of your container>" --recursive

Refer to the MS docs on AzCopy.
