I would like to copy a blob from one Data Lake to another. Here is my simple code for this job:
DataLakeFileSystemClient sourceDataLakeFileSystemClient = StorageAccountDataLakeHelper.GetDataLakeFileSystemClient(SourceContainer, SOURCE_DATALAKE_NAME, SOURCE_DATA_LAKE_ACCESS_KEY);
DataLakeFileSystemClient targetDataLakeFileSystemClient = StorageAccountDataLakeHelper.GetDataLakeFileSystemClient(TargetContainer, TARGET_DATALAKE_NAME, TARGET_DATA_LAKE_ACCESS_KEY);
DataLakeDirectoryClient sourceDirectoryClient = sourceDataLakeFileSystemClient.GetDirectoryClient("folder1/folder2/");
DataLakeFileClient sourceFileClient = sourceDirectoryClient.GetFileClient("myfile.csv.csv");
Stream reader = await sourceFileClient.OpenReadAsync();
DataLakeDirectoryClient targetDirectoryClient = targetDataLakeFileSystemClient.GetDirectoryClient("folder1/folder2/");
DataLakeFileClient targetFileClient = await targetDirectoryClient.CreateFileAsync("myfile.csv.csv");
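For context, `StorageAccountDataLakeHelper` is not shown in the question. Assuming shared-key authentication with the Azure.Storage.Files.DataLake v12 SDK, a minimal hypothetical sketch of such a helper might look like this (the class and method names simply mirror the calls above):

```csharp
using System;
using Azure.Storage;
using Azure.Storage.Files.DataLake;

public static class StorageAccountDataLakeHelper
{
    // Hypothetical helper: builds a file-system client from an account
    // name and access key using shared-key authentication.
    public static DataLakeFileSystemClient GetDataLakeFileSystemClient(
        string fileSystemName, string accountName, string accountKey)
    {
        var credential = new StorageSharedKeyCredential(accountName, accountKey);
        var serviceUri = new Uri($"https://{accountName}.dfs.core.windows.net");
        var serviceClient = new DataLakeServiceClient(serviceUri, credential);
        return serviceClient.GetFileSystemClient(fileSystemName);
    }
}
```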
The authentication is based on an access key, as you can see. I can run this code locally without any problem, but if I publish it to a Function App in Azure, I get this exception:
An unhandled exception of type 'Azure.RequestFailedException' occurred in System.Private.CoreLib.dll
Service request failed.
Status: 403 (This request is not authorized to perform this operation.)
ErrorCode: AuthorizationFailure
Headers:
Transfer-Encoding: chunked
Server: Microsoft-HTTPAPI/2.0
x-ms-request-id: 0d261530-201e-009b-5a39-3c8d7e000000
x-ms-client-request-id: 13ec55f9-7a99-4959-bb1c-024e5848a414
x-ms-error-code: AuthorizationFailure
Date: Wed, 28 Apr 2021 14:20:59 GMT
The exception occurs on this line:
Stream reader= await sourcefileClient.OpenReadAsync();
What should I do to solve this problem?
I used the code below and deployed it to Azure; it seems to work fine:
using System;
using System.IO;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Extensions.Http;
using Microsoft.AspNetCore.Http;
using Microsoft.Extensions.Logging;
using Azure.Storage.Files.DataLake;

namespace FunctionApp110
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = null)] HttpRequest req,
            ILogger log)
        {
            string str = "DefaultEndpointsProtocol=https;AccountName=0730bowmanwindow;AccountKey=xxxxxx;EndpointSuffix=core.windows.net";
            string sourcefs = "source";
            string sinkfs = "sink";
            string sourcedir = "test";
            string sinkdir = "test";
            string sourcefile = "source.txt";
            string sinkfile = "sink.txt";

            // Build file-system clients for the source and sink containers
            // from the storage account connection string.
            DataLakeFileSystemClient sourcefilesystemclient = new DataLakeFileSystemClient(str, sourcefs);
            DataLakeFileSystemClient sinkfilesystemclient = new DataLakeFileSystemClient(str, sinkfs);
            var sourcedirectoryclient = sourcefilesystemclient.GetDirectoryClient(sourcedir);
            var sinkdirectoryclient = sinkfilesystemclient.GetDirectoryClient(sinkdir);
            var sourcefileclient = sourcedirectoryclient.GetFileClient(sourcefile);

            // Read the source file into memory, then rewind the buffer
            // so it can be appended from the beginning.
            Stream reader = sourcefileclient.OpenRead();
            MemoryStream msreader = new MemoryStream();
            reader.CopyTo(msreader);
            msreader.Position = 0;

            // Create the sink file, append the buffered content at offset 0,
            // and flush to commit the appended data.
            DataLakeFileClient sinkfileclient = sinkdirectoryclient.CreateFile(sinkfile);
            sinkfileclient.Append(msreader, 0);
            sinkfileclient.Flush(sourcefileclient.GetProperties().Value.ContentLength);

            return new OkObjectResult("This is a test.");
        }
    }
}
You can try it. :)
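As a variation on the copy step above, the v12 SDK's `DataLakeFileClient.UploadAsync` performs the create/append/flush sequence in one call. A minimal async sketch (assuming two `DataLakeFileClient` instances obtained as in the code above):

```csharp
using System.IO;
using System.Threading.Tasks;
using Azure.Storage.Files.DataLake;

public static class DataLakeCopy
{
    // Copies one Data Lake file to another by streaming the source
    // into UploadAsync, which handles append and flush internally.
    public static async Task CopyFileAsync(
        DataLakeFileClient sourceFile, DataLakeFileClient sinkFile)
    {
        using Stream reader = await sourceFile.OpenReadAsync();
        // overwrite: true replaces any existing file at the sink path.
        await sinkFile.UploadAsync(reader, overwrite: true);
    }
}
```

This avoids buffering the whole file in a `MemoryStream`, which matters for large blobs.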