Azure Blob Trigger Function for Data Lake Gen 2

What am I trying to do? I have a Data Lake container with an HDFS-style namespace, e.g. "container/year/month/day/bunch of files". Files are uploaded on a daily basis, and the folder structure is dynamic, based on the current date. I need my Azure Function to trigger when files are uploaded into the day directory; those files are then processed and the data is dumped into a SQL Server database (C# code). The only problem I have is triggering my function over a dynamic directory. Please help me or suggest how to approach this.

Thanks a million.

You don't need to use a dynamic folder name. Actually, the path of the blob trigger has to be given at compile time: you either give it as a constant, or you resolve it from an app setting (environment variable).

So, there are two ways:

1. The first way is simple. Just do it like this:

using System;
using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Microsoft.Extensions.Logging;

namespace FunctionApp23
{
    public static class Function1
    {
        [FunctionName("Function1")]
        public static void Run([BlobTrigger("yourcontainername/{year}/{month}/{day}/{filename}", Connection = "str")]Stream myBlob, string filename, ILogger log)
        {
            log.LogInformation($"C# Blob trigger function Processed blob\n Name:{filename} \n Size: {myBlob.Length} Bytes");
        }
    }
}
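
As a side note, the container segment of the path does not have to be a literal: binding attribute properties can read from an app setting by wrapping the setting name in percent signs. A small variant of the function above, assuming an app setting named ContainerPath that holds the container name (the setting name is my assumption, not part of the original answer):

using System.IO;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

namespace FunctionApp23
{
    public static class Function1FromSetting
    {
        // Same trigger as above, but the container name is resolved from an app
        // setting ("ContainerPath" is an assumed name) via the %...% syntax,
        // while year/month/day/filename remain binding expressions.
        [FunctionName("Function1FromSetting")]
        public static void Run(
            [BlobTrigger("%ContainerPath%/{year}/{month}/{day}/{filename}", Connection = "str")] Stream myBlob,
            string filename,
            ILogger log)
        {
            log.LogInformation($"Processed blob {filename}, {myBlob.Length} bytes");
        }
    }
}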

2. The second way: deploy a timer trigger alongside your blob trigger, and put the code that updates the environment variable in it. (This timer trigger fires once a day.)

I don't recommend this method. Although it can achieve the "dynamic" path, I don't think your use case needs it. If you really need it, I will update the code; but in theory the first method is sufficient.
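
For illustration only, here is a minimal sketch of a related approach: instead of rewriting an environment variable, a daily timer trigger enumerates the current date's folder directly with the blob SDK (requires the Azure.Storage.Blobs package). The container name, the "str" connection setting, the CRON schedule, and the "yyyy/MM/dd" folder format are all assumptions:

using System;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

namespace FunctionApp23
{
    public static class DailyFolderFunction
    {
        // Runs once a day (00:05 UTC here) and lists every blob under the current
        // date's folder. Assumes the folder layout is "yyyy/MM/dd" and reuses the
        // "str" connection string setting from the blob trigger above.
        [FunctionName("DailyFolderFunction")]
        public static async Task Run([TimerTrigger("0 5 0 * * *")] TimerInfo timer, ILogger log)
        {
            string prefix = DateTime.UtcNow.ToString("yyyy/MM/dd");
            var container = new BlobContainerClient(
                Environment.GetEnvironmentVariable("str"),
                "yourcontainername");

            await foreach (var blob in container.GetBlobsAsync(prefix: prefix))
            {
                log.LogInformation($"Found blob for today: {blob.Name}");
                // download/process the blob and write the data to SQL Server here
            }
        }
    }
}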
