
Copy Data from Azure Data Lake to Snowflake without stage using Azure Data Factory

All the Azure Data Factory examples of copying data from Azure Data Lake Gen2 to Snowflake use a storage account as a stage. If the stage is not configured (as shown in the picture), I get this error in Data Factory even when my source is a CSV file in Azure Data Lake: "Direct copying data to Snowflake is only supported when source dataset is DelimitedText, Parquet, JSON with Azure Blob Storage or Amazon S3 linked service, for other dataset or linked service, please enable staging".

At the same time, the Snowflake documentation says that the external stage is optional. How can I copy data from Azure Data Lake to Snowflake using Data Factory's Copy Data activity without an external storage account as a stage? If staging storage is required to make it work, we shouldn't say that copying data from Data Lake to Snowflake is supported. It works only when the Data Lake data is first copied to a storage blob and then to Snowflake.

You'll have to configure blob storage and use it as staging. As an alternative, you can use an external stage: create a FILE FORMAT and a STORAGE INTEGRATION, grant Snowflake access to the ADLS account, and load the data into Snowflake using the COPY command. Let me know if you need more help on this.
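The external-stage route described above can be sketched in Snowflake SQL roughly as follows. All object names, the tenant ID, and the container URL are hypothetical placeholders, not values from the question:

```sql
-- Hypothetical names and URLs throughout; substitute your own.
-- 1. Integration that lets Snowflake read the ADLS Gen2 container.
CREATE STORAGE INTEGRATION adls_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'AZURE'
  ENABLED = TRUE
  AZURE_TENANT_ID = '<tenant-id>'
  STORAGE_ALLOWED_LOCATIONS = ('azure://myaccount.blob.core.windows.net/mycontainer/');

-- 2. File format matching the CSV source files.
CREATE FILE FORMAT csv_fmt TYPE = CSV SKIP_HEADER = 1;

-- 3. External stage pointing at the container.
CREATE STAGE adls_stage
  URL = 'azure://myaccount.blob.core.windows.net/mycontainer/'
  STORAGE_INTEGRATION = adls_int
  FILE_FORMAT = csv_fmt;

-- 4. Load the staged files into a target table.
COPY INTO my_table FROM @adls_stage;
```

Note that with this approach the load runs inside Snowflake itself rather than through the Data Factory copy activity.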

Though Snowflake supports Blob storage, Data Lake Storage Gen2, and general-purpose v1 & v2 storage accounts, loading data into Snowflake is supported through Blob storage only.

The source linked service must be Azure Blob storage with shared access signature (SAS) authentication. If you want to copy data directly from Azure Data Lake Storage Gen2 in one of the supported formats, you can create an Azure Blob linked service with SAS authentication against your ADLS Gen2 account, to avoid using staged copy to Snowflake.
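As a sketch, such a linked service definition might look like the following. The service name is hypothetical, and the account, container, and SAS token in the URI are placeholders:

```json
{
  "name": "AzureBlobOverAdlsGen2",
  "properties": {
    "type": "AzureBlobStorage",
    "typeProperties": {
      "sasUri": "https://<adls-gen2-account>.blob.core.windows.net/<container>?<sas-token>"
    }
  }
}
```

The point of this workaround is that the ADLS Gen2 account is addressed through its Blob endpoint, so Data Factory treats it as an Azure Blob Storage source, which the Snowflake connector accepts for direct copy.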

In the linked service, select Azure Blob Storage and provide the SAS URI details of the Azure Data Lake Gen2 source file.

Blob storage linked service with a Data Lake Gen2 file:
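For completeness, a copy activity using this source directly (no staging) would pair a delimited-text source with a Snowflake sink. The activity and dataset names below are hypothetical:

```json
{
  "name": "CopyAdlsToSnowflake",
  "type": "Copy",
  "inputs": [ { "referenceName": "DelimitedTextSource", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "SnowflakeTable", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "DelimitedTextSource" },
    "sink": { "type": "SnowflakeSink" }
  }
}
```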
