
Azure Data Factory. Unable to load files via HTTP Connector

To load files from a remote server to Azure Blob Storage, I am working through this tutorial from the Azure team: Copy data from an HTTP endpoint by using Azure Data Factory or Azure Synapse Analytics.

I need to load publicly available files from this site through this API: https://api.usaspending.gov/api/v2/download/disaster/

The tutorial for using this API (and the other APIs on the linked site) is provided here.

But when I run the pipeline, I get the following error:

ErrorCode=HttpFileFailedToRead,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Failed to read data from http server. Check the error from http server:The remote server returned an error: (405) Method Not Allowed.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.WebException,Message=The remote server returned an error: (405) Method Not Allowed.,Source=System,'

Question: How can we use the above API endpoint in Azure Data Factory to load publicly available files from their remote server?

Remarks: According to their site: /api/v2/download/disaster/ (POST): Returns a zipped file containing Account and Award data for the Disaster Funding.
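For reference, this is roughly the raw request the ADF HTTP source has to reproduce. It is a minimal sketch using Python's requests library, assuming the endpoint accepts a JSON body of filters; the payload shape and the def_codes values shown here are illustrative assumptions, not verified parameters, so check them against the API tutorial.

```python
import requests

# Minimal sketch of the raw call the ADF HTTP source must emulate.
# NOTE: the payload shape and "def_codes" values below are assumptions;
# check the usaspending.gov API tutorial for the exact filter format.
BASE_URL = "https://api.usaspending.gov"
ENDPOINT = "/api/v2/download/disaster/"

payload = {
    "filters": {
        "def_codes": ["L", "M", "N", "O", "P"]  # hypothetical disaster funding codes
    }
}

# The endpoint is documented as POST; issuing a GET is what produces
# the "(405) Method Not Allowed" seen in the ADF error above.
response = requests.post(BASE_URL + ENDPOINT, json=payload, timeout=60)
print(response.status_code)
print(response.text)
```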

Our configuration of the Source of the ADF pipeline:

[screenshot]

The above source tests successfully, as shown below:

[screenshot]

The destination (an Azure Data Lake Storage) also tests successfully, as shown below:

[screenshot]

Check the METHOD which you are using in your copy activity. It needs to be POST.

[screenshot]

Also, the response is not a file but provides a link to where the file can be downloaded. See below for the sample response. This means you need another activity to actually fetch the file.

[screenshot]
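To make that two-step flow concrete, here is a minimal sketch outside of ADF, assuming the POST response contains a field such as file_url pointing at the generated zip; the exact field names, and whether the file is available immediately, are assumptions to verify against the actual response above.

```python
import requests

BASE_URL = "https://api.usaspending.gov"
ENDPOINT = "/api/v2/download/disaster/"

# Step 1: POST the download request (the HTTP source must use POST).
# The filter payload below is a hypothetical example.
resp = requests.post(
    BASE_URL + ENDPOINT,
    json={"filters": {"def_codes": ["L", "M", "N", "O", "P"]}},
    timeout=60,
)
resp.raise_for_status()
body = resp.json()

# Step 2: the response is not the file itself but metadata containing a download link.
# "file_url" is an assumed field name; confirm it against the sample response.
# If the zip is generated asynchronously, you may need to wait or retry before this GET succeeds.
file_url = body["file_url"]
zip_resp = requests.get(file_url, timeout=300)
zip_resp.raise_for_status()

with open("disaster_funding.zip", "wb") as f:
    f.write(zip_resp.content)
```

In ADF terms, the first step could be handled by a Web activity (or a Copy activity configured for POST) and the second by a follow-up Copy activity whose source URL is the link returned by the first call; this mapping is one possible arrangement, not the only one.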

