Azure Data Factory throws 'Length Required' error on copy from SQL to ADLS

I am trying to copy data from an on-prem SQL Server to Azure Data Lake Storage (ADLS) via Azure Data Factory (ADF). Everything seems to work, except that when I run (debug or trigger) the pipeline, I get the error:

{
    "errorCode": "2200",
    "message": "Failure happened on 'Sink' side. ErrorCode=UserErrorAdlsFileWriteFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Writing to 'AzureDataLakeStore' failed. Message: The remote server returned an error: (411) Length Required.. Response details: \r\nLength Required\r\n\r\nHTTP Error 411. The request must be chunked or have a content length.\r\n\r\n,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.WebException,Message=The remote server returned an error: (411) Length Required.,Source=System,'",
    "failureType": "UserError",
    "target": "CopyData1"
}
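For context on what the server is complaining about: HTTP 411 means the server received a request body but could find neither a `Content-Length` header nor `Transfer-Encoding: chunked`, so it cannot tell where the body ends. The sketch below illustrates the rule with a local stand-in server; the endpoint and paths are illustrative, not the real ADLS API.

```python
# Minimal sketch of the HTTP 411 rule: a server that cannot determine the
# request body's length (no Content-Length header and no chunked
# Transfer-Encoding) rejects the request with 411 Length Required.
# Local stand-in server for illustration only -- not the real ADLS endpoint.
import http.client
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class LengthRequiredHandler(BaseHTTPRequestHandler):
    def do_PUT(self):
        # Without either header, the end of the body is undeterminable.
        if ("Content-Length" not in self.headers
                and self.headers.get("Transfer-Encoding") != "chunked"):
            self.send_response(411)  # Length Required
            self.end_headers()
            return
        length = int(self.headers.get("Content-Length", 0))
        self.rfile.read(length)
        self.send_response(200)
        self.end_headers()

    def log_message(self, *args):  # silence request logging
        pass

server = HTTPServer(("127.0.0.1", 0), LengthRequiredHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

# Request 1: craft a PUT by hand and deliberately omit Content-Length.
conn = http.client.HTTPConnection("127.0.0.1", port)
conn.putrequest("PUT", "/myDir/sink1.txt")
conn.endheaders()
status_without_length = conn.getresponse().status  # 411

# Request 2: http.client adds Content-Length automatically for a bytes body.
conn2 = http.client.HTTPConnection("127.0.0.1", port)
conn2.request("PUT", "/myDir/sink1.txt", body=b"x")
status_with_length = conn2.getresponse().status  # 200

print(status_without_length, status_with_length)
server.shutdown()
```

In other words, whichever component sends the write request to ADLS (or whatever sits between it and ADLS) is stripping or failing to supply one of those two headers.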

What is really odd is that the following pipelines DO work:

  • SQL tbl1 -> SQL tbl2
  • ADLS source.txt -> ADLS sink.txt

I.e., read/write access works as expected. The latter pipeline is also able to create/overwrite the sink.txt file.

But when I run the pipeline

  • SQL tbl1 -> sink.txt

I get the Length Required error. And if sink.txt exists, the pipeline even deletes it!

I'm using ADFv2 and ADLS Gen1. ADF and ADLS reside in the same subscription/resource group, using a self-hosted Integration Runtime for SQL and the Azure Integration Runtime for ADLS. I have tested with a source statement as simple as "SELECT 1 Col". I have also tested without a dataset schema, and with schemas + mappings.

Is this a bug, or am I missing something? Which "Length" is required?


EDIT 1: Minimal JSON scripts

pipeline1.json

{
    "name": "pipeline1",
    "properties": {
        "activities": [
            {
                "name": "CopyData1",
                "type": "Copy",
                "policy": {
                    "timeout": "7.00:00:00",
                    "retry": 0,
                    "retryIntervalInSeconds": 30,
                    "secureOutput": false,
                    "secureInput": false
                },
                "typeProperties": {
                    "source": {
                        "type": "SqlSource",
                        "sqlReaderQuery": "SELECT TOP 1 'x' AS col1 FROM sys.tables"
                    },
                    "sink": {
                        "type": "AzureDataLakeStoreSink"
                    },
                    "enableStaging": false,
                    "dataIntegrationUnits": 0
                },
                "inputs": [
                    {
                        "referenceName": "table1",
                        "type": "DatasetReference"
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "sink1",
                        "type": "DatasetReference"
                    }
                ]
            }
        ]
    }
}

table1.json

{
    "name": "table1",
    "properties": {
        "linkedServiceName": {
            "referenceName": "SqlServer1",
            "type": "LinkedServiceReference"
        },
        "type": "SqlServerTable",
        "typeProperties": {
            "tableName": "sys.tables"
        }
    }
}

sink1.json

{
    "name": "sink1",
    "properties": {
        "linkedServiceName": {
            "referenceName": "AzureDataLakeStore1",
            "type": "LinkedServiceReference"
        },
        "type": "AzureDataLakeStoreFile",
        "structure": [
            {
                "name": "col1",
                "type": "String"
            }
        ],
        "typeProperties": {
            "format": {
                "type": "TextFormat",
                "columnDelimiter": ",",
                "rowDelimiter": "",
                "nullValue": "\\N",
                "treatEmptyAsNull": true,
                "skipLineCount": 0,
                "firstRowAsHeader": true
            },
            "fileName": "sink1.txt",
            "folderPath": "myDir"
        }
    }
}
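For reference, given the TextFormat settings above (comma delimiter, first row as header, default CRLF row delimiter since `rowDelimiter` is empty), a successful run of the `SELECT TOP 1 'x' AS col1` query should produce a two-line sink1.txt. A minimal sketch of that serialization (assumed output shape, not ADF's actual writer):

```python
import csv
import io

# Assumed result of the SqlSource query: SELECT TOP 1 'x' AS col1 FROM sys.tables
columns = ["col1"]
rows = [["x"]]

buf = io.StringIO()
# columnDelimiter "," and the default CRLF row delimiter from the dataset.
writer = csv.writer(buf, delimiter=",", lineterminator="\r\n")
writer.writerow(columns)  # firstRowAsHeader: true
writer.writerows(rows)
content = buf.getvalue()
print(repr(content))  # 'col1\r\nx\r\n'
```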

EDIT 2: Summary of conducted tests

  • SQL -> ADLS: Error
  • Oracle -> ADLS: Error
  • SQL -> Blob: OK
  • Oracle -> Blob: OK
  • SQL -> SQL: OK
  • ADLS -> ADLS: OK
  • AzureSQLDB -> ADLS: OK

Does your self-hosted IR have a proxy setting, or does its traffic go through some special network configuration? Such an error is typically caused by an intermediate proxy service when ADF's ADLS connector tries to talk to the ADLS service.
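One way to check part of that from the self-hosted IR machine is to inspect the OS-level proxy configuration. This is a hedged sketch: the IR also has its own proxy setting (configured through its Integration Runtime Configuration Manager), which this check does not see, and the ADLS hostname below is illustrative.

```python
# Quick check on the self-hosted IR host: would OS-level proxy settings
# route traffic to the ADLS endpoint through an intermediate proxy?
import urllib.request

# Reads system/registry/environment proxy settings into a dict
# like {"http": "...", "https": "..."} (empty if no proxy is configured).
proxies = urllib.request.getproxies()
print("System proxies:", proxies)

# Would this host skip the proxy (e.g. via a bypass / no_proxy list)?
# Hostname is an illustrative placeholder, not a real account.
bypass = urllib.request.proxy_bypass("myadls.azuredatalakestore.net")
print("ADLS host bypasses proxy:", bool(bypass))
```

If a proxy shows up here (or in the IR's own configuration), that proxy is the likely place where the request loses its `Content-Length`/chunked framing before reaching ADLS.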
