
Azure Data Factory Pipeline Copy Data Error. Importing CSV to SQL Table. An item with the same key has already been added

I've been working on this all day. I can get it to import a single column from the CSV, but the minute there are multiple columns I get this error every time. I don't know if it's important, but the delimiter is a pipe (|). In total there are 260 columns and 8.8 million rows I'm trying to import; all of the fields are nvarchar(255).
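With a file this wide, it can help to sanity-check how the header row splits on the pipe delimiter before loading. A minimal Python sketch (the inline sample string is a stand-in for the real file):

```python
import csv
import io

# Inline sample standing in for the real pipe-delimited file
# (the actual file has 260 columns and 8.8 million rows).
sample = "col1|col2|col3\n1|2|3\n"

reader = csv.reader(io.StringIO(sample), delimiter="|")
header = next(reader)
print(len(header))  # -> 3 for this sample
```

If the column count printed here doesn't match what the sink table expects, the delimiter or quoting settings are the first thing to revisit.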

{
    "errorCode": "2200",
    "message": "ErrorCode=InvalidParameter,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The value of the property '' is invalid: 'An item with the same key has already been added.'.,Source=,''Type=System.ArgumentException,Message=An item with the same key has already been added.,Source=mscorlib,'",
    "failureType": "UserError",
    "target": "Copy data1",
    "details": []
}

Much thanks to anyone who can shed some light on this issue.

I also found this under the output:

{
    "dataRead": 4194304,
    "dataWritten": 0,
    "filesRead": 1,
    "sourcePeakConnections": 10,
    "sinkPeakConnections": 2,
    "rowsRead": 0,
    "rowsCopied": 0,
    "copyDuration": 4,
    "throughput": 1024,
    "errors": [
        {
            "Code": 11402,
            "Message": "ErrorCode=InvalidParameter,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The value of the property '' is invalid: 'An item with the same key has already been added.'.,Source=,''Type=System.ArgumentException,Message=An item with the same key has already been added.,Source=mscorlib,'",
            "EventType": 0,
            "Category": 5,
            "Data": {
                "PropertyName": "UnknownParameterName",
                "InvalidParameter": "An item with the same key has already been added."
            },
            "MsgId": null,
            "ExceptionType": null,
            "Source": null,
            "StackTrace": null,
            "InnerEventInfos": []
        }
    ],
    "effectiveIntegrationRuntime": "DefaultIntegrationRuntime (West Europe)",
    "usedDataIntegrationUnits": 4,
    "billingReference": {
        "activityType": "DataMovement",
        "billableDuration": [
            {
                "meterType": "AzureIR",
                "duration": 0.06666666666666667,
                "unit": "DIUHours"
            }
        ]
    },
    "usedParallelCopies": 1,
    "executionDetails": [
        {
            "source": {
                "type": "AzureBlobStorage",
                "region": "West Europe"
            },
            "sink": {
                "type": "AzureSqlDatabase",
                "region": "West Europe"
            },
            "status": "Failed",
            "start": "2020-10-08T22:23:53.9269314Z",
            "duration": 4,
            "usedDataIntegrationUnits": 4,
            "usedParallelCopies": 1,
            "profile": {
                "queue": {
                    "status": "Completed",
                    "duration": 1
                },
                "transfer": {
                    "status": "Completed",
                    "duration": 2,
                    "details": {
                        "listingSource": {
                            "type": "AzureBlobStorage",
                            "workingDuration": 0
                        },
                        "readingFromSource": {
                            "type": "AzureBlobStorage",
                            "workingDuration": 0
                        },
                        "writingToSink": {
                            "type": "AzureSqlDatabase",
                            "workingDuration": 0
                        }
                    }
                }
            },
            "detailedDurations": {
                "queuingDuration": 1,
                "transferDuration": 2
            }
        }
    ],
    "dataConsistencyVerification": {
        "VerificationResult": "NotVerified"
    },
    "durationInQueue": {
        "integrationRuntimeQueue": 0
    }
}

So the issue was metadata. Even though the Copy Data activity had given the headers unique names, in the CSV itself they weren't unique, because that's how we receive the data. I just made it skip a row and that fixed it. Thanks to anyone who was looking at and solving this.
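Duplicate header names are easy to miss in a 260-column file. A quick way to surface them before running the pipeline, as a minimal sketch (the sample string is hypothetical and deliberately repeats a column name):

```python
import csv
import io
from collections import Counter

# Hypothetical sample standing in for the real file; note the
# repeated "name" column, which is the kind of collision that
# produces "An item with the same key has already been added".
sample = "id|name|name|city\n1|Ann|A.|Oslo\n"

header = next(csv.reader(io.StringIO(sample), delimiter="|"))
duplicates = [col for col, n in Counter(header).items() if n > 1]
print(duplicates)  # -> ['name']
```

Any name listed in `duplicates` needs to be renamed (or the header row skipped and columns mapped by position) before the copy activity can build its column mapping.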

