
Azure Data Factory Copy Activity

I have been working on this for a couple of days and cannot get past this error. I have 2 activities in this pipeline. The first activity copies data from an ODBC connection to an Azure database, which is successful. The 2nd activity transfers the data from one Azure table to another Azure table and keeps failing.

The error message is: Copy activity met invalid parameters: 'UnknownParameterName', Detailed message: An item with the same key has already been added.

I do not see any invalid parameters or unknown parameter names. I have rewritten this multiple times, both from their add-activity code template and by hand, but do not receive any errors when deploying or when it is running. Below is the JSON pipeline code.

Only the 2nd activity is receiving an error.

Thanks.

Source Dataset

{
"name": "AnalyticsDB-SHIPUPS_06shp-01src_AZ-915PM",
"properties": {
    "structure": [
        {
            "name": "UPSD_BOL",
            "type": "String"
        },
        {
            "name": "UPSD_ORDN",
            "type": "String"
        }
    ],
    "published": false,
    "type": "AzureSqlTable",
    "linkedServiceName": "Source-SQLAzure",
    "typeProperties": {},
    "availability": {
        "frequency": "Day",
        "interval": 1,
        "offset": "04:15:00"
    },
    "external": true,
    "policy": {}
}
}

Destination Dataset

{
"name": "AnalyticsDB-SHIPUPS_06shp-02dst_AZ-915PM",
"properties": {
    "structure": [
        {
            "name": "SHIP_SYS_TRACK_NUM",
            "type": "String"
        },
        {
            "name": "SHIP_TRACK_NUM",
            "type": "String"
        }
    ],
    "published": false,
    "type": "AzureSqlTable",
    "linkedServiceName": "Destination-Azure-AnalyticsDB",
    "typeProperties": {
        "tableName": "[olcm].[SHIP_Tracking]"
    },
    "availability": {
        "frequency": "Day",
        "interval": 1,
        "offset": "04:15:00"
    },
    "external": false,
    "policy": {}
}
}

Pipeline

{
"name": "SHIPUPS_FC_COPY-915PM",
"properties": {
    "description": "copy shipments ",
    "activities": [
        {
            "type": "Copy",
            "typeProperties": {
                "source": {
                    "type": "RelationalSource",
                    "query": "$$Text.Format('SELECT COMPANY, UPSD_ORDN, UPSD_BOL FROM \"orupsd - UPS interface Dtl\" WHERE COMPANY = \\'01\\'', WindowStart, WindowEnd)"
                },
                "sink": {
                    "type": "SqlSink",
                    "sqlWriterCleanupScript": "$$Text.Format('delete imp_fc.SHIP_UPS_IntDtl_Tracking', WindowStart, WindowEnd)",
                    "writeBatchSize": 0,
                    "writeBatchTimeout": "00:00:00"
                },
                "translator": {
                    "type": "TabularTranslator",
                    "columnMappings": "COMPANY:COMPANY, UPSD_ORDN:UPSD_ORDN, UPSD_BOL:UPSD_BOL"
                }
            },
            "inputs": [
                {
                    "name": "AnalyticsDB-SHIPUPS_03shp-01src_FC-915PM"
                }
            ],
            "outputs": [
                {
                    "name": "AnalyticsDB-SHIPUPS_03shp-02dst_AZ-915PM"
                }
            ],
            "policy": {
                "timeout": "1.00:00:00",
                "concurrency": 1,
                "executionPriorityOrder": "NewestFirst",
                "style": "StartOfInterval",
                "retry": 3,
                "longRetry": 0,
                "longRetryInterval": "00:00:00"
            },
            "scheduler": {
                "frequency": "Day",
                "interval": 1,
                "offset": "04:15:00"
            },
            "name": "915PM-SHIPUPS-fc-copy->[imp_fc]_[SHIP_UPS_IntDtl_Tracking]"
        },
        {
            "type": "Copy",
            "typeProperties": {
                "source": {
                    "type": "SqlSource",
                    "sqlReaderQuery": "$$Text.Format('select distinct ups.UPSD_BOL, ups.UPSD_BOL from imp_fc.SHIP_UPS_IntDtl_Tracking ups LEFT JOIN olcm.SHIP_Tracking st ON ups.UPSD_BOL = st.SHIP_SYS_TRACK_NUM WHERE st.SHIP_SYS_TRACK_NUM IS NULL', WindowStart, WindowEnd)"
                },
                "sink": {
                    "type": "SqlSink",
                    "writeBatchSize": 0,
                    "writeBatchTimeout": "00:00:00"
                },
                "translator": {
                    "type": "TabularTranslator",
                    "columnMappings": "UPSD_BOL:SHIP_SYS_TRACK_NUM, UPSD_BOL:SHIP_TRACK_NUM"
                }
            },
            "inputs": [
                {
                    "name": "AnalyticsDB-SHIPUPS_06shp-01src_AZ-915PM"
                }
            ],
            "outputs": [
                {
                    "name": "AnalyticsDB-SHIPUPS_06shp-02dst_AZ-915PM"
                }
            ],
            "policy": {
                "timeout": "1.00:00:00",
                "concurrency": 1,
                "executionPriorityOrder": "NewestFirst",
                "style": "StartOfInterval",
                "retry": 3,
                "longRetryInterval": "00:00:00"
            },
            "scheduler": {
                "frequency": "Day",
                "interval": 1,
                "offset": "04:15:00"
            },
            "name": "915PM-SHIPUPS-AZ-update->[olcm]_[SHIP_Tracking]"
        }
    ],
    "start": "2017-08-22T03:00:00Z",
    "end": "2099-12-31T08:00:00Z",
    "isPaused": false,
    "hubName": "adf-tm-prod-01_hub",
    "pipelineMode": "Scheduled"
}
}

Have you seen this link?

They get the same error message and suggest using AzureTableSink instead of SqlSink:

"sink": {
    "type": "AzureTableSink",
    "writeBatchSize": 0,
    "writeBatchTimeout": "00:00:00"
}

It would make sense for you too, since your 2nd copy activity is Azure to Azure.

It could be a red herring, but I'm pretty sure "tableName" is a required entry in the typeProperties for a SqlSource, and yours is missing it for the input dataset. I appreciate you have a join in the sqlReaderQuery, so it's probably best to put a dummy (but real) table name in there. Btw, it's not clear why you are using $$Text.Format and WindowStart/WindowEnd on your queries if you're not transposing those values into the query; you could just put the query between double quotes.
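Putting both suggestions together, the input dataset's typeProperties could declare a table name and the query could drop the $$Text.Format wrapper. A sketch (the tableName value below is illustrative; since the rows actually come from the sqlReaderQuery, any real table reachable through the linked service should do):

```json
{
    "name": "AnalyticsDB-SHIPUPS_06shp-01src_AZ-915PM",
    "properties": {
        "type": "AzureSqlTable",
        "linkedServiceName": "Source-SQLAzure",
        "typeProperties": {
            "tableName": "[imp_fc].[SHIP_UPS_IntDtl_Tracking]"
        }
    }
}
```

And in the activity, a plain string query, since WindowStart/WindowEnd are never substituted in:

```json
"source": {
    "type": "SqlSource",
    "sqlReaderQuery": "select distinct ups.UPSD_BOL, ups.UPSD_BOL from imp_fc.SHIP_UPS_IntDtl_Tracking ups LEFT JOIN olcm.SHIP_Tracking st ON ups.UPSD_BOL = st.SHIP_SYS_TRACK_NUM WHERE st.SHIP_SYS_TRACK_NUM IS NULL"
}
```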

