
MySQL timeout with Azure Data Factory copy

I am using Azure Data Factory to copy data from a MySQL server as the source. The data is large. When I set up the pipeline and execute it, I get:

MySQL: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.

I think this can be solved with this answer. How can I add this configuration to my Data Factory pipeline using MySQL as the source?

Update: I am using a normal copy pipeline to move data from on-premises MySQL to SQL Data Warehouse. The MySQL query is a simple select: `select * from mytable;` Complete error:

Copy activity encountered a user error at Source side: GatewayNodeName=MYGATEWAY,ErrorCode=UserErrorFailedMashupOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message='Type=Microsoft.Data.Mashup.MashupValueException,Message=MySQL: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.,Source=Microsoft.MashupEngine,',Source=,'.

Well, if this issue is caused by the default timeout configuration, you can add the following to the activity in your pipeline definition to set the timeout to one hour:

"policy": {
       "concurrency": 1,
       "timeout": "01:00:00"
 }

----------Update----------

The whole JSON of the pipeline configuration looks like this:

{
     "name": "ADFTutorialPipelineOnPrem",
     "properties": {
     "description": "This pipeline has one Copy activity that copies data from an on-prem SQL to Azure blob",
     "activities": [
       {
         "name": "CopyFromSQLtoBlob",
         "description": "Copy data from on-prem SQL server to blob",
         "type": "Copy",
         "inputs": [
           {
             "name": "EmpOnPremSQLTable"
           }
         ],
         "outputs": [
           {
             "name": "OutputBlobTable"
           }
         ],
         "typeProperties": {
           "source": {
             "type": "SqlSource",
             "sqlReaderQuery": "select * from emp"
           },
           "sink": {
             "type": "BlobSink"
           }
         },
         "policy": {
           "concurrency": 1,
           "executionPriorityOrder": "NewestFirst",
           "style": "StartOfInterval",
           "retry": 0,
           "timeout": "01:00:00"
         }
       }
     ],
     "start": "2016-07-05T00:00:00Z",
     "end": "2016-07-06T00:00:00Z",
     "isPaused": false
   }
 }
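Note that the sample above copies from an on-premises SQL Server, so its source is a `SqlSource`. For a MySQL source, the copy activity's `typeProperties` would instead use a `RelationalSource` with a `query` property (and, for a SQL Data Warehouse destination, a `SqlDWSink`). A minimal sketch of just that section, assuming the rest of the pipeline stays the same:

```json
{
  "typeProperties": {
    "source": {
      "type": "RelationalSource",
      "query": "select * from mytable"
    },
    "sink": {
      "type": "SqlDWSink",
      "writeBatchTimeout": "00:30:00"
    }
  }
}
```

The `query` here mirrors the `select * from mytable;` from the question; the sink timeout value is illustrative.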

The following sample assumes you have created a table "MyTable" in MySQL that contains a column called "timestampcolumn" for time-series data. Setting "external": true informs the Data Factory service that the table is external to the data factory and is not produced by an activity in the data factory:

{
        "name": "MySqlDataSet",
        "properties": {
            "published": false,
            "type": "RelationalTable",
            "linkedServiceName": "OnPremMySqlLinkedService",
            "typeProperties": {},
            "availability": {
                "frequency": "Hour",
                "interval": 1
            },
            "external": true,
            "policy": {
                "externalData": {
                    "retryInterval": "00:01:00",
                    "retryTimeout": "01:00:00",
                    "maximumRetry": 3
                }
            }
        }
    }
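The dataset above references `OnPremMySqlLinkedService` by name. If you have not defined it yet, an on-premises MySQL linked service in (v1) Data Factory looks roughly like the following; the server, database, and credential values are placeholders, and the gateway name should match your Data Management Gateway (e.g. `MYGATEWAY` from the error message):

```json
{
    "name": "OnPremMySqlLinkedService",
    "properties": {
        "type": "OnPremisesMySql",
        "typeProperties": {
            "server": "<server name>",
            "database": "<database name>",
            "schema": "<schema name>",
            "authenticationType": "Basic",
            "username": "<user name>",
            "password": "<password>",
            "gatewayName": "MYGATEWAY"
        }
    }
}
```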

For more details about how to create a pipeline in Azure Data Factory, refer to this document.

For the whole tutorial on moving data from on-premises MySQL with Azure Data Factory, refer to this link.
