MySQL timeout with Azure Data Factory copy
I am using Azure Data Factory to copy data from a MySQL server as the source. The data is large in size. When I set up the pipeline and execute it, I get:
MySQL: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.
I think this can be solved with this answer. How can I add this configuration to my Data Factory pipeline with MySQL as the source?
Update: I am using a normal script to copy data from on-premises MySQL to SQL Data Warehouse. The MySQL query is a simple select:

select * from mytable;
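A common workaround for source-side timeouts on a large table, independent of any timeout setting, is to split the single full-table select into several bounded queries and copy one slice per run. As a minimal sketch (assuming the table has a timestamp column such as the "timestampcolumn" used in the dataset sample below), the slice queries could be generated like this:

```python
from datetime import datetime, timedelta

def sliced_queries(table, ts_column, start, end, hours_per_slice):
    """Yield one bounded SELECT per time window so each copy run
    reads a small slice instead of the whole table."""
    step = timedelta(hours=hours_per_slice)
    cur = start
    while cur < end:
        nxt = min(cur + step, end)
        yield (f"select * from {table} "
               f"where {ts_column} >= '{cur:%Y-%m-%d %H:%M:%S}' "
               f"and {ts_column} < '{nxt:%Y-%m-%d %H:%M:%S}';")
        cur = nxt

# One day split into four 6-hour slices
queries = list(sliced_queries("mytable", "timestampcolumn",
                              datetime(2016, 7, 5), datetime(2016, 7, 6), 6))
print(len(queries))
```

Each generated query can then be used as the reader query for one pipeline slice, so no single query runs long enough to hit the server timeout.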
Complete error:

Copy activity encountered a user error at Source side: GatewayNodeName=MYGATEWAY,ErrorCode=UserErrorFailedMashupOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message='Type=Microsoft.Data.Mashup.MashupValueException,Message=MySQL: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding.,Source=Microsoft.MashupEngine,',Source=,'.
Well, if this issue is about the default timeout configuration, you can add this to the "activities" section of your pipeline settings to set the timeout to 1 hour:
"Policy": {
"concurrency": 1,
"timeout": "01:00:00"
}
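If you generate pipeline JSON programmatically, a small helper keeps the timeout in ADF's hh:mm:ss timespan format. This is an illustrative Python sketch, not part of any ADF SDK:

```python
def activity_policy(timeout_hours=1, concurrency=1, retry=0):
    """Build the activity-level 'policy' object; the timeout value
    must use ADF's hh:mm:ss timespan format."""
    return {
        "concurrency": concurrency,
        "retry": retry,
        "timeout": f"{timeout_hours:02d}:00:00",
    }

print(activity_policy(timeout_hours=2))
```

The resulting dictionary can be merged into the copy activity definition before the pipeline JSON is deployed.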
----------Update----------

The whole pipeline JSON looks like this:
{
    "name": "ADFTutorialPipelineOnPrem",
    "properties": {
        "description": "This pipeline has one Copy activity that copies data from an on-prem SQL to Azure blob",
        "activities": [
            {
                "name": "CopyFromSQLtoBlob",
                "description": "Copy data from on-prem SQL server to blob",
                "type": "Copy",
                "inputs": [
                    { "name": "EmpOnPremSQLTable" }
                ],
                "outputs": [
                    { "name": "OutputBlobTable" }
                ],
                "typeProperties": {
                    "source": {
                        "type": "SqlSource",
                        "sqlReaderQuery": "select * from emp"
                    },
                    "sink": {
                        "type": "BlobSink"
                    }
                },
                "policy": {
                    "concurrency": 1,
                    "executionPriorityOrder": "NewestFirst",
                    "style": "StartOfInterval",
                    "retry": 0,
                    "timeout": "01:00:00"
                }
            }
        ],
        "start": "2016-07-05T00:00:00Z",
        "end": "2016-07-06T00:00:00Z",
        "isPaused": false
    }
}
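Since a wrong or missing timeout on any activity brings the error back, it can help to sanity-check the pipeline JSON before deploying it. A minimal sketch (plain Python, not an ADF API) that flags activities whose policy timeout differs from the expected value:

```python
def check_activity_timeouts(pipeline, required="01:00:00"):
    """Return the names of activities whose policy timeout differs
    from the required value, so misconfigured ones are easy to spot."""
    bad = []
    for activity in pipeline["properties"]["activities"]:
        policy = activity.get("policy", {})
        if policy.get("timeout") != required:
            bad.append(activity["name"])
    return bad

# Abridged version of the pipeline JSON above
pipeline = {
    "name": "ADFTutorialPipelineOnPrem",
    "properties": {
        "activities": [
            {"name": "CopyFromSQLtoBlob",
             "policy": {"concurrency": 1, "timeout": "01:00:00"}}
        ]
    }
}
print(check_activity_timeouts(pipeline))  # empty list: nothing flagged
```

Running this over the JSON you actually deploy catches typos such as a missing policy block or the wrong key casing.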
The following sample assumes you have created a table "MyTable" in MySQL that contains a column called "timestampcolumn" for time-series data. Setting "external": true informs the Data Factory service that the table is external to the data factory and is not produced by an activity in the data factory:
{
    "name": "MySqlDataSet",
    "properties": {
        "published": false,
        "type": "RelationalTable",
        "linkedServiceName": "OnPremMySqlLinkedService",
        "typeProperties": {},
        "availability": {
            "frequency": "Hour",
            "interval": 1
        },
        "external": true,
        "policy": {
            "externalData": {
                "retryInterval": "00:01:00",
                "retryTimeout": "01:00:00",
                "maximumRetry": 3
            }
        }
    }
}
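It can be useful to estimate how long a slice could stay pending under this external-data policy. As a rough upper bound, assume every retry runs to retryTimeout and then waits retryInterval before the next attempt (an assumption for illustration, not the documented ADF retry semantics):

```python
from datetime import timedelta

def parse_span(s):
    """Parse an ADF hh:mm:ss timespan string into a timedelta."""
    h, m, sec = (int(p) for p in s.split(":"))
    return timedelta(hours=h, minutes=m, seconds=sec)

def worst_case_wait(external_data):
    """Upper-bound wait: every retry runs to retryTimeout, then
    pauses retryInterval before the next attempt (assumption)."""
    interval = parse_span(external_data["retryInterval"])
    timeout = parse_span(external_data["retryTimeout"])
    return external_data["maximumRetry"] * (timeout + interval)

policy = {"retryInterval": "00:01:00",
          "retryTimeout": "01:00:00",
          "maximumRetry": 3}
print(worst_case_wait(policy))  # 3 * (1h + 1min) = 3:03:00
```

With the values in the sample above, a slice could take a bit over three hours before the retries are exhausted, which is worth knowing when scheduling downstream activities.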
For more details about how to create a pipeline for Azure Data Factory, refer to this document.
For the whole tutorial on moving data from on-premises MySQL to Azure Data Factory, refer to this link.