
How to use a query in an ADF copy activity using the .NET SDK?

I have created a copy activity pipeline using the Portal UI, and it works fine with a query as the source. When I tried the same thing with the .NET SDK, I couldn't figure out how to use a query inside the copy activity. Can anyone help me with this?

The example below shows how to create a copy activity using the .NET SDK:

// Create a pipeline with a single copy activity
Console.WriteLine("Creating pipeline " + pipelineName + "...");
PipelineResource pipeline = new PipelineResource
{
    Activities = new List<Activity>
    {
        new CopyActivity
        {
            Name = "CopyFromBlobToCosmosDB",
            Inputs = new List<DatasetReference>
            {
                new DatasetReference
                {
                    ReferenceName = blobDatasetName
                }
            },
            Outputs = new List<DatasetReference>
            {
                new DatasetReference
                {
                    ReferenceName = cosmosDbDatasetName
                }
            },
            Source = new BlobSource { },
            Sink = new DocumentDbCollectionSink { }
        }
    }
};
client.Pipelines.CreateOrUpdate(resourceGroup, dataFactoryName, pipelineName, pipeline);
Console.WriteLine(SafeJsonConvert.SerializeObject(pipeline, client.SerializationSettings));

You can find the code and more information in this article by Sarathlal Saseendran.

For a query in a copy activity in .NET, it depends on what the source and sink of the copy activity are. There are a number of APIs for the different sources and sinks.
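To show where a query fits, here is a hedged sketch (a plain dict, with made-up dataset names and query) of the copy-activity JSON the SDK ultimately serializes. For SQL-style sources the query goes on the source via "sqlReaderQuery"; in the .NET SDK this corresponds to the SqlSource class and its SqlReaderQuery property. A blob source, as in the code above, has no query property at all.

```python
# Hedged sketch, not a definitive pipeline: dataset names and the
# query text below are hypothetical placeholders.
copy_activity = {
    "name": "CopyFromSqlToCosmosDB",
    "type": "Copy",
    "inputs": [{"referenceName": "SqlDataset", "type": "DatasetReference"}],
    "outputs": [{"referenceName": "CosmosDbDataset", "type": "DatasetReference"}],
    "typeProperties": {
        # A SQL-style source carries the query; a BlobSource does not.
        "source": {
            "type": "SqlSource",
            "sqlReaderQuery": "SELECT Id, Name FROM dbo.Customers",
        },
        "sink": {"type": "DocumentDbCollectionSink"},
    },
}

print(copy_activity["typeProperties"]["source"]["sqlReaderQuery"])
```

In the .NET SDK the equivalent would be assigning `Source = new SqlSource { SqlReaderQuery = "..." }` in place of the `BlobSource` shown earlier.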

API for MySQL as a sink:

class azure.mgmt.datafactory.models.AzureMySqlSink(*, additional_properties=None, write_batch_size=None, write_batch_timeout=None, sink_retry_count=None, sink_retry_wait=None, max_concurrent_connections=None, pre_copy_script=None, **kwargs)

Bases: azure.mgmt.datafactory.models._models_py3.CopySink

A copy activity Azure MySQL sink.

All required parameters must be populated in order to send to Azure.

Parameters:

additional_properties (dict[str, object]) – Unmatched properties from the message are deserialized into this collection.

write_batch_size (object) – Write batch size. Type: integer (or Expression with resultType integer), minimum: 0.

write_batch_timeout (object) – Write batch timeout. Type: string (or Expression with resultType string), pattern: ((\d+)\.)?(\d\d):(60|([0-5][0-9])):(60|([0-5][0-9])).

sink_retry_count (object) – Sink retry count. Type: integer (or Expression with resultType integer).

sink_retry_wait (object) – Sink retry wait. Type: string (or Expression with resultType string), pattern: ((\d+)\.)?(\d\d):(60|([0-5][0-9])):(60|([0-5][0-9])).

max_concurrent_connections (object) – The maximum concurrent connection count for the sink data store. Type: integer (or Expression with resultType integer).

type (str) – Required. Constant filled by server.

pre_copy_script (object) – A query to execute before starting the copy. Type: string (or Expression with resultType string).
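Tying this back to the question: for a MySQL sink, the only query the sink itself accepts is pre_copy_script, which runs once before the copy starts. A hedged sketch of the corresponding pipeline-JSON fragment (plain dict; the table name and values are hypothetical) mirroring the parameters documented above:

```python
# Hedged sketch of an AzureMySqlSink fragment; "preCopyScript" is the
# JSON counterpart of the pre_copy_script parameter documented above.
mysql_sink = {
    "type": "AzureMySqlSink",
    "preCopyScript": "TRUNCATE TABLE staging_orders",  # hypothetical table
    "writeBatchSize": 10000,
    "sinkRetryCount": 3,
    "sinkRetryWait": "00:00:30",  # matches the documented timespan pattern
}

print(mysql_sink["preCopyScript"])
```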

For more APIs, follow this link.
