How to perform pushdown optimization in Azure Data Factory for a Snowflake connection
Recently Microsoft launched the Snowflake connector for data flows in ADF. Is there any way to turn on pushdown optimization in ADF, so that when both my source and target are Snowflake, instead of pulling data out of the Snowflake environment it triggers a query inside Snowflake to do the work? In other words, a normal ELT process instead of ETL.
Let me know if you need more clarification.
As I understand it, the intent here is to fire a query from ADF against Snowflake data so that the data can be scrubbed (or something similar). The Lookup activity also supports Snowflake, and that should probably help you. My knowledge of Snowflake is limited, but I know you can call a stored procedure or query from the Lookup activity, and that should do the job.
https://docs.microsoft.com/en-us/azure/data-factory/control-flow-lookup-activity
"Lookup activity reads and returns the content of a configuration file or table. It also returns the result of executing a query or stored procedure. The output from Lookup activity can be used in a subsequent copy or transformation activity if it's a singleton value. The output can be used in a ForEach activity if it's an array of attributes."
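To make the idea concrete, here is a minimal sketch of the ELT pattern being described: compose a statement that Snowflake executes entirely in-warehouse (an `INSERT INTO ... SELECT`), so no data is pulled out through ADF. The table names and filter below are hypothetical placeholders; in practice you would put a statement like this (or a stored procedure wrapping it, since the Lookup activity expects a result set back) in the Lookup activity's query field.

```python
# Sketch only: composes an ELT-style statement for Snowflake to run in-warehouse.
# Table/column names are hypothetical; adapt to your own schema.

def build_elt_statement(source_table: str, target_table: str,
                        where_clause: str = "1=1") -> str:
    """Compose an INSERT ... SELECT that Snowflake executes entirely
    inside the warehouse, so no data leaves the Snowflake environment."""
    return (
        f"INSERT INTO {target_table} "
        f"SELECT * FROM {source_table} "
        f"WHERE {where_clause}"
    )

# Example: copy filtered rows between two (hypothetical) Snowflake tables.
stmt = build_elt_statement("RAW.ORDERS", "ANALYTICS.ORDERS",
                           "ORDER_DATE >= '2020-01-01'")
print(stmt)
```

Because the Lookup activity needs the statement to return something, a common approach is to wrap logic like this in a Snowflake stored procedure that performs the load and returns a single status value, which ADF can then inspect in subsequent activities.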