Extracting and Transforming Data from local MySQL to Azure Synapse Data Warehouse
I'm trying to set up a demo data warehouse in Azure Synapse. I would like to extract data from a local MySQL database, transform and aggregate some of it, and store it in fact and dimension tables in Azure Synapse Analytics.
Currently I have an instance of Azure SQL Data Warehouse and Data Factory. I created a connection to my MySQL database in Data Factory, and my thought was that I could use this connector as the input for a new Data Flow, which would transform the dataset and store it to my destination dataset, linked to my Azure Synapse data warehouse.
The problem is that Data Factory only supports certain Azure services, such as Azure Data Lake Storage or Azure SQL Database, as a source for a new Data Flow.
What would be the best practice for solving this problem? Should I create an instance of Azure SQL Database, copy the data from the local MySQL database into it, and then use that as the source for a new Data Flow?
Best practice here is to use the Copy activity in an ADF pipeline to land the data from MySQL as Parquet files in Blob Storage or ADLS Gen2, and then transform the data using Data Flows.
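To make the two-step pattern concrete, here is a rough sketch of the pipeline definition JSON this approach produces: a Copy activity that lands the MySQL table as Parquet, followed by a Data Flow activity that runs only after the copy succeeds. All dataset, linked service, and activity names (`MySqlOrders`, `OrdersParquet`, `OrdersDataFlow`, etc.) are placeholders, not names from the question.

```python
import json

# Sketch of an ADF pipeline: Copy activity lands MySQL data as Parquet
# in Blob/ADLS Gen2, then an Execute Data Flow activity transforms it.
# Dataset and Data Flow names are hypothetical placeholders.
pipeline = {
    "name": "LandAndTransformOrders",
    "properties": {
        "activities": [
            {
                "name": "CopyMySqlToParquet",
                "type": "Copy",
                "inputs": [{"referenceName": "MySqlOrders", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "OrdersParquet", "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "MySqlSource"},
                    "sink": {"type": "ParquetSink"},
                },
            },
            {
                "name": "TransformOrders",
                "type": "ExecuteDataFlow",
                # Run the Data Flow only after the copy has landed the files.
                "dependsOn": [
                    {
                        "activity": "CopyMySqlToParquet",
                        "dependencyConditions": ["Succeeded"],
                    }
                ],
                "typeProperties": {
                    "dataFlow": {
                        "referenceName": "OrdersDataFlow",
                        "type": "DataFlowReference",
                    }
                },
            },
        ]
    },
}

print(json.dumps(pipeline, indent=2))
```

The key point is the `dependsOn` entry: the Data Flow reads from the staged Parquet dataset (a supported Data Flow source), which sidesteps the limitation that the MySQL connector cannot feed a Data Flow directly.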