
Where to host a data ingestion ETL? Ingest input data (CSV file) automatically from Azure Blob Storage into Azure PostgreSQL

I would like to run a daily ingestion job that takes a CSV file from Blob Storage and loads it into a PostgreSQL database. I have the constraint of using Python. Which solution would you recommend for building/hosting my ETL?

Have a nice day :) Additional information: the CSV file is 1.35 GB with shape (1292532, 54). I will push only 12 of the 54 columns to the database.

You can use Azure Data Factory to achieve this. Create a Copy Data activity whose source is your CSV file and whose sink is the PostgreSQL database. In the Mapping settings, select only the columns you need. Finally, create a schedule trigger to run the pipeline daily.
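If you would rather stay in plain Python (for example a script run on a daily schedule from an Azure Function, a container, or a VM), the same pipeline can be sketched roughly as below. This is only a minimal illustration, not the Data Factory approach above: the connection strings, container/blob names, table name, and column list are hypothetical placeholders you would replace with your own.

```python
# Minimal sketch of the daily CSV -> PostgreSQL ingest in Python.
# All connection strings, names and columns below are placeholders.
import io

import pandas as pd
from azure.storage.blob import BlobServiceClient
from sqlalchemy import create_engine

BLOB_CONN_STR = "DefaultEndpointsProtocol=...;AccountName=...;AccountKey=..."  # hypothetical
PG_URL = "postgresql+psycopg2://user:password@myserver.postgres.database.azure.com:5432/mydb"  # hypothetical
WANTED_COLUMNS = ["col_a", "col_b", "col_c"]  # the 12 columns you actually need (placeholders)


def run_daily_ingest(container: str = "input", blob_name: str = "daily.csv") -> None:
    # Download the CSV from Blob Storage into an in-memory buffer.
    blob_client = BlobServiceClient.from_connection_string(
        BLOB_CONN_STR
    ).get_blob_client(container=container, blob=blob_name)
    raw = io.BytesIO(blob_client.download_blob().readall())

    engine = create_engine(PG_URL)

    # Read the large file in chunks and keep only the needed columns,
    # so the full 54-column frame never has to sit in memory at once.
    for chunk in pd.read_csv(raw, usecols=WANTED_COLUMNS, chunksize=100_000):
        chunk.to_sql("my_table", engine, if_exists="append", index=False)


if __name__ == "__main__":
    run_daily_ingest()
```

For a file of this size, the chunked read keeps memory usage modest; whichever host you choose (Function, container, or Data Factory), the column selection step is what brings 54 columns down to the 12 you actually load.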
