
How can I use multi-threading in Python to insert millions of rows from a text file into a MySQL database while avoiding duplicates?

I have a text file with millions of lines, and I need to insert the data into a MySQL database. I know that an INSERT will hold locks on the table until the commit completes. Is there any way to do this efficiently in Python?

I want a proper solution to this problem. The system will be configured well enough to handle the load, but I am not sure how to structure the work so the task completes in minutes instead of hours.
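For reference, here is a minimal sketch of the kind of approach I have been considering: a pool of worker threads, each with its own connection, draining batches from a queue and using INSERT IGNORE to skip duplicates. It assumes mysql-connector-python, a UNIQUE key on the target table, and placeholder table, column, and file names:

```python
import queue
import threading
import mysql.connector  # assumes mysql-connector-python is installed

BATCH_SIZE = 1000
NUM_WORKERS = 4
line_queue = queue.Queue(maxsize=10000)  # bounded so the file reader can't outrun the workers

def worker():
    # Each thread opens its own connection; connections are not thread-safe.
    conn = mysql.connector.connect(
        host="localhost", user="user", password="secret", database="mydb"
    )
    cur = conn.cursor()
    batch = []

    def flush():
        # INSERT IGNORE skips rows that violate the UNIQUE key,
        # which avoids duplicates without a separate lookup query.
        cur.executemany(
            "INSERT IGNORE INTO mytable (col1, col2) VALUES (%s, %s)", batch
        )
        conn.commit()
        batch.clear()

    while True:
        line = line_queue.get()
        if line is None:  # sentinel: flush what's left and exit
            break
        batch.append(tuple(line.rstrip("\n").split("\t")))
        if len(batch) >= BATCH_SIZE:
            flush()
    if batch:
        flush()
    conn.close()

threads = [threading.Thread(target=worker) for _ in range(NUM_WORKERS)]
for t in threads:
    t.start()

with open("data.txt", encoding="utf-8") as f:
    for line in f:
        line_queue.put(line)

for _ in threads:
    line_queue.put(None)  # one sentinel per worker
for t in threads:
    t.join()
```

Batching with executemany and committing once per batch keeps the number of round trips and commits low, which seems to matter more for throughput than the thread count itself.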

Any response will be greatly appreciated!

Thank you,

Would using PostgreSQL solve this problem? I believe PostgreSQL does not block most concurrent operations; refer to the link below:

https://www.citusdata.com/blog/2018/02/15/when-postgresql-blocks/
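If you do switch, a minimal sketch of a bulk load using psycopg2's COPY support might look like the following; the connection parameters, table, column, and file names are placeholders. COPY streams the whole file in one command, which is generally much faster than row-by-row INSERTs:

```python
import psycopg2  # assumes psycopg2 is installed

conn = psycopg2.connect("dbname=mydb user=user password=secret host=localhost")
with conn, conn.cursor() as cur:
    with open("data.txt", encoding="utf-8") as f:
        # COPY loads the file server-side in a single streaming command,
        # avoiding per-row statement overhead entirely.
        cur.copy_expert(
            "COPY mytable (col1, col2) FROM STDIN WITH (FORMAT text)", f
        )
```

Note that COPY itself does not deduplicate; one common pattern is to COPY into a staging table and then run INSERT ... SELECT ... ON CONFLICT DO NOTHING into the real table.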

If you can trust yourself to revert the isolation level afterwards, you can temporarily switch the session to READ UNCOMMITTED. That way no locks are applied to your tables on insert, and you can experience the raw throughput of your SQL server of choice, assuming it uses lock-based concurrency control.
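For example, in MySQL this can be set per session before the bulk load begins; a minimal sketch, assuming mysql-connector-python and reverting to MySQL's default REPEATABLE READ when done:

```python
import mysql.connector  # assumes mysql-connector-python is installed

conn = mysql.connector.connect(
    host="localhost", user="user", password="secret", database="mydb"
)
cur = conn.cursor()

# Relax the isolation level for this session only; remember to revert
# (or simply close the connection) once the bulk load is finished.
cur.execute("SET SESSION TRANSACTION ISOLATION LEVEL READ UNCOMMITTED")

# ... perform the bulk inserts here ...

# Restore MySQL's default isolation level for this session.
cur.execute("SET SESSION TRANSACTION ISOLATION LEVEL REPEATABLE READ")
conn.close()
```

Because the setting is session-scoped, closing the connection also discards it, so other sessions are unaffected either way.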
