
Doing a bulk copy of a dictionary list to Postgres using SQLAlchemy

I have a list of dictionaries, list_dict, like the one below:

[{u'default_bhp': False, u'price_period': u'Monthly'},{u'default_bhp': False, u'price_period': u'Yearly'}]

At the moment I am inserting this into my db using:

conn = engine.connect()
conn.execute(ModelClass.__table__.insert(), list_dict)

I just wanted to check if there is a faster way to insert data into the database, because I have a huge amount of data.

Can we use a bulk copy or something similar here?

How do I use the use_batch_mode functionality? Something like below:

engine = create_engine('postgresql+psycopg2://postgres:postgres@localhost/test_db', use_batch_mode=True)
conn = engine.connect()
conn.execute_batch(ModelClass.__table__.insert(), list_dict)
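
For reference: the connection has no execute_batch method. The flag (added in SQLAlchemy 1.2 for the psycopg2 dialect, renamed to executemany_mode='batch' in 1.3) only changes how the ordinary executemany path is performed under the hood, via psycopg2.extras.execute_batch. A minimal sketch, assuming SQLAlchemy 1.2:

from sqlalchemy import create_engine

# The calling code stays the same; use_batch_mode only affects how the
# dialect executes a multi-row insert internally.
engine = create_engine(
    'postgresql+psycopg2://postgres:postgres@localhost/test_db',
    use_batch_mode=True,
)
conn = engine.connect()
# Passing a list of dicts takes the executemany path, which the flag
# above routes through psycopg2's batch helper automatically.
conn.execute(ModelClass.__table__.insert(), list_dict)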

Consider using bulk_insert_mappings (if you don't know about it already); this is probably the closest to what you want to achieve.
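
A minimal sketch, assuming ModelClass is a declarative model mapped to the target table and engine is the one created above:

from sqlalchemy.orm import sessionmaker

Session = sessionmaker(bind=engine)
session = Session()

# Each dict supplies column values directly; no ORM instances are
# constructed, so most unit-of-work bookkeeping is skipped.
session.bulk_insert_mappings(ModelClass, list_dict)
session.commit()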

But if you actually have a lot of data, eagerly unpacking it all into one list of dicts may not be a good way to go, so you may need grouper/chunk management on top of bulk_insert_mappings, as sketched below.
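
A sketch of chunked inserts; dict_iterable and the chunk size of 10000 are hypothetical stand-ins for your actual row source and tuning:

from itertools import islice

def chunks(iterable, size):
    # Yield successive lists of at most `size` items, so the whole
    # data set never has to be materialized in memory at once.
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            break
        yield batch

# dict_iterable would be a generator of row dicts (hypothetical name)
for batch in chunks(dict_iterable, 10000):
    session.bulk_insert_mappings(ModelClass, batch)
    session.commit()  # commit per chunk to keep transactions bounded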
