
Bulk Insert Data from List of Dictionaries into Postgresql database [Faster Way]?

For example:

books = [{'name': 'pearson', 'price': 60, 'author': 'Jesse Pinkman'},
         {'name': 'ah publications', 'price': 80, 'author': 'Gus Fring'},
         {'name': 'euclidean', 'price': 120, 'author': 'Skyler White'},
         {'name': 'Nanjial', 'price': 260, 'author': 'Saul Goodman'}]

I need to insert each dictionary into an already created table, taking only 'author' and 'price', and I have about 100k records to insert. Right now I loop through the list of dictionaries, take the required key/value pair, and insert the rows one by one:

def insert_books(self, val):
    cur = self.con.cursor()
    # parameterized query: safer than string formatting and handles quoting
    sql = "insert into testtable (author, price) values (%s, %s)"
    cur.execute(sql, val)
    self.con.commit()   # committing once per row adds significant overhead
    cur.close()

for book in books:
    result = (book['author'], book['price'])
    db_g.insert_books(result)   # db_g is an instance of the connection class

So is there a faster and easier way to bulk insert the data, say 10k rows at a time?

I think a bulk insert that dumps the whole dataframe will be much faster. See: Why Bulk Import is faster than bunch of INSERTs?

import pandas as pd
import sqlalchemy

def db_conn():
    # placeholder connection string - replace with your own
    connection = sqlalchemy.create_engine('postgresql://user:password@host:5432/dbname')
    return connection


books = [{'name': 'pearson', 'price': 60, 'author': 'Jesse Pinkman'},
         {'name': 'ah publications', 'price': 80, 'author': 'Gus Fring'},
         {'name': 'euclidean', 'price': 120, 'author': 'Skyler White'},
         {'name': 'Nanjial', 'price': 260, 'author': 'Saul Goodman'}]

df_to_ingest = pd.DataFrame(books)
# keep only the columns the target table needs
df_to_ingest = df_to_ingest[['author', 'price']]

# to_sql writes the whole dataframe in one call instead of row-by-row INSERTs
df_to_ingest.to_sql('tablename', db_conn(), if_exists='append', index=False)
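
to_sql also accepts chunksize and method='multi' arguments, which batch many rows into each INSERT statement instead of emitting one statement per row; they are worth trying if the default is still slow. If you would rather stay on plain psycopg2 without pandas, here is a minimal sketch using psycopg2.extras.execute_values, which packs many rows into a single INSERT. The DSN is a placeholder, and the books list is the one from the question:

# Alternative sketch: batch insert with psycopg2's execute_values.
import psycopg2
from psycopg2.extras import execute_values

con = psycopg2.connect('dbname=test user=postgres')  # placeholder DSN

# build (author, price) tuples from the dictionaries
rows = [(b['author'], b['price']) for b in books]

with con.cursor() as cur:
    # page_size rows are packed into each generated INSERT statement
    execute_values(cur,
                   'insert into testtable (author, price) values %s',
                   rows,
                   page_size=10000)
con.commit()   # one commit for the whole batch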

Hope this helps.

