Bulk Insert Data from List of Dictionaries into Postgresql database [Faster Way]?
For example:
books = [{'name':'pearson', 'price':60, 'author':'Jesse Pinkman'},{'name':'ah publications', 'price':80, 'author':'Gus Fring'},{'name':'euclidean', 'price':120, 'author':'Skyler White'},{'name':'Nanjial', 'price':260, 'author':'Saul Goodman'}]
I need to insert each dictionary into an already created table, taking only the 'author' and 'price' keys. I have about 100k records to insert into the table. Right now I loop through the list of dictionaries, take the required key/value pair, and insert rows one by one:
def insert_books(self, val):
    cur = self.con.cursor()
    # parameterized query; the original used str.format on the value tuple,
    # which breaks on values containing quotes and allows SQL injection
    sql = "insert into testtable (author, price) values (%s, %s)"
    cur.execute(sql, val)
    self.con.commit()
    cur.close()
for i in books:
    result = i['author'], i['price']
    db_g.insert_books(result)  # db_g is the class instance holding the connection properties
So, is there a faster and easier way to bulk insert the data, say 10k rows at a time?
I think a bulk insert by dumping the whole dataframe will be much faster; see "Why Bulk Import is faster than bunch of INSERTs?" for the reasoning.
import pandas as pd
import sqlalchemy

def db_conn():
    # the connection string was elided in the original post
    connection = sqlalchemy.create_engine(connection_string)
    return connection
books = [{'name':'pearson', 'price':60, 'author':'Jesse Pinkman'},{'name':'ah publications', 'price':80, 'author':'Gus Fring'},{'name':'euclidean', 'price':120, 'author':'Skyler White'},{'name':'Nanjial', 'price':260, 'author':'Saul Goodman'}]
df_to_ingest = pd.DataFrame(books)
df_to_ingest = df_to_ingest[['author', 'price']]
df_to_ingest.to_sql('tablename', db_conn(), if_exists='append', index=False)
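To write in batches of 10k at a time, as the question asks, to_sql also accepts a chunksize (rows per round trip) and method='multi' (many rows per INSERT statement). A hedged variant of the call above, with the table name still a placeholder:

df_to_ingest.to_sql('tablename', db_conn(), if_exists='append', index=False,
                    chunksize=10000, method='multi')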
Hope this helps.
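For completeness, a pandas-free alternative that stays close to the question's own cursor code: psycopg2's execute_values helper packs many rows into each INSERT. A minimal sketch, assuming a plain psycopg2 connection con and the testtable(author, price) layout implied by the question (the function name is illustrative):

from psycopg2.extras import execute_values

def insert_books_bulk(con, books, page_size=10000):
    # build (author, price) tuples from the dictionaries
    rows = [(b['author'], b['price']) for b in books]
    with con.cursor() as cur:
        # the single %s placeholder is expanded into batches of row tuples,
        # one multi-row INSERT per page of page_size rows
        execute_values(cur,
                       "insert into testtable (author, price) values %s",
                       rows, page_size=page_size)
    con.commit()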