
Multiple inserts at a time using SQLAlchemy with Postgres

My current single insert code is:

def add_item(a_email, a_first, a_last, a_address, a_tag):
    query = db.insert(table_items).values(
        email=a_email, firstname=a_first, lastname=a_last,
        address=a_address, tag=a_tag)
    connection.execute(query)

Problem is, I'm inserting millions of entries and it's taking a LONG time doing it one by one. What is the best way to insert, say, 10 entries at a time? Say I have a list:

my_items = []
my_items.append(["test@test.com","John","Doe","1234 1st Street","Tag1"])

Pretend I appended 10 items into my_items, how can I get all of them into my Postgres database in one go?

Try using insert from sqlalchemy.dialects.postgresql. You pass it a list of dictionaries, with each key-value pair corresponding to a column name and that row's value. It converts the list into a single INSERT statement covering all your rows. An example would be:

from sqlalchemy.dialects.postgresql import insert
...
data = [{'column1': value1}, {'column1': value2}]
stmt = insert(table_items).values(data)
connection.execute(stmt)
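Putting this together with the asker's my_items list, here is a minimal runnable sketch. It uses an in-memory SQLite engine as a stand-in so it can run anywhere; with Postgres you would point create_engine at your postgresql:// URL instead. The table and column names follow the question; the multi-row insert happens in a single execute call:

```python
# Sketch: one multi-row INSERT for the whole list.
# SQLite is a stand-in here; swap in your Postgres URL in practice.
from sqlalchemy import (create_engine, MetaData, Table, Column,
                        String, insert)

engine = create_engine("sqlite://")  # e.g. "postgresql://user:pw@host/db"
metadata = MetaData()
table_items = Table(
    "items", metadata,
    Column("email", String),
    Column("firstname", String),
    Column("lastname", String),
    Column("address", String),
    Column("tag", String),
)
metadata.create_all(engine)

my_items = [
    ["test@test.com", "John", "Doe", "1234 1st Street", "Tag1"],
    ["jane@test.com", "Jane", "Roe", "5678 2nd Street", "Tag2"],
]

# Convert each row list into the dict shape insert() expects.
columns = ["email", "firstname", "lastname", "address", "tag"]
rows = [dict(zip(columns, item)) for item in my_items]

with engine.begin() as connection:
    connection.execute(insert(table_items), rows)  # all rows in one go

with engine.connect() as connection:
    result = connection.execute(table_items.select()).fetchall()
```

For millions of rows, execute the insert in chunks of a few thousand rows rather than one giant statement, to keep statement size and memory bounded.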

As mentioned in the comments, you can also use Postgres's COPY command, which is usually the fastest option for bulk loading.
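A sketch of the COPY approach, assuming psycopg2 as the driver: COPY streams rows from a file-like object into the table in one command. The buffer-building part runs as-is; the database calls are commented out because they need a live Postgres connection (the engine name is a placeholder):

```python
# Sketch: bulk load via COPY, assuming psycopg2 as the Postgres driver.
import csv
import io

my_items = [
    ["test@test.com", "John", "Doe", "1234 1st Street", "Tag1"],
]

# Build an in-memory CSV buffer from the row lists.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerows(my_items)
buf.seek(0)

# With a live Postgres connection this loads every row in one command:
# raw_conn = engine.raw_connection()  # engine is a placeholder here
# with raw_conn.cursor() as cur:
#     cur.copy_expert(
#         "COPY items (email, firstname, lastname, address, tag) "
#         "FROM STDIN WITH (FORMAT csv)",
#         buf,
#     )
# raw_conn.commit()
```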
