I am new to Python. I have a use case in which I have to parse a CSV file and insert its rows into a DB.
Here is my code:
import csv

with open(targetFileName, 'r') as csvfile:
    # creating a csv reader object
    csvreader = csv.reader(csvfile)
    for row in csvreader:
        print(row)
        with conn.cursor() as cur:
            cur.execute("insert into test (first, second) values (%s, %s)", row)
            conn.commit()
Here I am executing and committing the insert row by row. I want to commit in batches rather than after every single row. Is there any way to do this?
Haven't tested it, but can't you put the for loop inside the second with statement and move the commit to after the for loop?
with open(targetFileName, 'r') as csvfile:
    # creating a csv reader object
    csvreader = csv.reader(csvfile)
    with conn.cursor() as cur:
        for row in csvreader:
            print(row)
            cur.execute("insert into test (first, second) values (%s, %s)", row)
    conn.commit()
You could also consider using psycopg2's execute_batch helper, which sends many rows per round trip to the server:
http://initd.org/psycopg/docs/extras.html#fast-execution-helpers
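If you additionally want a commit every N rows (so a failure only loses the current batch, not the whole file), here is a minimal sketch. The helper name load_csv_in_batches and the batch_size parameter are my own inventions, not from the question; it uses the standard DB-API executemany so it works with any DB-API 2.0 connection, and with psycopg2 you could swap executemany for execute_batch from the link above.

```python
import csv

def load_csv_in_batches(conn, path, insert_sql, batch_size=100):
    """Insert CSV rows with executemany, committing once per batch.

    conn       -- any DB-API 2.0 connection (psycopg2, sqlite3, ...)
    insert_sql -- parameterized insert in the connection's paramstyle,
                  e.g. "insert into test (first, second) values (%s, %s)"
                  for psycopg2.
    """
    with open(path, newline='') as csvfile:
        reader = csv.reader(csvfile)
        batch = []
        for row in reader:
            batch.append(row)
            if len(batch) >= batch_size:
                with conn.cursor() as cur:
                    cur.executemany(insert_sql, batch)
                conn.commit()
                batch = []
        # flush the final, possibly short, batch
        if batch:
            with conn.cursor() as cur:
                cur.executemany(insert_sql, batch)
            conn.commit()
```

Note that sqlite3 cursors are not context managers, so with that driver you would call cur = conn.cursor() / cur.executemany(...) directly; the batching logic is the same.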