How to Read Data from CSV row by row and store in database
I have a csv file with 20 records that need to be stored in a table in a database. I am trying to read it row by row, but I am not sure how to pass each row to a function that checks the connection to the database and stores the data there. I have created a separate configuration file for the database connection object.
How should I read the csv row by row, pass each row of data to the function, store it, and do the same for every row of the csv? My code is below:
import pandas as pd
# `connect_db` is the connection object defined in my separate config file.

# This variable stores the insert query used to store data in the database
query = """INSERT INTO product(product_id, product_name, description, product_value)
           VALUES (%s, %s, %s, %s)"""

def create_product():
    df = pd.read_csv('path/to/csv')
    data_list = []
    # How do I write this part?
    # How do I pass an entire row in the function call, and what should
    # the argument be -- a list or something else?
    for row in df.iterrows():
        # print(row)
        input_data = ",".join(row)
        insert_data = output_data_to_DB(query, input_data, connect_db)  # calling the function
        data_list.append(insert_data)
    print(data_list)

# Called function
def output_data_to_DB(insert_query, output_list, conn):
    try:
        cur = conn.cursor()
        cur.execute(insert_query, output_list)
        print("row inserted with valueList: ", output_list)
        conn.commit()
        cur.close()
        return ""
    except Exception as e:
        conn.rollback()
        cur.close()
Any help would be appreciated. I am not very familiar with Python programming.
Reference: https://www.listendata.com/2019/06/pandas-read-csv.html
import pandas as pd

# read csv
data = pd.read_csv("your-file.csv")

# read csv, using the file's second line as the header
data = pd.read_csv("your-file.csv", header=1)

# read csv, skip the header line, and define the column names yourself
data = pd.read_csv("your-file.csv", skiprows=1, names=['product_id', 'product_name'])

for index, row in data.iterrows():
    print(row)
    # process the row values as you want, then pass them to the insert function
    res = output_data_to_DB(query, list(row), connect_db)
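A minimal end-to-end sketch of this pandas approach, using an in-memory sqlite3 database to stand in for the real connection (sqlite3 expects `?` placeholders where MySQL-style drivers expect `%s`); the temporary csv file and the `product` schema here are illustrative:

```python
import os
import sqlite3
import tempfile

import pandas as pd

# Build a small sample csv so the sketch is self-contained.
tmpdir = tempfile.mkdtemp()
csv_path = os.path.join(tmpdir, "products.csv")
with open(csv_path, "w") as f:
    f.write("product_id,product_name,description,product_value\n")
    f.write("1,Widget,A small widget,9.99\n")
    f.write("2,Gadget,A shiny gadget,19.99\n")

# sqlite3 uses `?` placeholders; MySQL/psycopg2-style drivers use `%s`.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE product(
    product_id INTEGER, product_name TEXT,
    description TEXT, product_value REAL)""")
query = """INSERT INTO product(product_id, product_name, description, product_value)
           VALUES (?, ?, ?, ?)"""

df = pd.read_csv(csv_path)
for index, row in df.iterrows():
    # `row` is a pandas Series; convert its numpy scalars to plain
    # Python values so the driver can bind them as parameters.
    params = [v.item() if hasattr(v, "item") else v for v in row]
    conn.execute(query, params)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM product").fetchone()[0]
print(count)
```

The `v.item()` conversion matters because sqlite3 cannot bind numpy integer or float scalars directly.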
The built-in `csv` library is enough for this, and makes it simpler to pass each row of data to the function.
import csv

def create_product():
    data_list = []
    with open('your-file.csv', newline='') as csvfile:
        reader = csv.reader(csvfile)
        next(reader)  # discard the header row
        for row in reader:
            print(row)  # e.g. `['foo', 'bar']`
            insert_data = output_data_to_DB(query, row, connect_db)
            data_list.append(insert_data)
    print(data_list)
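Under the same stand-in assumption (an in-memory sqlite3 database instead of `connect_db`, with an illustrative two-column table), the csv-module loop can also hand all rows to the driver at once with `executemany`:

```python
import csv
import io
import sqlite3

# An in-memory csv stands in for 'your-file.csv'.
csv_text = "product_id,product_name\n1,foo\n2,bar\n"
reader = csv.reader(io.StringIO(csv_text))
next(reader)             # discard the header row
rows = list(reader)      # [['1', 'foo'], ['2', 'bar']]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE product(product_id TEXT, product_name TEXT)")
# executemany runs the insert once per row in a single call.
conn.executemany(
    "INSERT INTO product(product_id, product_name) VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM product").fetchone()[0]
print(count)
```

Note that `csv.reader` yields every field as a string; convert types before inserting if the table schema requires it.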
Some options for adding an auto-incremented value to a column such as `id` are:
- a BULK INSERT that fills the `id` column
- the AUTOINCREMENT keyword on the `id` column
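A short sketch of the AUTOINCREMENT option using sqlite3 (the exact keyword varies by database: MySQL spells it AUTO_INCREMENT, and PostgreSQL uses SERIAL or an identity column); the table here is illustrative:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# INTEGER PRIMARY KEY AUTOINCREMENT lets the database assign ids itself;
# the inserts simply omit the id column.
conn.execute("""CREATE TABLE product(
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    product_name TEXT)""")
conn.execute("INSERT INTO product(product_name) VALUES (?)", ("foo",))
conn.execute("INSERT INTO product(product_name) VALUES (?)", ("bar",))
conn.commit()

ids = [r[0] for r in conn.execute("SELECT id FROM product ORDER BY id")]
print(ids)
```

With this in place, the csv only needs to supply the non-id columns and the database numbers each row on insert.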