How to Read Data from CSV row by row and store in database
I have a csv file with 20 records that need to be stored in a table in a database. I am trying to read it row by row, but I am not sure how to pass each row to a function that checks the connection to the database and stores the row in it. I created a separate config file for the database connection object.

How should I read the csv row by row, pass each row of data to the function, store it, and repeat this for every row of the csv? My code is below:
import pandas as pd

# This variable stores the insert query to store data in the database
query = """INSERT INTO product(product_id, product_name, description, product_value)
           values(%s, %s, %s, %s)"""

def create_product():
    data = pd.read_csv('path/to/csv')
    df = pd.DataFrame(data)
    data_list = []
    # How to write this part?
    # How will I pass an entire row in the function call, and what should
    # the argument be, like a list or something?
    for row in df.iterrows():
        # print(row)
        input_data = ",".join(row)
        insert_data = output_data_to_DB(query, input_data, connect_db)  # Calling function
        data_list.append(insert_data)
    print(data_list)

# Called function here
def output_data_to_DB(insert_query, output_list, conn):
    try:
        cur = conn.cursor()
        cur.execute(insert_query, output_list)
        print("row inserted with valueList : ", output_list)
        conn.commit()  # commit belongs to the connection, not the value list
        cur.close()
        return ""
    except Exception as e:
        conn.rollback()
        cur.close()
I would appreciate any help. I am not very familiar with Python.
Reference: https://www.listendata.com/2019/06/pandas-read-csv.html
import pandas as pd

# read csv
data = pd.read_csv("your-file.csv")

# read csv, skipping the first line and using the next one as the header
data = pd.read_csv("your-file.csv", header=1)

# read csv, skip the header line, and define column names
data = pd.read_csv("your-file.csv", skiprows=1, names=['product_id', 'product_name'])

# iterrows() yields (index, row) pairs, so unpack both
for index, row in data.iterrows():
    print(row)
    # process the row value as you want, e.g. convert it to a tuple
    res = output_data_to_DB(query, tuple(row), connect_db)
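As a side note, for a file this small a row-by-row loop is fine, but the same insert can be batched with executemany(). A minimal sketch, using the stdlib sqlite3 module as a stand-in for the asker's unnamed database (so the placeholders are "?" rather than the "%s" style shown in the question):

```python
import sqlite3
import pandas as pd

# A small DataFrame standing in for the 20-row csv from the question.
df = pd.DataFrame({
    "product_id": [1, 2],
    "product_name": ["foo", "bar"],
    "description": ["first", "second"],
    "product_value": [9.99, 19.99],
})

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE product(
    product_id INTEGER, product_name TEXT,
    description TEXT, product_value REAL)""")

# itertuples(index=False, name=None) yields plain tuples, one per row,
# which is exactly the shape executemany() expects.
rows = list(df.itertuples(index=False, name=None))
conn.executemany("INSERT INTO product VALUES (?, ?, ?, ?)", rows)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM product").fetchone()[0])
```

With a driver such as psycopg2 or mysql-connector the call is the same shape, only the placeholder style and connect() call differ.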
The built-in csv library is enough, and simpler, for passing each row of data to the function:
import csv

def create_product():
    data_list = []
    with open('your-file.csv', newline='') as csvfile:
        reader = csv.reader(csvfile)
        next(reader)  # discard header
        for row in reader:
            print(row)  # e.g. `['foo', 'bar']`
            insert_data = output_data_to_DB(query, row, connect_db)
            data_list.append(insert_data)
    print(data_list)
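Putting the pieces together, here is a self-contained sketch of the csv approach, with the commit/rollback bugs from the question's output_data_to_DB fixed: commit() and rollback() belong to the connection, and cur.close needs parentheses to actually be called. The stdlib sqlite3 module and an in-memory StringIO stand in for the real database and file:

```python
import csv
import io
import sqlite3

query = "INSERT INTO product VALUES (?, ?, ?, ?)"

def output_data_to_DB(insert_query, output_list, conn):
    cur = conn.cursor()
    try:
        cur.execute(insert_query, output_list)
        conn.commit()      # commit on the connection, not the value list
        return output_list
    except Exception:
        conn.rollback()    # roll back on the same connection
        raise
    finally:
        cur.close()        # note the parentheses

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE product(
    product_id INTEGER, product_name TEXT,
    description TEXT, product_value REAL)""")

# io.StringIO stands in for the csv file on disk.
csvfile = io.StringIO(
    "product_id,product_name,description,product_value\n"
    "1,foo,first,9.99\n"
    "2,bar,second,19.99\n")
reader = csv.reader(csvfile)
next(reader)               # discard header
for row in reader:
    output_data_to_DB(query, row, conn)

print(conn.execute("SELECT COUNT(*) FROM product").fetchone()[0])  # 2
```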
Some options for adding auto-incrementing values to a column such as id are: a BULK INSERT that supplies the id column, or the AUTOINCREMENT keyword.
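A small illustration of the AUTOINCREMENT idea, again using the stdlib sqlite3 module: declaring the id column as INTEGER PRIMARY KEY AUTOINCREMENT lets the database assign the value, so the INSERT omits the column entirely. (Other databases spell this differently, e.g. AUTO_INCREMENT in MySQL or SERIAL in PostgreSQL.)

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# id is assigned by the database; inserts never mention it.
conn.execute("""CREATE TABLE product(
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    product_name TEXT)""")
conn.executemany(
    "INSERT INTO product(product_name) VALUES (?)",
    [("foo",), ("bar",)])
conn.commit()
print(conn.execute("SELECT id, product_name FROM product").fetchall())
# [(1, 'foo'), (2, 'bar')]
```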