Python faster way to insert each row of dataframe as one table to mysql
I have a dataframe like this, with more than 300 rows:
name price percent volume Buy Sell
1 BID 41.30 -0.36 62292.0 604.0 6067.0
2 BVH 49.00 -1.01 57041.0 3786.0 3510.0
3 CTD 67.80 6.94 68098.0 2929.0 576.0
4 CTG 23.45 0.43 298677.0 16965.0 20367.0
5 EIB 18.20 -0.27 10517.0 306.0 210.0
For each name I create one table in MySQL. Here is my code so far:
vn30 = vn30_list.iloc[:, [10, 13, 12, 15, 25, 26]].dropna(how='all').fillna(0)
data = vn30_list.iloc[:, [13, 12, 15, 25, 26]].dropna(how='all').fillna(0)
data.columns = ['gia', 'percent', 'khoiluong', 'nnmua', 'nnban']
en = sa.create_engine('mysql+mysqlconnector://...', echo=True)

# insert into mysql, one table per name
for i in range(30):
    macp = vn30.iloc[i][0].lower()
    compare_item = vn30.iloc[i][1]
    if compare_item == data.iloc[i][0]:
        row = data.iloc[i:i + 1, :]
        row.to_sql(name=str(macp), con=en, if_exists="append",
                   index=False, schema="online")
Is there any way to make it faster for 300 rows? Thank you so much, and sorry for my English.
# import the module
from sqlalchemy import create_engine
# create sqlalchemy engine
engine = create_engine(
    "mysql+pymysql://{user}:{pw}@localhost/{db}".format(
        user="root", pw="12345", db="employee"))
# Insert whole DataFrame into MySQL
data.to_sql('book_details', con=engine, if_exists='append', chunksize=1000)
You can get all the details here: https://www.dataquest.io/blog/sql-insert-tutorial/
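If each name still needs its own table, you can keep the speed of a bulk `to_sql` while avoiding one call per row: group the frame by name once, then issue a single `to_sql` per table. This is a minimal sketch under assumptions: the frame and column names here are illustrative stand-ins for your `vn30`/`data`, and the SQLite in-memory engine is a placeholder for your real MySQL connection URL.

```python
import pandas as pd
from sqlalchemy import create_engine

# Illustrative frame standing in for the real data; columns are assumptions.
df = pd.DataFrame({
    'name': ['BID', 'BVH', 'CTD'],
    'gia': [41.30, 49.00, 67.80],
    'percent': [-0.36, -1.01, 6.94],
})

# Placeholder engine; swap in your MySQL URL, e.g.
# create_engine('mysql+mysqlconnector://user:pw@host/db')
engine = create_engine('sqlite://')

# One to_sql call per table instead of one per row: groupby collects all
# rows sharing a name, so N distinct names means N inserts, not N row-inserts.
for name, group in df.groupby('name'):
    group.drop(columns='name').to_sql(
        name.lower(), con=engine, if_exists='append', index=False)
```

With one row per name (as in the question) the call count is the same, but each call now carries a full batch whenever names repeat, and the loop no longer does the per-row `iloc` comparisons in Python.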