
Python - writing to a SQL Server database using sqlalchemy from a pandas DataFrame

I have a pandas DataFrame of approximately 300,000 rows (about 20 MB) and want to write it to a SQL Server database.

I have the following code, but it is very slow to execute. Is there a better way?

import pandas
import sqlalchemy

engine = sqlalchemy.create_engine(
    'mssql+pyodbc://rea-eqx-dwpb/BIWorkArea?driver=SQL+Server')

df.to_sql(name='LeadGen Imps&Clicks', con=engine, schema='BIWorkArea',
          if_exists='replace', index=False)
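
One commonly cited speed-up for the mssql+pyodbc dialect is to enable pyodbc's fast_executemany on the engine and write in chunks. A minimal sketch, assuming SQLAlchemy 1.3 or later and reusing the connection string and table name from the question:

import pandas as pd
import sqlalchemy

# Sketch only: fast_executemany batches INSERT parameters on the pyodbc
# side (requires SQLAlchemy >= 1.3 and the pyodbc driver).
engine = sqlalchemy.create_engine(
    'mssql+pyodbc://rea-eqx-dwpb/BIWorkArea?driver=SQL+Server',
    fast_executemany=True,
)

# df is the ~300,000-row DataFrame from the question; chunksize=10000 is
# only an illustrative batch size and may need tuning.
df.to_sql(name='LeadGen Imps&Clicks', con=engine, schema='BIWorkArea',
          if_exists='replace', index=False, chunksize=10000)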

If you want to speed up the process of writing to the SQL database, you can pre-set the column types of the target table based on the data types of your pandas DataFrame:

from sqlalchemy import types, create_engine

# Map each DataFrame column to an explicit SQLAlchemy column type
d = {}
for k, v in zip(df.dtypes.index, df.dtypes):
    if v == 'object':
        # Size the VARCHAR to the longest string in the column
        d[k] = types.VARCHAR(int(df[k].str.len().max()))
    elif v == 'float64':
        # SQL Server's FLOAT precision maxes out at 53
        d[k] = types.FLOAT(53)
    elif v == 'int64':
        d[k] = types.INTEGER()

Then pass the mapping to to_sql via the dtype argument:

df.to_sql(name='LeadGen Imps&Clicks', con=engine, schema='BIWorkArea',
          if_exists='replace', index=False, dtype=d)
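
For illustration, this is the kind of mapping the loop produces. The tiny DataFrame below is hypothetical (the column names and values are not from the question); it only shows what the dtype dictionary looks like before it is handed to to_sql:

import pandas as pd
from sqlalchemy import types

# Hypothetical sample data, just to demonstrate the resulting mapping
sample = pd.DataFrame({'campaign': ['AA', 'B', 'CCC'],
                       'impressions': [100, 250, 75],
                       'ctr': [0.012, 0.034, 0.007]})

d = {}
for k, v in zip(sample.dtypes.index, sample.dtypes):
    if v == 'object':
        d[k] = types.VARCHAR(int(sample[k].str.len().max()))
    elif v == 'float64':
        d[k] = types.FLOAT(53)
    elif v == 'int64':
        d[k] = types.INTEGER()

print(d)
# {'campaign': VARCHAR(length=3), 'impressions': INTEGER(), 'ctr': FLOAT(precision=53)}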
