
pandas DataFrame.to_sql and nan values

I am trying to use pandas DataFrame.to_sql to insert values into a table of my Postgres database. One integer column, which is not part of any constraint, contains some NaN values.

I get the following error:

sqlalchemy.exc.DataError: (DataError) integer out of range

When I substitute zeros for the NaN values, the insertion works as expected, so it really is the NaN values that are to blame for the error.

I have tried converting the NaN values to None and to np.nan, but I get the same error. So the question is: what NaN representation do I need so that to_sql handles it correctly?

My constraints are: Python 2.7, pandas 0.14.1, SQLAlchemy 0.9.8, Postgres 9.2.

The problem is with your pandas version: 0.14.1.

Starting with pandas 0.15, to_sql supports writing NaN values.

You can try upgrading your pandas.
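To illustrate the fixed behavior, here is a minimal sketch on a recent pandas: NaN values are written as SQL NULL and come back as NaN when read. An in-memory SQLite database stands in for Postgres here, and the table and column names are made up for the example; the NULL handling in to_sql is the same either way.

```python
import numpy as np
import pandas as pd
from sqlalchemy import create_engine

# In-memory SQLite stands in for Postgres; to_sql's NaN -> NULL
# behavior (pandas >= 0.15) does not depend on the backend.
engine = create_engine("sqlite://")

# "scores" / "score" are illustrative names, not from the question.
df = pd.DataFrame({"id": [1, 2, 3], "score": [10.0, np.nan, 30.0]})
df.to_sql("scores", engine, index=False)

# The NaN row was stored as NULL; pandas reads NULL back as NaN.
back = pd.read_sql("SELECT * FROM scores", engine)
```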

Also, for now you can convert NaN to None like this:

df = df.astype(object).where(pd.notnull(df), None)
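As a minimal sketch of that workaround (the column names here are illustrative, not from the question): casting to object first matters, because a float column cannot hold None, while an object column can.

```python
import numpy as np
import pandas as pd

# An integer column that pandas stores as float because of the NaN.
df = pd.DataFrame({"user_id": [1, 2, 3], "age": [25.0, np.nan, 40.0]})

# Cast to object so the cells can hold None, then replace every
# null cell (where pd.notnull is False) with None.
clean = df.astype(object).where(pd.notnull(df), None)
```

The resulting None values are passed to the database driver as SQL NULL, which is what Postgres expects for a missing integer.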
