
Pandas dataframe to PostgreSQL table using psycopg2 without SQLAlchemy?

I'd like to write a Pandas dataframe to a PostgreSQL table without using SQLAlchemy.

The table name should correspond to the pandas variable name, and the table should be replaced if it already exists. Data types need to match as well.

I'd like to avoid pandas' to_sql function, which relies on SQLAlchemy, for several reasons.

import pandas as pd
from getpass import getpass
import psycopg2

your_pass = getpass(prompt='Password: ', stream=None)
conn_cred = {
    'host': your_host,
    'port': your_port,
    'dbname': your_dbname,
    'user': your_user,
    'password': your_pass
}
conn = psycopg2.connect(**conn_cred)
conn.autocommit = True

my_data = pd.DataFrame({'col1': [1, 2], 'col2': [3, 4]})

def store_dataframe_to_postgre(df, schema, active_conn):
    # df = pandas dataframe to store as a table
    # schema = schema for the table
    # active_conn = open connection to a PostgreSQL db
    # ...
    # Bonus: require explicit commit here, even though conn.autocommit = True


store_dataframe_to_postgre(my_data, 'my_schema', conn)

This should be the result in the Postgres db:

SELECT * FROM my_schema.my_data;
   col1  col2
     1     3
     2     4
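One way to fill in the stub is sketched below. Two assumptions to flag: the table name is passed as an explicit parameter (recovering the Python variable name `my_data` at runtime is fragile), and schema/table/column identifiers are trusted (real code should compose them safely, e.g. with `psycopg2.sql.Identifier`). The dtype mapping `DTYPE_MAP` and the helper `build_create_table_sql` are hypothetical names introduced here, not part of any library.

```python
import io
import pandas as pd

# Minimal mapping from pandas dtypes to PostgreSQL column types;
# an assumption-laden starting point, extend it for your own data.
DTYPE_MAP = {
    'int64': 'BIGINT',
    'int32': 'INTEGER',
    'float64': 'DOUBLE PRECISION',
    'bool': 'BOOLEAN',
    'datetime64[ns]': 'TIMESTAMP',
    'object': 'TEXT',
}

def build_create_table_sql(df, schema, table):
    # DROP then CREATE so an existing table is replaced, per the question.
    cols = ', '.join(
        f'{col} {DTYPE_MAP.get(str(dtype), "TEXT")}'
        for col, dtype in df.dtypes.items()
    )
    return (f'DROP TABLE IF EXISTS {schema}.{table}; '
            f'CREATE TABLE {schema}.{table} ({cols});')

def store_dataframe_to_postgre(df, schema, active_conn, table_name):
    # df          = pandas dataframe to store as a table
    # schema      = schema for the table
    # active_conn = open connection to a PostgreSQL db
    # table_name  = passed explicitly in this sketch (see lead-in)
    with active_conn.cursor() as cur:
        cur.execute(build_create_table_sql(df, schema, table_name))
        buf = io.StringIO()
        df.to_csv(buf, index=False, header=False)
        buf.seek(0)
        # copy_expert accepts a schema-qualified target; copy_from would
        # quote 'schema.table' as one identifier in psycopg2 >= 2.9.
        cur.copy_expert(
            f'COPY {schema}.{table_name} FROM STDIN WITH (FORMAT csv)', buf
        )
    active_conn.commit()  # explicit commit, as the bonus asks


# store_dataframe_to_postgre(my_data, 'my_schema', conn, 'my_data')
```

The explicit `active_conn.commit()` satisfies the bonus requirement even when `conn.autocommit` is off; with `autocommit = True` it is simply a no-op.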

You can try `copy_from` in your code; note that it copies from a file-like object into an existing table:

 cursor = conn.cursor()
 cursor.copy_from(csv_buffer, 'my_data', null='', sep=',', columns=('col1', 'col2'))

where `csv_buffer` is a file-like object (e.g. `io.StringIO`) holding the dataframe serialized as CSV.

reference code: copy dataframe to postgres table with column that has default value
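The snippet above can be made runnable as follows (a sketch: `conn` is assumed to be an open psycopg2 connection, and the target table `my_data` is assumed to exist already, so that part is left commented out):

```python
import io
import pandas as pd

# Hypothetical dataframe standing in for `my_data` above.
df = pd.DataFrame({'col1': [1, 2], 'col2': [3, 4]})

# copy_from reads from a file-like object, not a dataframe,
# so serialize the rows as CSV first.
csv_buffer = io.StringIO()
df.to_csv(csv_buffer, index=False, header=False)
csv_buffer.seek(0)

# With an open connection `conn` (not created here) and an
# existing target table:
# cursor = conn.cursor()
# cursor.copy_from(csv_buffer, 'my_data', sep=',', null='',
#                  columns=('col1', 'col2'))
```

Note that from psycopg2 2.9 onward, `copy_from` quotes the table name as a single identifier, so a schema-qualified target like `my_schema.my_data` needs `cursor.copy_expert` with an explicit `COPY ... FROM STDIN` statement instead.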
