
Import CSV file data into a PostgreSQL table using Python

I have a CSV file which contains 60,000 rows. I need to insert this data into a PostgreSQL database table. Is there a way to reduce the time it takes to insert the data from the file into the database, without looping over each row? Please help me. Python version: 2.6

Database: postgres
Table: keys_data

File structure:
1,ED2,'FDFDFDFDF','NULL'
2,ED2,'SDFSDFDF','NULL'

Postgres can read a CSV file directly into a table with the COPY command. This either requires you to be able to place the file directly on the Postgres server, or the data can be piped over a connection with COPY FROM STDIN.

The \copy command in Postgres' psql command-line client will read a file locally and insert it using COPY FROM STDIN, so that's probably the easiest (and still fastest) way to do this.

Note: this doesn't require any use of Python; it's native functionality in Postgres, and most other RDBMSs don't offer the same thing.
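If you do want to drive this from Python, the psycopg2 driver exposes COPY FROM STDIN through cursor.copy_expert. A minimal sketch is below; the connection parameters and the file name keys_data.csv are assumptions, not values from the question:

import psycopg2

# Placeholder connection details; adjust for your environment.
conn = psycopg2.connect(host="localhost", dbname="mydb",
                        user="user", password="password")
cur = conn.cursor()

# Stream the local file straight into the table with COPY FROM STDIN.
# The equivalent command in psql would be:
#   \copy keys_data FROM 'keys_data.csv' WITH CSV
with open("keys_data.csv") as f:
    cur.copy_expert("COPY keys_data FROM STDIN WITH CSV", f)

conn.commit()
cur.close()
conn.close()

This avoids a row-by-row INSERT loop entirely, which is where most of the time goes for 60,000 rows.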

I've performed a similar task; the only difference is that my solution is Python 3.x based. I'm sure you can find the equivalent code for your Python version. The code is fairly self-explanatory.

import io

import pandas as pd
from sqlalchemy import create_engine

def insert_in_postgre(table_name, df):

    # Create an engine object pointing at the target database
    engine = create_engine('postgresql+psycopg2://user:password@hostname/database_name')

    # Create (or replace) the table from the DataFrame's header only
    df.head(0).to_sql(table_name, engine, if_exists='replace', index=False)

    # Dump the DataFrame to an in-memory buffer and bulk-load it with COPY
    conn = engine.raw_connection()
    cur = conn.cursor()
    output = io.StringIO()
    df.to_csv(output, sep='\t', header=False, index=False)
    output.seek(0)
    cur.copy_from(output, table_name, null="")
    conn.commit()
    cur.close()
    conn.close()
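For example, the 60,000-row file could be loaded like this; the file path and column names are assumptions based on the sample rows above:

df = pd.read_csv(
    'keys_data.csv',
    header=None,
    names=['id', 'code', 'value', 'extra'],  # hypothetical column names
)
insert_in_postgre('keys_data', df)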
