
Python PostgreSQL table creation using pg_dump

I'm trying to replicate the table structure of an existing PostgreSQL database. I currently have two databases, A and B. I used the "pg_dump -s" command to generate the SQL script that creates A's tables. In my Python script, I have the following code to create the same structure in database B:

import psycopg2

con2 = psycopg2.connect(database="Archives", user="postgres", password="root", host="127.0.0.1", port="5432")

dbexe = """
--
-- PostgreSQL database dump
--

-- Dumped from database version 9.6.0
-- Dumped by pg_dump version 9.6.0

SET statement_timeout = 0;
SET lock_timeout = 0;
SET idle_in_transaction_session_timeout = 0;
SET client_encoding = 'UTF8';
SET standard_conforming_strings = on;
SET check_function_bodies = false;
SET client_min_messages = warning;
SET row_security = off;

--
-- Name: plpgsql; Type: EXTENSION; Schema: -; Owner: 
--

CREATE EXTENSION IF NOT EXISTS plpgsql WITH SCHEMA pg_catalog;


--
-- Name: EXTENSION plpgsql; Type: COMMENT; Schema: -; Owner: 
--
several thousand more lines ...."""

cur2 = con2.cursor()
cur2.execute(dbexe)
print("done")

The code seems to run fine, but when I check database B I don't see any tables. Any suggestions as to why this is happening?

You must commit the transaction at the end: psycopg2 opens a transaction for you, and any uncommitted changes are rolled back and lost when the connection closes. Note that commit() is a method of the connection, not the cursor:

con2.commit()
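Putting it together, a minimal sketch of the corrected flow (reusing the connection parameters from the question, with a stand-in statement where the pg_dump output would go):

import psycopg2

con2 = psycopg2.connect(database="Archives", user="postgres",
                        password="root", host="127.0.0.1", port="5432")

dbexe = """
CREATE TABLE IF NOT EXISTS example (id integer);  -- stand-in for the pg_dump -s output
"""

cur2 = con2.cursor()
cur2.execute(dbexe)   # runs inside an open transaction
con2.commit()         # without this, the changes are rolled back on disconnect
cur2.close()
con2.close()
print("done")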

Alternatively, you can set autocommit to True right after connecting, so that each statement is committed to the database automatically:

con2.autocommit = True
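A third option with psycopg2 is to use the connection as a context manager, which commits the transaction when the block succeeds and rolls it back on an exception (note that it does not close the connection):

with con2:
    with con2.cursor() as cur2:
        cur2.execute(dbexe)
con2.close()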
