
Export all the tables in the database into a single csv file in python

I've been trying to export all the tables in my database into a single csv file.

I've tried

import MySQLdb as dbapi
import sys
import csv
import time


dbname      = 'site-local'
user        = 'root'
host        = '127.0.0.1'
password    = ''
date        = time.strftime("%d-%m-%Y")
file_name   = date+'-portal'

query='SELECT * FROM site-local;'  # <---- I'm stuck here

db=dbapi.connect(host=host,user=user,passwd=password)
cur=db.cursor()
cur.execute(query)
result=cur.fetchall()

c = csv.writer(open(file_name+'.csv','wb'))
c.writerow(result)

I'm a little stuck now; I hope someone can shed some light based on what I have.

Using python isn't the way to go about this. The simplest solution is to use SELECT INTO OUTFILE. It can dump even very large tables into CSV format quickly, without the need for CSV writers in your python code.
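For example, for a single table (a sketch only: mytable and the /tmp path are placeholders; the file is written on the database server itself, so the MySQL user needs the FILE privilege and the path must be permitted by secure_file_priv):

SELECT * FROM mytable
INTO OUTFILE '/tmp/mytable.csv'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n';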

From your other questions, I understand that the reason for your dumping is to re-import into postgresql. If that were not so, you could simply have used the mysqldump command to dump out the entire database at once.

If you want to dump each table in CSV format, it does call for a bit of code. Create a python loop to iterate through all the tables, and then execute a SELECT INTO OUTFILE query on each of those tables, as sketched below.
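A rough sketch of that loop, reusing the connection settings from the question (untested; the /tmp output directory is an assumption and must be writable by the MySQL server):

db = dbapi.connect(host=host, user=user, passwd=password, db=dbname)
cur = db.cursor()

cur.execute("SHOW TABLES")
for (table,) in cur.fetchall():
    # One server-side CSV file per table
    cur.execute(
        "SELECT * FROM `{0}` INTO OUTFILE '/tmp/{0}.csv' "
        "FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"' "
        "LINES TERMINATED BY '\\n'".format(table)
    )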

Consider iteratively exporting the SHOW CREATE TABLE output (txt files) and the SELECT * FROM output (csv files) for all database tables. From your related earlier questions, since you need to migrate databases, you can then run the create table statements (adjusting the MySQL syntax for PostgreSQL, such as removing the ENGINE=InnoDB lines) and then import the data from the csv files using PostgreSQL's COPY command. The csv files below include the table column headers, which fetchall() does not return.

import csv
import MySQLdb as dbapi

# Reuses the connection settings from the question; note db=dbname,
# which gives SHOW TABLES a default database to run against
db = dbapi.connect(host=host, user=user, passwd=password, db=dbname)
cur = db.cursor()

# RETRIEVE TABLES
cur.execute("SHOW TABLES")
tables = [row[0] for row in cur.fetchall()]

for t in tables:
    # CREATE TABLE STATEMENTS
    cur.execute("SHOW CREATE TABLE `{}`".format(t))
    temptxt = '{}_table.txt'.format(t)

    with open(temptxt, 'w') as txtfile:
        txtfile.write(cur.fetchone()[1])                   # ONE RECORD FETCH

    # SELECT STATEMENTS
    cur.execute("SELECT * FROM `{}`".format(t))
    tempcsv = '{}_data.csv'.format(t)

    with open(tempcsv, 'w', newline='') as csvfile:
        writer = csv.writer(csvfile)
        writer.writerow([i[0] for i in cur.description])   # COLUMN HEADERS
        for row in cur.fetchall():
            writer.writerow(row)

cur.close()
db.close()
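On the PostgreSQL side, each generated csv (with its header row) can then be loaded with COPY. A minimal sketch, assuming the psycopg2 driver, that the target tables were already created from the adjusted SHOW CREATE TABLE output, and a hypothetical target database named site_local:

import psycopg2

pg = psycopg2.connect(host='127.0.0.1', dbname='site_local',
                      user='postgres', password='')
pgcur = pg.cursor()

for t in tables:    # same table list collected above
    with open('{}_data.csv'.format(t)) as csvfile:
        # CSV HEADER makes COPY skip the column-header row written above
        pgcur.copy_expert('COPY "{}" FROM STDIN WITH CSV HEADER'.format(t),
                          csvfile)

pg.commit()
pgcur.close()
pg.close()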
