
Django-like unit testing database in plain python

I have a Python project that uses PostgreSQL. I would like Django-like unit tests, where the database is created and destroyed for every test. However, I don't want to use SQLAlchemy. I tried something along these lines:

import os
import psycopg2
from psycopg2.extensions import ISOLATION_LEVEL_AUTOCOMMIT
from unittest import TestCase

# POSTGRES_HOST, POSTGRES_DB, POSTGRES_USER and POSTGRES_PASSWORD are imported settings
pg = psycopg2.connect(
    "host={} dbname={} user={} password={}".format(
        POSTGRES_HOST, 'postgres', POSTGRES_USER, POSTGRES_PASSWORD))

pg.set_isolation_level(ISOLATION_LEVEL_AUTOCOMMIT)  # DROP/CREATE DATABASE cannot run inside a transaction
cur = pg.cursor()

def reset_db():
    cur.execute('DROP DATABASE IF EXISTS {}'.format(POSTGRES_DB))
    cur.execute('CREATE DATABASE {}'.format(POSTGRES_DB))

    newconn = psycopg2.connect(
        "host={} dbname={} user={} password={}".format(
            POSTGRES_HOST, POSTGRES_DB, POSTGRES_USER, POSTGRES_PASSWORD))

    newcur = newconn.cursor()

    # SCHEMAS is an imported dict containing schema creation instructions
    for schema in SCHEMAS:
        newcur.execute(SCHEMAS[schema])

    return newcur

class Test(TestCase):
    def setUp(self):
        os.environ['testing'] = 'true'
        self.cur = reset_db()

The setUp method then sets an environment variable that tells my database layer to use the testing DB.
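Roughly, the switch in the database layer looks something like this (a simplified sketch; get_connection and POSTGRES_DEFAULT_DB are placeholder names, not the real ones):

def get_connection():
    # POSTGRES_DEFAULT_DB stands in for whatever the non-testing database
    # is called; POSTGRES_DB is the testing database created by reset_db().
    dbname = POSTGRES_DB if os.environ.get('testing') == 'true' else POSTGRES_DEFAULT_DB
    return psycopg2.connect(
        "host={} dbname={} user={} password={}".format(
            POSTGRES_HOST, dbname, POSTGRES_USER, POSTGRES_PASSWORD))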

This seems to work fine. The only problem is that reset_db() takes about 0.8 seconds per test, which is far too slow.

Are there better approaches, or ways to optimise mine?

Recreating the DB after each test case is quite an expensive operation.

Perhaps creating your DB once at the beginning and then only deleting all data from all tables after each test case would be one possible solution?
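A minimal sketch of that idea, assuming all your test tables live in the public schema and that TRUNCATE ... CASCADE is acceptable for your foreign keys (truncate_all_tables is an illustrative helper, not part of your code):

def truncate_all_tables(cur):
    # Collect every table in the public schema and wipe them in one statement.
    cur.execute("SELECT tablename FROM pg_tables WHERE schemaname = 'public'")
    tables = [row[0] for row in cur.fetchall()]
    if tables:
        cur.execute('TRUNCATE {} RESTART IDENTITY CASCADE'.format(
            ', '.join('"{}"'.format(t) for t in tables)))

class Test(TestCase):
    @classmethod
    def setUpClass(cls):
        cls.cur = reset_db()  # build the schema once for the whole test class

    def setUp(self):
        os.environ['testing'] = 'true'
        truncate_all_tables(self.cur)  # cheap wipe before every test

Emptying tables is much cheaper than dropping and recreating the database, since the schema and the connection survive across tests.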

You could try what Django does: start a transaction before each test and roll it back after the test, undoing any database changes made during it.
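A minimal sketch of that idea, assuming the test and the code under test share a single psycopg2 connection (uncommitted changes are only visible on the connection that made them):

class Test(TestCase):
    @classmethod
    def setUpClass(cls):
        # One connection for the whole test class; each test runs inside
        # its own transaction on this connection.
        cls.conn = psycopg2.connect(
            "host={} dbname={} user={} password={}".format(
                POSTGRES_HOST, POSTGRES_DB, POSTGRES_USER, POSTGRES_PASSWORD))

    def setUp(self):
        self.cur = self.conn.cursor()  # a transaction starts with the first statement

    def tearDown(self):
        self.conn.rollback()  # undo everything the test wrote
        self.cur.close()

Rolling back is typically even faster than truncating, but it only works if your database layer can be pointed at this shared connection during tests.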
