
Running postgres 9.5 and django in one container for CI (Bamboo)

I am trying to configure a CI job on Bamboo for a Django app whose tests rely on a database (postgres 9.5). It seems that a prudent way to go about it is to run the whole test suite in a docker container, as I do not control the agent environment and so cannot install Postgres there.

Most guides I found recommend running postgres and django in two separate containers and using docker-compose to manage them easily. In this scenario each docker image runs just one service, started with CMD. In Bamboo, however, I cannot use docker-compose; I need to use just one image, so I am trying to get Postgres and Django to run nicely together in one container, but with little success so far.

My problem is that I see no easy way to start Postgres as a service inside docker but NOT as the docker CMD command. The official postgres image uses an entrypoint.sh approach, also described in the official docker docs, but it is not clear to me how to implement that. I would appreciate your help!

Well, basically you would start postgres as a background process in the docker-entrypoint shell script that otherwise starts your django application.

The only trick here is that you need to put a trap command in the script so that a shutdown/kill is sent to the background process when your main process stops.
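The pattern looks roughly like this, as a minimal sketch of such a docker-entrypoint.sh. The actual postgres invocation and the main command are shown only as comments because the real paths depend on your image; a stand-in background process illustrates the trap mechanics:

```shell
#!/bin/sh
# Sketch of a docker-entrypoint.sh: run the database in the background,
# then shut it down cleanly when the main process exits.

# Stand-in for the real service start, e.g. (assumed path):
#   gosu postgres postgres -D /var/lib/postgresql/data &
sleep 30 &
PG_PID=$!

# Forward termination to the background service; 'wait' lets it
# finish its shutdown before the entrypoint exits.
cleanup() {
    kill -TERM "$PG_PID" 2>/dev/null
    wait "$PG_PID" 2>/dev/null || true
}
trap cleanup TERM INT EXIT

# The main process would run in the foreground here, e.g.:
#   python manage.py test
```

When the main command returns (or the container receives SIGTERM), the trap fires and the background database gets a clean shutdown instead of being killed abruptly.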

Although I have done that a thousand times, I know that it is a common source of programming errors. In general I just use my docker-systemctl-replacement, which takes care of running multiple applications as services, as if the container were a virtual machine hosting multiple applications.

Your only other option is to add a startup script in your Dockerfile, or kick it off as part of your docker run ... commands. We don't generally use the "Docker" tasks, as I find them ... distasteful (which is also why I usually just fall back to a "Script" task and call docker run directly in that script).

Anyway, you'd have to have your Docker container execute a script that would:

  1. Start up Postgres (like a sudo systemctl start postgresql)
  2. Execute your tests.
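Those two steps could be sketched as a small script baked into the image. The pg_ctl path, data directory, and user names below are assumptions (Debian-style layout) to adapt to your image; the functions are shown without being invoked:

```shell
#!/bin/sh
# Hypothetical run-tests.sh for the CI container.
set -e

start_db() {
    # Step 1: start Postgres 9.5; on other images this might be
    # 'service postgresql start' run as root instead.
    su postgres -c '/usr/lib/postgresql/9.5/bin/pg_ctl \
        -D /var/lib/postgresql/9.5/main -w start'
}

run_tests() {
    # Step 2: run the test suite against the now-running database.
    python manage.py test
}

# In the real container the script would end with:
#   start_db
#   run_tests
```

The -w flag makes pg_ctl wait until the server is ready to accept connections, so the tests don't race against the database startup.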

Your Dockerfile will have to install Postgresql and do some minor setup work, I imagine (like creating the relevant users and databases with the proper owner). And since we're all good citizens, we remember never to run our containers as root, right?
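A minimal sketch of such a Dockerfile, modeled on the PostgreSQL example in the official docker docs. The base image, package versions, and the django_ci names are assumptions for illustration:

```dockerfile
FROM ubuntu:16.04

# Install Postgres 9.5 and the Python stack the test suite needs
RUN apt-get update && apt-get install -y \
    postgresql-9.5 postgresql-contrib-9.5 python-pip

# Create the test role and database; the Debian init script can be
# run as the postgres user because it delegates to pg_ctlcluster
USER postgres
RUN /etc/init.d/postgresql start && \
    psql --command "CREATE USER django_ci WITH PASSWORD 'django_ci';" && \
    createdb -O django_ci django_ci

# USER postgres persists, so the tests don't run as root
COPY . /app
WORKDIR /app
CMD ["/app/run-tests.sh"]
```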

Note - you can always hack around getting two containers to talk to each other without using docker-compose. It's a bit less convenient, but you could do something like:

docker run --detach --cidfile=db_cidfile --name ci_db postgresql_image
...
docker run --link ci_db testing_image

Make sure that you EXPOSE the right ports on the postgresql image to the testing_image container.
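With the link in place, the database is reachable from inside testing_image under the alias given to --link. For example, a Django settings module could read connection details from the environment (the variable names here are assumptions, not something docker sets for you):

```shell
# Inside testing_image: 'ci_db' resolves to the linked container,
# and 5432 is the port the postgresql image EXPOSEs by default.
export DATABASE_HOST=ci_db
export DATABASE_PORT=5432
echo "db at $DATABASE_HOST:$DATABASE_PORT"
```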

EDIT: Looking more at my specific case - we just install Postgresql into a base CentOS host rather than use the default postgresql image (using yum install http://yum.postgresql.org/..../pgdg-centos...rpm and then installing the postgresql-server and postgresql-contrib packages from there). There is a CMD [ "/usr/pgsql-ver/bin/postgres", "-D", "/var/lib/pgsql/ver/data"] in our Dockerfile, too. We don't do anything fancy with the docker container, though. NOTE: we don't use this in production at all; this is strictly for local and CI testing.
