
Docker-Compose Workflow, docker-compose down?

I am learning Docker and have a working docker-compose setup with Django/PostgreSQL. Everything works as expected. My question is: what is considered "best practice" for data persistence, and what is the risk to the data?

Here is my full docker-compose.yml:

version: '2'
services:
  db:
    image: postgres
    volumes:
      - postgresql:/var/lib/postgresql
    ports:
      - "5432:5432"
    env_file: .env
  web:
    build: .
    command: python run_server.py
    volumes:
      - .:/project
    ports:
      - "8000:8000"
    depends_on:
      - db
volumes:
  postgresql:

The run_server.py script simply checks that the database can be connected to and then runs python manage.py runserver.
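For context, here is a minimal sketch of what such a script might look like; the retry loop, environment variable names, and defaults are illustrative assumptions, not the original file:

import os
import subprocess
import sys
import time

import psycopg2

# Hypothetical reconstruction of run_server.py: poll Postgres until it
# accepts a connection, then hand off to the Django development server.
def wait_for_db(retries=30, delay=1):
    for _ in range(retries):
        try:
            psycopg2.connect(
                host=os.environ.get("POSTGRES_HOST", "db"),
                user=os.environ.get("POSTGRES_USER", "postgres"),
                password=os.environ.get("POSTGRES_PASSWORD", ""),
                dbname=os.environ.get("POSTGRES_DB", "postgres"),
            ).close()
            return
        except psycopg2.OperationalError:
            time.sleep(delay)
    sys.exit("database never became reachable")

if __name__ == "__main__":
    wait_for_db()
    # Bind to 0.0.0.0 so the server is reachable from outside the container.
    subprocess.run(
        ["python", "manage.py", "runserver", "0.0.0.0:8000"], check=True
    )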

So if I stop my containers and restart them, the data persists. My concern lies with the docker-compose down command: running it deletes the database. Is this intended? It seems like it would be very easy to run this command and accidentally do a lot of damage.

Is there a way so that the database persists even if these containers are removed?

I followed this guide for integrating Django with Docker:

https://docs.docker.com/compose/django/

The way to make the data "persist" is to set up the database outside the Docker image and let the app connect to it via settings.py.

With this trick, when the container goes down, the database persists because it lives outside that container.
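For illustration, the settings.py entry for such an external database might look like the following; the host name, database name, and credentials are placeholders, not values from the original project:

import os

# Illustrative settings.py fragment -- every value below is a placeholder.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "myproject",
        "USER": "myuser",
        "PASSWORD": os.environ.get("DB_PASSWORD", ""),
        "HOST": "db.example.com",  # database running outside Docker
        "PORT": "5432",
    }
}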

Another trick is to run the database inside a separate Docker container, as sketched below.
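For the separate-container approach, a named volume mounted at Postgres's actual data directory is what usually makes the data survive docker-compose down: plain down does not remove named volumes (only down -v does). One likely reason the data disappeared in the setup above is that the volume was mounted at /var/lib/postgresql while the postgres image keeps its data in /var/lib/postgresql/data, so the data actually landed in an anonymous, per-container volume. A sketch, assuming the stock postgres image (the volume name is arbitrary):

version: '2'
services:
  db:
    image: postgres
    volumes:
      - postgresql_data:/var/lib/postgresql/data  # mount at the data dir itself
    env_file: .env
volumes:
  postgresql_data:
    external: true  # create once with `docker volume create postgresql_data`;
                    # Compose will then never remove this volume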
