
Best practices for working with docker-compose on AWS ECS for Continuous Deployment

I'm new to ECS and I'm somewhat confused about how to deploy automatically to AWS ECS Fargate from a docker-compose file with multiple services.

I was able to go end-to-end from a git push to the deployment of a single container with the following steps:

  1. Create an AWS ECR repository
  2. Tag the Docker image
  3. Create a CodeCommit repository
  4. Create a CodeBuild project
  5. Create a CodeDeploy application
  6. Create a Cluster with a Task Definition
  7. Create a Pipeline to tie all of the above together and automate it end to end.
  8. Done
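The tag-and-push portion of those steps (2 and the push to ECR) can be sketched as a small shell script, e.g. inside a CodeBuild buildspec. All of the values below are placeholders, not my real account details:

```shell
# All values below are placeholders -- substitute your own account,
# region, repository name and tag.
ACCOUNT_ID="123456789012"
REGION="us-east-1"
REPO="test_cd_django"
TAG="${IMAGE_TAG:-latest}"

# Fully qualified ECR image URI that the tag/push steps operate on.
ECR_URI="${ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com/${REPO}"

# The actual pipeline commands (shown for reference; they need AWS
# credentials and a running Docker daemon, so they are commented out here):
#   aws ecr get-login-password --region "$REGION" \
#     | docker login --username AWS --password-stdin "${ACCOUNT_ID}.dkr.ecr.${REGION}.amazonaws.com"
#   docker build -t "${ECR_URI}:${TAG}" .
#   docker push "${ECR_URI}:${TAG}"

echo "${ECR_URI}:${TAG}"
```

CodeBuild exposes the commit hash as an environment variable, so the same script can tag each build uniquely instead of always using latest.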

But what happens when you have multiple services?

  1. Do I have to modify the docker-compose file to be compatible with ECS? If so, how can I separate the repository if the entire project is in a folder (pydanny cookiecutter structure)?
  2. Do I have to create an ECR repository for each service in my docker-compose?
  3. What are the steps to automate the tag and push of each ECR and then its respective deploy to achieve the complete End-to-End process?
  4. How can I modify the volumes of the docker-compose to work on ECS?

I use the following docker-compose file, generated by the pydanny cookiecutter, and it has 7 services:

Django + Postgres + Redis + Celery + Celeryworker + Celerybeat + Flower

docker-compose.yml

version: '3'

volumes:
  local_postgres_data: {}
  local_postgres_data_backups: {}

services:
  django: &django
    build:
      context: .
      dockerfile: ./compose/local/django/Dockerfile
    image: test_cd_django
    depends_on:
      - postgres
    volumes:
      - .:/app
    env_file:
      - ./.envs/.local/.django
      - ./.envs/.local/.postgres
    ports:
      - "8000:8000"
    command: /start

  postgres:
    build:
      context: .
      dockerfile: ./compose/production/postgres/Dockerfile
    image: test_cd_postgres
    volumes:
      - local_postgres_data:/var/lib/postgresql/data
      - local_postgres_data_backups:/backups
    env_file:
      - ./.envs/.local/.postgres

  redis:
    image: redis:3.2

  celeryworker:
    <<: *django
    image: test_cd_celeryworker
    depends_on:
      - redis
      - postgres

    ports: []
    command: /start-celeryworker

  celerybeat:
    <<: *django
    image: test_cd_celerybeat
    depends_on:
      - redis
      - postgres

    ports: []
    command: /start-celerybeat

  flower:
    <<: *django
    image: test_cd_flower
    ports:
      - "5555:5555"
    command: /start-flower

Thank you very much for any help.

It depends on whether you want to use your docker-compose file to perform all the operations. If you want to build, push and pull using docker-compose, the image blocks in docker-compose.yml will need to match the ECR address, e.g.:

image: ${ID}.dkr.ecr.${region}.amazonaws.com/${image_name}:${image_tag:-latest}

Do I have to create an ECR repository for each service of my docker-compose?

You don't have to create an ECR repository for each service, but for each image you build. In your case, you don't have to create a repo for redis, but you will for django and postgres, since you're building those from your own Dockerfiles. celeryworker, celerybeat and flower start from the django image, so you won't need extra repos for them.
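Under that reasoning, only the services with a build: block in the compose file above need a repository. A small sketch (repository names are illustrative, and the actual AWS CLI call is shown as a comment since it needs credentials):

```shell
# Of the 7 services, only those built from a Dockerfile need their own
# ECR repository: redis is pulled from Docker Hub, and the celery/flower
# services reuse the django image.
BUILT_REPOS="test_cd_django test_cd_postgres"

for repo in $BUILT_REPOS; do
  # One-time setup per repository (requires AWS credentials):
  #   aws ecr create-repository --repository-name "$repo"
  echo "ECR repository needed: $repo"
done
```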

What are the steps to automate the tag and push of each ECR and then its respective deploy to achieve the complete End-to-End process?

Here I can only offer suggestions; it all depends on your setup. I tend to stay as cloud-service-agnostic as possible. You can define the images in docker-compose.yml as follows:

services:
  postgres:
    image: ${ID}.dkr.ecr.${region}.amazonaws.com/my_postgres:${image_tag:-latest}
  django:
    image: <theID>.dkr.ecr.<theRegion>.amazonaws.com/my_django:${image_tag:-latest}

and then simply generate a .env file on the fly during the build containing the info you need, e.g.

image_tag=1.2.0
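Generating that .env during the build can be as simple as the sketch below; deriving the tag from git metadata is just one option (a build number or a version file would work equally well):

```shell
# Derive a tag from git metadata when available, falling back to "latest",
# and write the .env file that docker-compose will interpolate.
image_tag="$(git describe --tags --always 2>/dev/null || echo latest)"
printf 'image_tag=%s\n' "$image_tag" > .env
cat .env
```

docker-compose automatically reads .env from the project directory, so ${image_tag:-latest} in the compose file resolves to whatever the build wrote there.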

How can I modify the volumes of the docker-compose to work on ECS?

Unfortunately I can't answer this one from experience, but I found the following answer: https://devops.stackexchange.com/questions/6228/using-volumes-on-aws-fargate
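For what it's worth, the host-path bind mount (.:/app) in the compose file won't carry over to Fargate at all, and Fargate task storage is ephemeral by default. Fargate platform version 1.4+ can mount Amazon EFS filesystems through the task definition; a minimal sketch of that part of a task definition, with a placeholder filesystem ID:

```json
{
  "volumes": [
    {
      "name": "postgres_data",
      "efsVolumeConfiguration": {
        "fileSystemId": "fs-12345678"
      }
    }
  ]
}
```

Containers then reference the volume by name in their mountPoints. For Postgres specifically, a managed RDS instance is usually a better fit than running the database as a Fargate task.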
