GitHub Actions: connecting a Postgres service with a custom container image

In my Django project, I have a CI workflow for running tests, which requires a Postgres service. Recently a new app introduced heavier packages such as pandas, matplotlib, and pytorch, and this increased the run-tests job time from 2 to 12 minutes, which is absurd. My project also has a base Docker image with Python and these heavier packages preinstalled, to speed up image builds. So I was thinking of using this same image as the job container in the workflow, since the packages would already be installed.

Unfortunately, everything goes well until it reaches the step that actually runs the tests: it seems that the postgres service is not reachable from the container, and I get the following error:

psycopg2.OperationalError: could not connect to server: Connection refused
    Is the server running on host "localhost" (127.0.0.1) and accepting
    TCP/IP connections on port 5432?

This is my workflow right now. Any ideas on what I am doing wrong?

name: server-ci

on:
  pull_request:
      types: [opened]

env:
  DJANGO_SETTINGS_MODULE: settings_test

jobs:

  run-tests:
    name: Run tests

    runs-on: ubuntu-latest

    container:
      image: myimage/django-server:base
      credentials:
        username: ${{ secrets.DOCKERHUB_USERNAME }}
        password: ${{ secrets.DOCKERHUB_PASSWORD }}
      ports:
        - 8000:8000

    services:
      postgres:
        image: postgres
        env:
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: admin
          POSTGRES_DB: mydb
        ports:
          - 5432:5432
        options: --health-cmd pg_isready --health-interval 10s --health-timeout 5s --health-retries 5

    env:
      POSTGRES_HOST: localhost
      POSTGRES_PORT: 5432
      POSTGRES_PASSWORD: admin
      POSTGRES_USER: postgres

    steps:
      - name: Checkout repository
        uses: actions/checkout@v2

      - name: Cache dependencies
        id: cache  # needed so steps.cache.outputs.cache-hit below resolves
        uses: actions/cache@v2
        with:
          path: /opt/venv
          key: /opt/venv-${{ hashFiles('**/requirements.txt') }}

      - name: Install dependencies
        if: steps.cache.outputs.cache-hit != 'true'
        run: |
          python -m pip install --upgrade pip
          python -m pip install -r requirements.txt

      - name: Run tests
        run: |
          ./manage.py test --parallel --verbosity=2

It turns out that the workflow is now running in a container of its own, next to the postgres container. So the port mapping to the runner VM no longer does anything (it affects the host, not the Docker containers running on it).

The job and service containers get attached to the same Docker network, so all I need to do is change POSTGRES_HOST to postgres (the name of the service container) and Docker's DNS should do the rest.
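For reference, a minimal sketch of the corrected env block — the only change is the POSTGRES_HOST value; the service's ports mapping should also no longer be needed, since the job container reaches Postgres on its internal port 5432 over the shared network:

    env:
      POSTGRES_HOST: postgres  # name of the service container, resolved by Docker's DNS
      POSTGRES_PORT: 5432
      POSTGRES_PASSWORD: admin
      POSTGRES_USER: postgres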

Credits: https://github.community/t/connect-postgres-service-with-custom-container-image/189994/2?u=everspader
