
Connect to Postgres as SystemD service from docker-compose

I need to connect, from the containers defined in a docker-compose file, to a Postgres server instance that is running on the host as a SystemD service.

docker-compose containers ---> postgres as systemd

This is about setting up Airflow with an external Postgres DB that is on localhost.

I've taken the docker-compose example with:

curl -LfO 'https://airflow.apache.org/docs/apache-airflow/2.2.3/docker-compose.yaml'

However, that file defines a Postgres container, and Airflow connects to it by resolving the postgres hostname inside the Docker network.
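
For reference, the database service in that stock file looks roughly like the following (shortened; see the downloaded file for the exact definition):

postgres:
  # this is the containerized DB that I want to replace with the SystemD-managed instance
  image: postgres:13
  environment:
    POSTGRES_USER: airflow
    POSTGRES_PASSWORD: airflow
    POSTGRES_DB: airflow
  healthcheck:
    test: ["CMD", "pg_isready", "-U", "airflow"]
    interval: 5s
    retries: 5
  restart: always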

But I already have Postgres running on the machine via SystemD; I can check its status with:

# make sure the service is up and running
systemctl list-units --type=service | grep 'postgres.*12'
# check the process
ps aux | grep 'postgres.*12.*config_file'
# check the service details
systemctl status postgresql@12-main.service

AFAIU, inside the docker-compose YAML file I need to use the special hostname host.docker.internal so that the containers can find their way out of the Docker network and reach localhost on the host, where the SystemD services (e.g. Postgres) are listening.

I've setup the Airflow YAML file for docker-compose with:

---
version: '3'
x-airflow-common:
  &airflow-common
  image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:2.2.3}
  environment:
    &airflow-common-env
    AIRFLOW__CORE__EXECUTOR: LocalExecutor
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@host.docker.internal/airflow
    ...
  extra_hosts:
    - "host.docker.internal:host-gateway"

There's a lot of stuff going on there, but the point is that the SQLAlchemy connection string uses host.docker.internal as the host.

Now when I invoke docker-compose -f airflow-local.yaml up airflow-init, I see in the output logs that Airflow complains it cannot find the Postgres server:

airflow-init_1       | psycopg2.OperationalError: connection to server at "host.docker.internal" (172.17.0.1), port 5432 failed: Connection refused
airflow-init_1       |  Is the server running on that host and accepting TCP/IP connections?

It might be an issue with DNS resolution between the special Docker network and the OS network; I'm not sure how to troubleshoot this.
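
Judging from the log, the name does resolve (to 172.17.0.1, which is typically the Docker bridge gateway), so the refusal may be on the Postgres side rather than DNS. A few checks that could narrow it down (the paths assume a stock Debian/Ubuntu Postgres 12 install; adjust as needed):

# is anything listening on port 5432, and on which addresses?
sudo ss -tlnp | grep 5432
# Postgres only accepts TCP connections on the addresses listed in listen_addresses;
# the default 'localhost' does not cover the bridge address 172.17.0.1
sudo -u postgres psql -c 'SHOW listen_addresses;'
# client authentication also needs a pg_hba.conf rule covering the Docker subnet (e.g. 172.17.0.0/16)
grep -v '^#' /etc/postgresql/12/main/pg_hba.conf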

How do I make a Docker container reach SystemD services that are listening on localhost?

It turns out I just need to use network_mode: host in the YAML definition of the Docker container (a container is a "service" in docker-compose terminology).

This way the container shares the host's networking layer ("localhost" or "127.0.0.1") instead of getting its own address on the Docker virtual network. This setup is not encouraged by the Docker people, but sometimes things are messy when dealing with legacy systems, so you have to work around what has been done in the past.

Then you can use localhost to reach the Postgres DB running as a SystemD service.
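
For concreteness, a minimal sketch of where the setting goes (only the relevant keys are shown; the rest of the file stays as in the stock example):

x-airflow-common:
  &airflow-common
  image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:2.2.3}
  # every service that merges <<: *airflow-common now shares the host's network stack,
  # so the extra_hosts / host.docker.internal workaround above is no longer needed
  network_mode: host
  environment:
    &airflow-common-env
    ...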

The only caveat is that you cannot use port mappings together with network_mode: host; otherwise docker-compose complains with the error message:

"host" network_mode is incompatible with port_bindings

So you have to remove the part of the YAML that looks like:

ports:
  - 9999:8080

and sort out the ports (TCP sockets) in a different way.

In my specific scenario (Airflow stuff), I've done the following:

For the host/networking part that makes the Airflow webserver (Docker container/service) reach the Postgres DB (SystemD service/daemon) on localhost:

# see the use of "localhost"
AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@localhost/airflow

As for the TCP port, I specified it directly in the docker-compose YAML service definition for the Airflow webserver:

command: webserver -p 9999
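
Putting the pieces together, the webserver service definition ends up looking roughly like this (a sketch with only the relevant keys; 9999 is just the port I picked above):

airflow-webserver:
  <<: *airflow-common          # inherits network_mode: host and the environment block
  # no "ports:" mapping here; the listening port is chosen via the webserver flag instead
  command: webserver -p 9999
  restart: always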
