
Airflow + Kubernetes cluster + Virtualbox : Scheduler error “DB connection invalidated.”

I'm trying to test the Airflow KubernetesPodOperator following this guide https://kubernetes.io/blog/2018/06/28/airflow-on-kubernetes-part-1-a-different-kind-of-operator/ and the official repository https://github.com/apache/incubator-airflow.git .

I was able to deploy the Kubernetes cluster (./scripts/ci/kubernetes/kube/deploy.sh -d persistent_mode), but there seems to be an issue between the scheduler and the postgres container. The scheduler is not able to connect to postgres. From the logs:

    $ kubectl logs airflow-698ff6b8cd-gdr7f scheduler
[2019-02-24 21:06:20,529] {settings.py:175} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=1
[2019-02-24 21:06:20,830] {__init__.py:51} INFO - Using executor LocalExecutor
  ____________       _____________
 ____    |__( )_________  __/__  /________      __
____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
 _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
[2019-02-24 21:06:21,317] {jobs.py:1490} INFO - Starting the scheduler
[2019-02-24 21:06:21,317] {jobs.py:1498} INFO - Processing each file at most -1 times
[2019-02-24 21:06:21,317] {jobs.py:1501} INFO - Searching for files in /root/airflow/dags
[2019-02-24 21:06:21,547] {jobs.py:1503} INFO - There are 22 files in /root/airflow/dags
[2019-02-24 21:06:21,688] {jobs.py:1548} INFO - Resetting orphaned tasks for active dag runs
[2019-02-24 21:06:22,059] {dag_processing.py:514} INFO - Launched DagFileProcessorManager with pid: 39
[2019-02-24 21:06:22,183] {settings.py:51} INFO - Configured default timezone <Timezone [UTC]>
[2019-02-24 21:06:22,200] {settings.py:175} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=39
[2019-02-24 21:06:53,375] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
[2019-02-24 21:07:04,396] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
[2019-02-24 21:07:15,418] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
[2019-02-24 21:07:26,448] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
[2019-02-24 21:07:37,458] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
[2019-02-24 21:07:48,472] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
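
For reference, connectivity from inside the scheduler container can be checked roughly as below (the pod name is the one from the kubectl logs command above; the Python one-liner only needs the standard library, while the psql check assumes the client happens to be installed in the airflow image):

    # TCP-level check from the scheduler container to the postgres-airflow service
    kubectl exec -it airflow-698ff6b8cd-gdr7f -c scheduler -- \
      python -c "import socket; socket.create_connection(('postgres-airflow', 5432), timeout=5); print('TCP OK')"

    # Full login check, only if the psql client is present in the image
    kubectl exec -it airflow-698ff6b8cd-gdr7f -c scheduler -- \
      psql "postgresql://root:XXXX@postgres-airflow:5432/airflow" -c 'SELECT 1;'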

Here are the YAML files. airflow.yaml:

apiVersion: rbac.authorization.k8s.io/v1beta1
kind: ClusterRoleBinding
metadata:
  name: admin-rbac
subjects:
  - kind: ServiceAccount
    # Reference to upper's `metadata.name`
    name: default
    # Reference to upper's `metadata.namespace`
    namespace: default
roleRef:
  kind: ClusterRole
  name: cluster-admin
  apiGroup: rbac.authorization.k8s.io
---
apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  name: airflow
spec:
  replicas: 1
  template:
    metadata:
      labels:
        name: airflow
    spec:
      initContainers:
      - name: "init"
        image: airflow:latest
        imagePullPolicy: IfNotPresent
        volumeMounts:
        - name: airflow-configmap
          mountPath: /root/airflow/airflow.cfg
          subPath: airflow.cfg
        - name: airflow-dags
          mountPath: /root/airflow/dags
        - name: test-volume
          mountPath: /root/test_volume
        env:
        - name: SQL_ALCHEMY_CONN
          valueFrom:
            secretKeyRef:
              name: airflow-secrets
              key: sql_alchemy_conn
        command:
          - "bash"
        args:
          - "-cx"
          - "./tmp/airflow-test-env-init.sh"

      containers:
      - name: webserver
        image: airflow:latest
        imagePullPolicy: IfNotPresent
        ports:
        - name: webserver
          containerPort: 8080
        args: ["webserver"]
        env:
        - name: AIRFLOW_KUBE_NAMESPACE
          valueFrom:
            fieldRef:
              fieldPath: metadata.namespace
        - name: SQL_ALCHEMY_CONN
          valueFrom:
            secretKeyRef:
              name: airflow-secrets
              key: sql_alchemy_conn
        volumeMounts:
        - name: airflow-configmap
          mountPath: /root/airflow/airflow.cfg
          subPath: airflow.cfg
        - name: airflow-dags
          mountPath: /root/airflow/dags
        - name: airflow-logs
          mountPath: /root/airflow/logs
      - name: scheduler
        image: airflow:latest
        imagePullPolicy: IfNotPresent
        args: ["scheduler"]
        env:
        - name: AIRFLOW_KUBE_NAMESPACE
          valueFrom:
            fieldRef:
              fieldPath: metadata.namespace
        - name: SQL_ALCHEMY_CONN
          valueFrom:
            secretKeyRef:
              name: airflow-secrets
              key: sql_alchemy_conn
        volumeMounts:
        - name: airflow-configmap
          mountPath: /root/airflow/airflow.cfg
          subPath: airflow.cfg
        - name: airflow-dags
          mountPath: /root/airflow/dags
        - name: airflow-logs
          mountPath: /root/airflow/logs
      volumes:
      - name: airflow-dags
        persistentVolumeClaim:
          claimName: airflow-dags
      - name: airflow-dags-fake
        emptyDir: {}
      - name: airflow-dags-git
        emptyDir: {}
      - name: test-volume
        persistentVolumeClaim:
          claimName: test-volume
      - name: airflow-logs
        persistentVolumeClaim:
          claimName: airflow-logs
      - name: airflow-configmap
        configMap:
          name: airflow-configmap
---
apiVersion: v1
kind: Service
metadata:
  name: airflow
spec:
  type: NodePort
  ports:
    - port: 8080
      nodePort: 30809
  selector:
    name: airflow
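
Both containers read SQL_ALCHEMY_CONN from an airflow-secrets Secret that is not shown here. A minimal sketch of it, assuming the credentials and service name from postgres.yaml below (the repo's own secrets.yaml may store the value base64-encoded under data: instead of stringData:):

apiVersion: v1
kind: Secret
metadata:
  name: airflow-secrets
type: Opaque
stringData:
  # Assumed connection string, built from the user/password/db in postgres.yaml below
  sql_alchemy_conn: postgresql+psycopg2://root:XXXX@postgres-airflow:5432/airflow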

postgres.yaml:

kind: Deployment
apiVersion: extensions/v1beta1
metadata:
  name: postgres-airflow
spec:
  replicas: 1
  template:
    metadata:
      labels:
        name: postgres-airflow
    spec:
      restartPolicy: Always
      containers:
        - name: postgres
          image: postgres
          imagePullPolicy: IfNotPresent
          ports:
            - containerPort: 5432
              protocol: TCP
          volumeMounts:
            - name: dbvol
              mountPath: /var/lib/postgresql/data/pgdata
              subPath: pgdata
          env:
            - name: POSTGRES_USER
              value: root
            - name: POSTGRES_PASSWORD
              value: XXXX
            - name: POSTGRES_DB
              value: airflow
            - name: PGDATA
              value: /var/lib/postgresql/data/pgdata
            - name: POD_IP
              valueFrom: { fieldRef: { fieldPath: status.podIP } }
          livenessProbe:
            initialDelaySeconds: 60
            timeoutSeconds: 5
            failureThreshold: 5
            exec:
              command:
              - /bin/sh
              - -c
              - exec pg_isready --host $POD_IP || if [[ $(psql -qtAc --host $POD_IP 'SELECT pg_is_in_recovery') != "f" ]]; then exit 0; else exit 1; fi
          readinessProbe:
            initialDelaySeconds: 5
            timeoutSeconds: 5
            periodSeconds: 5
            exec:
              command:
              - /bin/sh
              - -c
              - exec pg_isready --host $POD_IP
          resources:
            requests:
              memory: .5Gi
              cpu: .5
      volumes:
        - name: dbvol
          emptyDir: {}
---
apiVersion: v1
kind: Service
metadata:
  name: postgres-airflow
spec:
  clusterIP: None
  ports:
    - port: 5432
      targetPort: 5432
  selector:
    name: postgres-airflow
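
Since the Service is headless (clusterIP: None), it may also be worth confirming that the database pod is ready and that the service has endpoints. A few commands like these (label and service names taken from the manifests above):

    kubectl get pods -l name=postgres-airflow     # pod should be Running and 1/1 Ready
    kubectl get endpoints postgres-airflow        # should list the pod IP on port 5432
    kubectl logs -l name=postgres-airflow         # look for "database system is ready to accept connections"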

I'm very new to Kubernetes, I would appreciate any help!

EDIT: I was able to check the initContainer logs, and it seems the connection between the pods is established. Also, if I open the Airflow web UI, I cannot see "Recent Tasks", "Dag Runs", nor the graph or tree view of the DAGs; just a loading spinner.

EDIT: Many thanks for your help. I found several bad responses from the webserver, like static/dist/ net::ERR_ABORTED 404 (NOT FOUND), so I suspected that the Docker image build had not finished successfully. Instead of building with python setup.py compile_assets sdist -q in ./scripts/ci/kubernetes/docker/compile.sh, I added RUN pip install --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org apache-airflow[celery,kubernetes,postgres,rabbitmq,ssh] to my Dockerfile (sketched below).
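
A rough sketch of that Dockerfile change (the base image below is only a placeholder, not the one the repo's Dockerfile actually uses; the rest of the build steps stay as they are):

    # Placeholder base image; the repo's Dockerfile keeps its own base and steps
    FROM python:3.6-slim
    # Install Airflow with the needed extras instead of building from local sources
    RUN pip install --trusted-host pypi.org --trusted-host files.pythonhosted.org \
        --trusted-host pypi.python.org \
        "apache-airflow[celery,kubernetes,postgres,rabbitmq,ssh]"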

I think this is because you are using LocalExecutor instead of KubernetesExecutor.
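
If you switch, the executor is set in the airflow.cfg that your airflow-configmap mounts into both containers. Roughly (only the relevant lines are shown; the rest of the file stays as the repo generates it):

apiVersion: v1
kind: ConfigMap
metadata:
  name: airflow-configmap
data:
  airflow.cfg: |
    [core]
    # ... other settings kept as the repo generates them ...
    executor = KubernetesExecutor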

I just followed the same tutorial and couldn't reproduce your issue:

[2019-03-01 16:07:22,053] {__init__.py:51} INFO - Using executor KubernetesExecutor
  ____________       _____________
 ____    |__( )_________  __/__  /________      __
____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
 _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
[2019-03-01 16:07:29,564] {security.py:458} INFO - Start syncing user roles.
[2019-03-01 16:07:30,317] {security.py:196} INFO - Existing permissions for the role:Viewer within the database will persist.
[2019-03-01 16:07:31,281] {security.py:196} INFO - Existing permissions for the role:User within the database will persist.
[2019-03-01 16:07:31,709] {security.py:196} INFO - Existing permissions for the role:Op within the database will persist.
[2019-03-01 16:07:31,717] {security.py:374} INFO - Fetching a set of all permission, view_menu from FAB meta-table
[2019-03-01 16:07:33,357] {security.py:324} INFO - Cleaning faulty perms
[2019-03-01 16:07:37,878] {settings.py:175} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=10
[2019-03-01 16:07:38 +0000] [10] [INFO] Starting gunicorn 19.9.0
[2019-03-01 16:07:38 +0000] [10] [INFO] Listening at: http://0.0.0.0:8080 (10)
[2019-03-01 16:07:38 +0000] [10] [INFO] Using worker: sync
[2019-03-01 16:07:38 +0000] [15] [INFO] Booting worker with pid: 15
[2019-03-01 16:07:38 +0000] [16] [INFO] Booting worker with pid: 16
[2019-03-01 16:07:38 +0000] [17] [INFO] Booting worker with pid: 17
[2019-03-01 16:07:38 +0000] [18] [INFO] Booting worker with pid: 18
Running the Gunicorn Server with:
Workers: 4 sync
Host: 0.0.0.0:8080
Timeout: 120
Logfiles: - -
=================================================================
[2019-03-01 16:07:57,078] {__init__.py:51} INFO - Using executor KubernetesExecutor
[2019-03-01 16:07:57,479] {__init__.py:51} INFO - Using executor KubernetesExecutor
[2019-03-01 16:07:57,490] {__init__.py:51} INFO - Using executor KubernetesExecutor
[2019-03-01 16:07:57,853] {__init__.py:51} INFO - Using executor KubernetesExecutor
[2019-03-01 16:08:02,411] {__init__.py:298} INFO - Filling up the DagBag from /root/airflow/dags
[2019-03-01 16:08:02,423] {__init__.py:298} INFO - Filling up the DagBag from /root/airflow/dags
[2019-03-01 16:08:02,420] {__init__.py:298} INFO - Filling up the DagBag from /root/airflow/dags
[2019-03-01 16:08:02,424] {__init__.py:298} INFO - Filling up the DagBag from /root/airflow/dags
[2019-03-01 16:08:08,480] {security.py:458} INFO - Start syncing user roles.
[2019-03-01 16:08:08,525] {security.py:458} INFO - Start syncing user roles.
[2019-03-01 16:08:09,250] {security.py:196} INFO - Existing permissions for the role:Viewer within the database will persist.

My setup:
minikube: v0.32.0
VirtualBox: 5.2.22
Apache Airflow built from commit b51712c
