
Airflow + Kubernetes cluster + Virtualbox: Scheduler error "DB connection invalidated."

I am trying to test the Airflow KubernetesPodOperator, following this guide https://kubernetes.io/blog/2018/06/28/airflow-on-kubernetes-part-1-a-different-kind-of-operator/ and the official repository https://github.com/apache/incubator-airflow.git

I was able to deploy the Kubernetes cluster (./scripts/ci/kubernetes/kube/deploy.sh -d persist-mode), but there seems to be a problem between the scheduler and the postgres containers. The scheduler fails to connect to postgres, per its logs:

    $ kubectl logs airflow-698ff6b8cd-gdr7f scheduler
[2019-02-24 21:06:20,529] {settings.py:175} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=1
[2019-02-24 21:06:20,830] {__init__.py:51} INFO - Using executor LocalExecutor
  ____________       _____________
 ____    |__( )_________  __/__  /________      __
____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
 _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
[2019-02-24 21:06:21,317] {jobs.py:1490} INFO - Starting the scheduler
[2019-02-24 21:06:21,317] {jobs.py:1498} INFO - Processing each file at most -1 times
[2019-02-24 21:06:21,317] {jobs.py:1501} INFO - Searching for files in /root/airflow/dags
[2019-02-24 21:06:21,547] {jobs.py:1503} INFO - There are 22 files in /root/airflow/dags
[2019-02-24 21:06:21,688] {jobs.py:1548} INFO - Resetting orphaned tasks for active dag runs
[2019-02-24 21:06:22,059] {dag_processing.py:514} INFO - Launched DagFileProcessorManager with pid: 39
[2019-02-24 21:06:22,183] {settings.py:51} INFO - Configured default timezone <Timezone [UTC]>
[2019-02-24 21:06:22,200] {settings.py:175} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=39
[2019-02-24 21:06:53,375] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
[2019-02-24 21:07:04,396] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
[2019-02-24 21:07:15,418] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
[2019-02-24 21:07:26,448] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
[2019-02-24 21:07:37,458] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
[2019-02-24 21:07:48,472] {sqlalchemy.py:81} WARNING - DB connection invalidated. Reconnecting...
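One way to narrow down a repeated "DB connection invalidated" warning is to check, from inside the scheduler pod, whether plain TCP connectivity to the Postgres host in SQL_ALCHEMY_CONN works at all. This is a hypothetical debugging sketch (the URI shown is an example, not the actual secret value):

```python
import socket
from urllib.parse import urlparse

def can_reach_db(conn_uri: str, timeout: float = 3.0) -> bool:
    """Try a plain TCP connect to the host/port named in a SQLAlchemy-style URI."""
    parsed = urlparse(conn_uri)
    host = parsed.hostname
    port = parsed.port or 5432  # Postgres default port
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # covers DNS failure, refusal, and timeout
        return False

# Example URI (hypothetical); run this from inside the scheduler pod,
# e.g. via kubectl exec <pod> -c scheduler -- python probe.py
uri = "postgresql+psycopg2://root:XXXX@postgres-airflow:5432/airflow"
print(can_reach_db(uri))
```

If this returns False inside the pod, the problem is network or DNS level (Service name, namespace), not Airflow itself.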

The yaml files are below. airflow.yaml:

apiVersion: rbac.authorization.k8s.io/v1beta1
kind: ClusterRoleBinding
metadata:
  name: admin-rbac
subjects:
  - kind: ServiceAccount
    # Reference to upper's `metadata.name`
    name: default
    # Reference to upper's `metadata.namespace`
    namespace: default
roleRef:
  kind: ClusterRole
  name: cluster-admin
  apiGroup: rbac.authorization.k8s.io
---
apiVersion: extensions/v1beta1
kind: Deployment
metadata:
  name: airflow
spec:
  replicas: 1
  template:
    metadata:
      labels:
        name: airflow
    spec:
      initContainers:
      - name: "init"
        image: airflow:latest
        imagePullPolicy: IfNotPresent
        volumeMounts:
        - name: airflow-configmap
          mountPath: /root/airflow/airflow.cfg
          subPath: airflow.cfg
        - name: airflow-dags
          mountPath: /root/airflow/dags
        - name: test-volume
          mountPath: /root/test_volume
        env:
        - name: SQL_ALCHEMY_CONN
          valueFrom:
            secretKeyRef:
              name: airflow-secrets
              key: sql_alchemy_conn
        command:
          - "bash"
        args:
          - "-cx"
          - "./tmp/airflow-test-env-init.sh"

      containers:
      - name: webserver
        image: airflow:latest
        imagePullPolicy: IfNotPresent
        ports:
        - name: webserver
          containerPort: 8080
        args: ["webserver"]
        env:
        - name: AIRFLOW_KUBE_NAMESPACE
          valueFrom:
            fieldRef:
              fieldPath: metadata.namespace
        - name: SQL_ALCHEMY_CONN
          valueFrom:
            secretKeyRef:
              name: airflow-secrets
              key: sql_alchemy_conn
        volumeMounts:
        - name: airflow-configmap
          mountPath: /root/airflow/airflow.cfg
          subPath: airflow.cfg
        - name: airflow-dags
          mountPath: /root/airflow/dags
        - name: airflow-logs
          mountPath: /root/airflow/logs
      - name: scheduler
        image: airflow:latest
        imagePullPolicy: IfNotPresent
        args: ["scheduler"]
        env:
        - name: AIRFLOW_KUBE_NAMESPACE
          valueFrom:
            fieldRef:
              fieldPath: metadata.namespace
        - name: SQL_ALCHEMY_CONN
          valueFrom:
            secretKeyRef:
              name: airflow-secrets
              key: sql_alchemy_conn
        volumeMounts:
        - name: airflow-configmap
          mountPath: /root/airflow/airflow.cfg
          subPath: airflow.cfg
        - name: airflow-dags
          mountPath: /root/airflow/dags
        - name: airflow-logs
          mountPath: /root/airflow/logs
      volumes:
      - name: airflow-dags
        persistentVolumeClaim:
          claimName: airflow-dags
      - name: airflow-dags-fake
        emptyDir: {}
      - name: airflow-dags-git
        emptyDir: {}
      - name: test-volume
        persistentVolumeClaim:
          claimName: test-volume
      - name: airflow-logs
        persistentVolumeClaim:
          claimName: airflow-logs
      - name: airflow-configmap
        configMap:
          name: airflow-configmap
---
apiVersion: v1
kind: Service
metadata:
  name: airflow
spec:
  type: NodePort
  ports:
    - port: 8080
      nodePort: 30809
  selector:
    name: airflow
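Another quick sanity check is to confirm that the sql_alchemy_conn value in the airflow-secrets secret actually points at the Postgres Service name. A sketch (the secret value below is a hypothetical example built inline; in practice it would come from kubectl get secret airflow-secrets -o jsonpath='{.data.sql_alchemy_conn}'):

```python
import base64
from urllib.parse import urlparse

def secret_db_host(b64_value: str) -> str:
    """Decode a base64-encoded secret value and return the DB host it targets."""
    uri = base64.b64decode(b64_value).decode()
    return urlparse(uri).hostname

# Hypothetical secret value, encoded here for illustration only
value = base64.b64encode(
    b"postgresql+psycopg2://root:XXXX@postgres-airflow:5432/airflow"
).decode()
print(secret_db_host(value))  # prints: postgres-airflow
```

The host printed should match the name of the Postgres Service in the same namespace.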

postgres.yaml:

kind: Deployment
apiVersion: extensions/v1beta1
metadata:
  name: postgres-airflow
spec:
  replicas: 1
  template:
    metadata:
      labels:
        name: postgres-airflow
    spec:
      restartPolicy: Always
      containers:
        - name: postgres
          image: postgres
          imagePullPolicy: IfNotPresent
          ports:
            - containerPort: 5432
              protocol: TCP
          volumeMounts:
            - name: dbvol
              mountPath: /var/lib/postgresql/data/pgdata
              subPath: pgdata
          env:
            - name: POSTGRES_USER
              value: root
            - name: POSTGRES_PASSWORD
              value: XXXX
            - name: POSTGRES_DB
              value: airflow
            - name: PGDATA
              value: /var/lib/postgresql/data/pgdata
            - name: POD_IP
              valueFrom: { fieldRef: { fieldPath: status.podIP } }
          livenessProbe:
            initialDelaySeconds: 60
            timeoutSeconds: 5
            failureThreshold: 5
            exec:
              command:
              - /bin/sh
              - -c
              - exec pg_isready --host $POD_IP || if [ "$(psql -qtAc --host $POD_IP 'SELECT pg_is_in_recovery()')" != "f" ]; then exit 0; else exit 1; fi
          readinessProbe:
            initialDelaySeconds: 5
            timeoutSeconds: 5
            periodSeconds: 5
            exec:
              command:
              - /bin/sh
              - -c
              - exec pg_isready --host $POD_IP
          resources:
            requests:
              memory: .5Gi
              cpu: .5
      volumes:
        - name: dbvol
          emptyDir: {}
---
apiVersion: v1
kind: Service
metadata:
  name: postgres-airflow
spec:
  clusterIP: None
  ports:
    - port: 5432
      targetPort: 5432
  selector:
    name: postgres-airflow
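For reference: since the Service above is headless (clusterIP: None), cluster DNS resolves the name postgres-airflow directly to the pod IP, so the sql_alchemy_conn value stored in airflow-secrets would typically look like the following (user, password, and database taken from the env block above; treat this as a sketch, not the actual secret):

```
postgresql+psycopg2://root:XXXX@postgres-airflow:5432/airflow
```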

I am new to Kubernetes, so any help would be appreciated!

EDIT: I was able to check the initContainer logs, and the connection between the pods seems to be established. Also, if I open the Airflow web UI, I cannot see Recent Tasks or DAG Runs, nor the graph view or tree view of any dag; I only see a circular loading image.

EDIT: Thanks a lot for your help. I found some error responses from the webserver, such as static/dist/ net::ERR_ABORTED 404 (NOT FOUND), so I think the Docker image build did not complete successfully. Instead of building with python setup.py compile_assets sdist -q in ./scripts/ci/kubernetes/docker/compile.sh, I added RUN pip install --trusted-host pypi.org --trusted-host files.pythonhosted.org --trusted-host pypi.python.org apache-airflow[celery,kubernetes,postgres,rabbitmq,ssh] to my Dockerfile.

I think this is because you are using the LocalExecutor instead of the KubernetesExecutor.
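If so, the fix would be to switch the executor in the airflow.cfg shipped via the airflow-configmap. A minimal sketch of the relevant section (only the executor key matters here; any other [core] keys stay as they are):

```
[core]
executor = KubernetesExecutor
```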

I just followed the same tutorial and could not reproduce your issue:

[2019-03-01 16:07:22,053] {__init__.py:51} INFO - Using executor KubernetesExecutor
  ____________       _____________
 ____    |__( )_________  __/__  /________      __
____  /| |_  /__  ___/_  /_ __  /_  __ \_ | /| / /
___  ___ |  / _  /   _  __/ _  / / /_/ /_ |/ |/ /
 _/_/  |_/_/  /_/    /_/    /_/  \____/____/|__/
[2019-03-01 16:07:29,564] {security.py:458} INFO - Start syncing user roles.
[2019-03-01 16:07:30,317] {security.py:196} INFO - Existing permissions for the role:Viewer within the database will persist.
[2019-03-01 16:07:31,281] {security.py:196} INFO - Existing permissions for the role:User within the database will persist.
[2019-03-01 16:07:31,709] {security.py:196} INFO - Existing permissions for the role:Op within the database will persist.
[2019-03-01 16:07:31,717] {security.py:374} INFO - Fetching a set of all permission, view_menu from FAB meta-table
[2019-03-01 16:07:33,357] {security.py:324} INFO - Cleaning faulty perms
[2019-03-01 16:07:37,878] {settings.py:175} INFO - settings.configure_orm(): Using pool settings. pool_size=5, pool_recycle=1800, pid=10
[2019-03-01 16:07:38 +0000] [10] [INFO] Starting gunicorn 19.9.0
[2019-03-01 16:07:38 +0000] [10] [INFO] Listening at: http://0.0.0.0:8080 (10)
[2019-03-01 16:07:38 +0000] [10] [INFO] Using worker: sync
[2019-03-01 16:07:38 +0000] [15] [INFO] Booting worker with pid: 15
[2019-03-01 16:07:38 +0000] [16] [INFO] Booting worker with pid: 16
[2019-03-01 16:07:38 +0000] [17] [INFO] Booting worker with pid: 17
[2019-03-01 16:07:38 +0000] [18] [INFO] Booting worker with pid: 18
Running the Gunicorn Server with:
Workers: 4 sync
Host: 0.0.0.0:8080
Timeout: 120
Logfiles: - -
=================================================================
[2019-03-01 16:07:57,078] {__init__.py:51} INFO - Using executor KubernetesExecutor
[2019-03-01 16:07:57,479] {__init__.py:51} INFO - Using executor KubernetesExecutor
[2019-03-01 16:07:57,490] {__init__.py:51} INFO - Using executor KubernetesExecutor
[2019-03-01 16:07:57,853] {__init__.py:51} INFO - Using executor KubernetesExecutor
[2019-03-01 16:08:02,411] {__init__.py:298} INFO - Filling up the DagBag from /root/airflow/dags
[2019-03-01 16:08:02,423] {__init__.py:298} INFO - Filling up the DagBag from /root/airflow/dags
[2019-03-01 16:08:02,420] {__init__.py:298} INFO - Filling up the DagBag from /root/airflow/dags
[2019-03-01 16:08:02,424] {__init__.py:298} INFO - Filling up the DagBag from /root/airflow/dags
[2019-03-01 16:08:08,480] {security.py:458} INFO - Start syncing user roles.
[2019-03-01 16:08:08,525] {security.py:458} INFO - Start syncing user roles.
[2019-03-01 16:08:09,250] {security.py:196} INFO - Existing permissions for the role:Viewer within the database will persist.

My setup:
minikube: v0.32.0
VirtualBox: 5.2.22
Apache Airflow built from commit: b51712c
