
Airflow GCP Connection Issue With Kubernetes - Fernet key must be 32 url-safe base64-encoded bytes

I am currently running Airflow on Kubernetes in Google Cloud (GCP). I based my project on docker-airflow. I am able to launch the UI, but when I try to create a connection for Google Cloud and submit it, I get the following error.

  •  ValueError: Fernet key must be 32 url-safe base64-encoded bytes. 
  •  [2018-09-21 19:45:13,345] AirflowException: Could not create Fernet object: Fernet key must be 32 url-safe base64-encoded bytes. 

The first thing the documentation suggests checking is that the crypto package is installed, which I did. I installed it two ways: as an Airflow extra and directly from PyPI.

pip3 install apache-airflow[kubernetes,crypto]

and also tried

pip install cryptography
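A quick way to verify (my own sketch, not from the Airflow docs) that the `cryptography` package actually landed in the Python environment Airflow runs under, without relying on pip's output:

```python
import importlib.util

def has_module(name: str) -> bool:
    """Return True if `name` is importable in this interpreter."""
    return importlib.util.find_spec(name) is not None

# Airflow's Fernet support needs the `cryptography` package:
status = "installed" if has_module("cryptography") else "MISSING"
print(f"cryptography: {status}")
```

Running this inside the container (e.g. via `kubectl exec`) rules out a mismatch between the pip that did the install and the python interpreter the webserver uses.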

I tried running the commands that generate and store the env variable, as described in the documentation found here. (shown below)

1) Generate a fernet key manually and add it to airflow.cfg

2) Set the environment variable and restart the server.

python -c "from cryptography.fernet import Fernet; print(Fernet.generate_key().decode())"

Example key: 81HqDtbqAywKSOumSha3BhWNOdQ26slT6K0YaZeZyPs=
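A Fernet key is just 32 random bytes, url-safe base64 encoded (44 characters ending in `=`). This stdlib-only sketch (my own, not part of Airflow) checks whether the value that actually reaches `FERNET_KEY` has that shape; an empty or truncated value is the usual trigger for this exact error:

```python
import base64

def is_valid_fernet_key(key: str) -> bool:
    """A Fernet key must be url-safe base64 that decodes to exactly 32 bytes."""
    try:
        return len(base64.urlsafe_b64decode(key.encode())) == 32
    except Exception:
        return False

print(is_valid_fernet_key("81HqDtbqAywKSOumSha3BhWNOdQ26slT6K0YaZeZyPs="))  # True
print(is_valid_fernet_key(""))  # False: an empty FERNET_KEY fails the same way
```

Printing `repr(os.environ.get("AIRFLOW__CORE__FERNET_KEY"))` inside the pod and running it through this check shows immediately whether the key survived the trip through the ConfigMap.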

With Kubernetes I cannot restart the server the typical way (killing the process ID), because it is tied to the container. I also tried placing the generated key (above) in the configmaps.yaml file for the Kubernetes cluster (which becomes airflow.cfg on deployment).

I tried making the GCP connection through a DAG, through the UI, and manually with the airflow command-line client. All three methods returned the same error. I have included a picture of the UI submission here, as well as the full stack trace.

  • Why is this happening? Is the fernet key not being generated? Could it be that it is not being saved on the underlying volume?

Thank you for your help.

-RR

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 159, in get_fernet
    _fernet = Fernet(configuration.conf.get('core', 'FERNET_KEY').encode('utf-8'))
  File "/usr/local/lib/python3.6/site-packages/cryptography/fernet.py", line 37, in __init__
    "Fernet key must be 32 url-safe base64-encoded bytes."
ValueError: Fernet key must be 32 url-safe base64-encoded bytes.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1982, in wsgi_app
    response = self.full_dispatch_request()
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1614, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1517, in handle_user_exception
    reraise(exc_type, exc_value, tb)
  File "/usr/local/lib/python3.6/site-packages/flask/_compat.py", line 33, in reraise
    raise value
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1612, in full_dispatch_request
    rv = self.dispatch_request()
  File "/usr/local/lib/python3.6/site-packages/flask/app.py", line 1598, in dispatch_request
    return self.view_functions[rule.endpoint](**req.view_args)
  File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/security/decorators.py", line 26, in wraps
    return f(self, *args, **kwargs)
  File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/views.py", line 524, in edit
    widgets = self._edit(pk)
  File "/usr/local/lib/python3.6/site-packages/flask_appbuilder/baseviews.py", line 965, in _edit
    form.populate_obj(item)
  File "/usr/local/lib/python3.6/site-packages/wtforms/form.py", line 96, in populate_obj
    field.populate_obj(obj, name)
  File "/usr/local/lib/python3.6/site-packages/wtforms/fields/core.py", line 330, in populate_obj
    setattr(obj, name, self.data)
  File "<string>", line 1, in __set__
  File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 731, in set_extra
    fernet = get_fernet()
  File "/usr/local/lib/python3.6/site-packages/airflow/models.py", line 163, in get_fernet
    raise AirflowException("Could not create Fernet object: {}".format(ve))
airflow.exceptions.AirflowException: Could not create Fernet object: Fernet key must be 32 url-safe base64-encoded bytes.

Here is the YAML for the underlying persistent volumes.

kind: PersistentVolumeClaim
apiVersion: v1
metadata:
  name: airflow-dags
  namespace: data
spec:
  accessModes:
    - ReadOnlyMany
  storageClassName: standard    
  resources:
    requests:
      storage: 8Gi
---
kind: PersistentVolumeClaim
apiVersion: v1
metadata:
  name: airflow-logs
  namespace: data
spec:
  accessModes:
    - ReadOnlyMany
  storageClassName: standard    
  resources:
    requests:
      storage: 8Gi

Here is the Airflow deployment YAML.

apiVersion: apps/v1
kind: Deployment
metadata:
  name: airflow
  namespace: data
  labels:
    name: airflow
spec:
  replicas: 1
  selector:
    matchLabels:
      name: airflow
  template:
    metadata:
      labels:
        name: airflow
    spec:
      serviceAccountName: spark-service-account
      automountServiceAccountToken: true
      initContainers:
      - name: "init"
        image: <image_name>
        imagePullPolicy: Always
        volumeMounts:
        - name: airflow-configmap
          mountPath: /root/airflow/airflow.cfg
          subPath: airflow.cfg
        - name: airflow-dags
          mountPath: /root/airflow/dags
        # - name: test-volume
        #   mountPath: /root/test_volume
        env:
        - name: SQL_ALCHEMY_CONN
          valueFrom:
            secretKeyRef:
              name: airflow-secrets
              key: sql_alchemy_conn
        command:
          - "bash"
        args:
          - "-cx"
          - "airflow initdb || true && airflow create_user -u airflow -l airflow -f jon -e airflow@apache.org -r Admin -p airflow || true"
      containers:
      - name: webserver
        image: <image_name>
        imagePullPolicy: IfNotPresent
        ports:
        - name: webserver
          containerPort: 8080
        env:
        - name: <namespace_name>
          valueFrom:
            fieldRef:
              fieldPath: metadata.namespace
        - name: SQL_ALCHEMY_CONN
          valueFrom:
            secretKeyRef:
              name: airflow-secrets
              key: sql_alchemy_conn
        command: ["/bin/sh", "-c"]
        args: ["airflow webserver"]
        volumeMounts:
        - name: airflow-configmap
          mountPath: /root/airflow/airflow.cfg
          subPath: airflow.cfg
        - name: airflow-dags
          mountPath: /root/airflow/dags
        - name: airflow-logs
          mountPath: /root/airflow/logs
        # readinessProbe:
        #   initialDelaySeconds: 5
        #   timeoutSeconds: 5
        #   periodSeconds: 5
        #   httpGet:
        #     path: /login
        #     port: 8080
        # livenessProbe:
        #   initialDelaySeconds: 5
        #   timeoutSeconds: 5
        #   failureThreshold: 5
        #   httpGet:
        #     path: /login
        #     port: 8080
      - name: scheduler
        image: image-name
        imagePullPolicy: IfNotPresent
        env:
        - name: namespace_name
          valueFrom:
            fieldRef:
              fieldPath: metadata.namespace
        - name: SQL_ALCHEMY_CONN
          valueFrom:
            secretKeyRef:
              name: airflow-secrets
              key: sql_alchemy_conn
        command: ["/bin/sh", "-c"]
        args: ["cp ./dags/* /root/airflow/dags/; airflow scheduler"]
        volumeMounts:
        - name: airflow-configmap
          mountPath: /root/airflow/airflow.cfg
          subPath: airflow.cfg
        - name: airflow-dags
          mountPath: /root/airflow/dags
        - name: airflow-logs
          mountPath: /root/airflow/logs
      volumes:
      - name: airflow-configmap
        configMap:
          name: airflow-configmap
      - name: airflow-dags
        persistentVolumeClaim:
          claimName: airflow-dags
      - name: airflow-logs
        persistentVolumeClaim:
          claimName: airflow-logs
---
apiVersion: v1
kind: Service
metadata:
  name: airflow
  namespace: data
spec:
  type: NodePort
  ports:
    - port: 8080
      nodePort: 30809
  selector:
    name: airflow
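One way to make the key survive pod restarts (a suggestion of mine, not something from the original post) is to keep it in the `airflow-secrets` Secret that the Deployment already references for `sql_alchemy_conn`, and expose it as `AIRFLOW__CORE__FERNET_KEY`, which Airflow reads in preference to the value in airflow.cfg. A sketch, with the example key standing in for a real one:

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: airflow-secrets
  namespace: data
type: Opaque
stringData:
  sql_alchemy_conn: <connection_string>   # existing key, unchanged
  fernet_key: 81HqDtbqAywKSOumSha3BhWNOdQ26slT6K0YaZeZyPs=   # example only
---
# Then, in the env list of each Airflow container (init, webserver, scheduler):
# - name: AIRFLOW__CORE__FERNET_KEY
#   valueFrom:
#     secretKeyRef:
#       name: airflow-secrets
#       key: fernet_key
```

This keeps the key out of the ConfigMap and guarantees every container sees the same value.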

[Screenshot: UI connection form submission]

Restart your workers and webserver.

Your workers and webserver are operating with the old fernet key. You changed the key in the config, so all newly stored or modified Connections will use the new key, but the webserver/workers are still using the old one. They will never match and will keep raising this error until they are restarted.
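Since the question notes that killing a process ID is not an option inside a container, the Kubernetes equivalent is to recreate the pod. Assuming the Deployment and namespace from the YAML above (`airflow` in `data`), either of these invocations forces a restart so the containers pick up the current key:

```shell
# Delete the pod; the Deployment controller recreates it immediately,
# restarting the webserver and scheduler with the current config:
kubectl delete pod -n data -l name=airflow

# On kubectl >= 1.15, the same thing can be done declaratively:
kubectl rollout restart deployment/airflow -n data
```

Note that a key stored only in a ConfigMap takes effect only after such a restart: files mounted with `subPath` (as airflow.cfg is here) are not updated in a running pod when the ConfigMap changes.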
