I am trying to pass secret variables to my KubernetesPodOperator in Airflow.
Here is what I have done:
I created a `secret.yaml` file that looks like the following:

```yaml
apiVersion: v1
kind: Secret
metadata:
  name: my-secret
type: Opaque
data:
  SECRET_1: blabla
  SECRET_2: blibli
```

Then I applied it:

```shell
kubectl apply -f ./secret.yaml
```
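Note that values under `data:` in a Kubernetes Secret manifest must be base64-encoded (plain strings can go under `stringData:` instead); the values above are placeholders. Encoding a value looks like:

```shell
# Kubernetes expects base64-encoded values under `data:`;
# `echo -n` avoids encoding a trailing newline into the secret.
echo -n 'blabla' | base64
# -> YmxhYmxh
```
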
In my DAG file:

```python
from airflow.contrib.kubernetes.secret import Secret
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator
from airflow.models import DAG

SECRET_1 = Secret(
    deploy_type="env", deploy_target="SECRET_1", secret="ai-controller-object-storage", key="SECRET_1"
)
SECRET_2 = Secret(
    deploy_type="env", deploy_target="SECRET_2", secret="ai-controller-object-storage", key="SECRET_2"
)

with DAG(...) as dag:
    KubernetesPodOperator(
        task_id=..,
        trigger_rule="all_success",
        namespace="default",
        image=IMAGE,
        startup_timeout_seconds=600,
        secrets=[SECRET_1, SECRET_2],
        ...)
```
So now, as I understand it, `SECRET_1` should be available as an environment variable inside the container started by the KubernetesPodOperator. However, my first task, a Python script that reads `os.environ["SECRET_1"]`, fails with an error indicating that this environment variable does not exist:

```
KeyError: 'SECRET_1'
```
How can I access this variable from my Python script?
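As an aside, `os.environ[...]` raises `KeyError` for a missing variable; a guarded lookup makes it easier to see whether the secret was injected at all (the messages here are only illustrative):

```python
import os

# os.environ["SECRET_1"] raises KeyError when the variable is absent.
# os.environ.get returns None (or a supplied default) instead, which is
# handy while debugging whether the secret reached the container.
secret_1 = os.environ.get("SECRET_1")
if secret_1 is None:
    print("SECRET_1 is not set in this container")
else:
    print("SECRET_1 is set")
```
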