How to access Cloud Composer system files
I am working on a migration from an on-premise system to Cloud Composer. The problem is that Cloud Composer is a fully managed version of Airflow, which restricts access to the underlying file system. On my on-premise system I have many environment variables that store paths like:
/opt/application/folder_1/subfolder_2/...
Looking at the Cloud Composer documentation, it says you can access and save your data in the data folder, which is mapped to /home/airflow/gcs/data/. This implies that if I go with that mapping, I would have to change my environment variable values to something like /home/airflow/gcs/data/application/folder_1/folder_2, which could be a bit painful, given that I run many bash scripts that rely on those values.

Is there any approach to solve this problem?
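For illustration, a minimal sketch of the kind of indirection that would keep such scripts portable (APP_ROOT is a hypothetical variable name, not something from my current setup): every path is derived from a single configurable root, so only that one value has to change between environments:

```shell
#!/usr/bin/env bash
# Hypothetical sketch: derive every path from one root variable so that
# migrating from /opt/application to /home/airflow/gcs/data/application
# only requires changing APP_ROOT, not every script.
APP_ROOT="${APP_ROOT:-/opt/application}"

SUBFOLDER_DIR="$APP_ROOT/folder_1/subfolder_2"
echo "$SUBFOLDER_DIR"
```

Running it with APP_ROOT unset falls back to the on-premise root; exporting APP_ROOT=/home/airflow/gcs/data/application before running switches all derived paths at once.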
You can specify your environment variables during the Composer creation/update process [1]. These variables are then stored in the YAML files that create the GKE cluster where Composer is hosted. If you SSH into a VM running the Composer GKE cluster, then enter one of the worker containers and run env, you can see the environment variables you specified.
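As a sketch, setting such a variable from the gcloud CLI might look like this (the environment name, location, and the APP_ROOT variable are placeholders; check gcloud composer environments --help for the flags available in your SDK version):

```shell
# Placeholder environment name, region, and variable; adjust to your setup.
# Set environment variables at creation time:
gcloud composer environments create my-composer-env \
    --location us-central1 \
    --env-variables APP_ROOT=/home/airflow/gcs/data/application

# Or add/update variables on an existing environment:
gcloud composer environments update my-composer-env \
    --location us-central1 \
    --update-env-variables APP_ROOT=/home/airflow/gcs/data/application
```

Note that updating environment variables triggers a rebuild of the Airflow workers, so the change takes some minutes to propagate.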
[1] https://cloud.google.com/composer/docs/how-to/managing/environment-variables