
How to get Airflow db credentials from Google Cloud Composer

I currently need the Airflow database connection credentials for my Airflow instance in Cloud Composer.
All I see on the Airflow connections UI is airflow_db mysql airflow-sqlproxy-service.

I would like to connect to it via DataGrip.
Another thing: if I want to override the [core] sql_alchemy_conn setting, how do I do that? It shows as restricted when I try to add it as an environment variable on my Cloud Composer environment.

Cloud Composer isn't designed to give external access to the database. However, you can connect to the GKE cluster and then access the database from within the cluster. This doc shows how to do that with SQLAlchemy, but you can also get direct MySQL CLI access by running mysql -h $AIRFLOW_SQLPROXY_SERVICE_SERVICE_HOST -u root airflow-db instead of SQLAlchemy in step 6.
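
For reference, a minimal sketch of that route, assuming placeholder names (your Composer environment, its location, the backing GKE cluster, and a worker pod name) that you substitute with your own values:

# Find the GKE cluster that backs the Composer environment
gcloud composer environments describe MY_ENVIRONMENT --location MY_LOCATION --format="value(config.gkeCluster)"

# Point kubectl at that cluster
gcloud container clusters get-credentials MY_CLUSTER --zone MY_ZONE

# Pick any airflow-worker pod and open a shell in it
kubectl get pods
kubectl exec -it airflow-worker-xxxx -- /bin/bash

# From inside the pod, connect to the metadata database
mysql -h $AIRFLOW_SQLPROXY_SERVICE_SERVICE_HOST -u root airflow-db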

Adding to David's answer:

After following the directions to connect to a GKE cluster worker, the $AIRFLOW_SQLPROXY_SERVICE_SERVICE_HOST environment variable did not exist for me. Instead, I parsed the connection details from the SQLAlchemy connection string environment variable that Composer sets:

echo $AIRFLOW__CORE__SQL_ALCHEMY_CONN

This should return a connection string of the form mysql+mysqlconnector://<user>:<password>@<host>[:<port>]/<dbname>.

Parsing the host, user, and dbname from this, I'm able to connect with:

mysql -h <host> -u <user> <dbname>
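
If it helps, here is a minimal sketch of that parsing step in the worker's shell, assuming the string has exactly the mysql+mysqlconnector://<user>:<password>@<host>/<dbname> shape shown above (no port, no @ in the password); adjust the expressions otherwise:

CONN="$AIRFLOW__CORE__SQL_ALCHEMY_CONN"
DB_USER=$(echo "$CONN" | sed -E 's|^[^:]+://([^:]+):.*|\1|')        # user
DB_PASS=$(echo "$CONN" | sed -E 's|^[^:]+://[^:]+:([^@]+)@.*|\1|')  # password
DB_HOST=$(echo "$CONN" | sed -E 's|^.*@([^:/]+).*|\1|')             # host
DB_NAME=$(echo "$CONN" | sed -E 's|^.*/([^/?]+)$|\1|')              # database name
mysql -h "$DB_HOST" -u "$DB_USER" -p"$DB_PASS" "$DB_NAME"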
