
How to create a connection in Airflow for an external provider type (like google-cloud-platform) with the Airflow REST API

I'm trying to automate the creation of a connection in Airflow from a GitHub Action, but since it is an external provider, the payload I send to the Airflow REST API doesn't work, and I couldn't find any documentation on how to build it.

So here is the PAYLOAD I'm trying to send:

PAYLOAD = {
    "connection_id": CONNECTOR,
    "conn_type": "google_cloud_platform",
    "extra": json.dumps({
        "google_cloud_platform": {
            "keyfile_dict" : open(CONNECTOR_SERVICE_ACCOUNT_FILE, "r").read(),
            "num_retries" : 2,
        }
    })
}

This is based on the Airflow REST API documentation and on the fields shown on the "create connection" page of the Airflow UI.

The request returns no error (HTTP 200) and the connection is created, but it doesn't have the settings I tried to configure.

I can confirm that creating the connection works through the UI.

Does anyone have a solution, or documentation that describes the exact payload I need to send to the Airflow REST API? Or maybe I'm missing something.

  • Airflow version: 2.2.3+composer
  • Cloud Composer version (GCP): 2.0.3
  • Github runner version: 2.288.1
  • Language: Python

Thanks guys and feel free to contact me for further questions.

Bye

@vdolez was right, it's kind of a pain to format the payload in the exact shape the Airflow REST API wants. It's something like this:

"{\"extra__google_cloud_platform__key_path\": \"\", 
\"extra__google_cloud_platform__key_secret_name\": \"\", 
\"extra__google_cloud_platform__keyfile_dict\": \"{}\", 
\"extra__google_cloud_platform__num_retries\": 5, 
\"extra__google_cloud_platform__project\": \"\", 
\"extra__google_cloud_platform__scope\": \"\"}"

And when you need to nest a dictionary inside some of these fields, it's not worth the time and effort. But in case someone wants to know: you have to escape every special character.
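Rather than escaping every character by hand, the double-encoded `extra` string can be produced with `json.dumps`. Here is a minimal sketch, assuming Airflow 2.2's flat `extra__google_cloud_platform__*` field names; the connection id and keyfile contents are hypothetical placeholders:

```python
import json

# Hypothetical service-account contents; in practice, read from the keyfile.
keyfile_dict = {"type": "service_account", "project_id": "my-project"}

# "extra" must itself be a JSON string, and keyfile_dict must be a JSON
# string *inside* it, so the keyfile is encoded twice -- json.dumps handles
# the quote escaping that would otherwise be done by hand.
extra = json.dumps({
    "extra__google_cloud_platform__keyfile_dict": json.dumps(keyfile_dict),
    "extra__google_cloud_platform__num_retries": 2,
})

payload = {
    "connection_id": "my_gcp_connection",  # hypothetical id
    "conn_type": "google_cloud_platform",
    "extra": extra,
}
print(payload["extra"])
```

The resulting payload could then be POSTed to the same REST API endpoint as in the original attempt.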

I changed my workflow to notify the appropriate users to create the connection manually after my pipeline succeeds.

I will try to contact Airflow/Cloud Composer support to see if we can get a feature for better formatting.

You might be running into encoding/decoding issues while sending data over the web.

Since you're using Composer, it might be a good idea to use the Composer CLI to create a connection.

Here's how to run airflow commands in Composer:

gcloud composer environments run ENVIRONMENT_NAME \
    --location LOCATION \
    SUBCOMMAND \
    -- SUBCOMMAND_ARGUMENTS

Here's how to create a connection with the native Airflow commands:

airflow connections add 'my_prod_db' \
    --conn-type 'my-conn-type' \
    --conn-login 'login' \
    --conn-password 'password' \
    --conn-host 'host' \
    --conn-port 'port' \
    --conn-schema 'schema' \
    ...

Combining the two, you'll get something like:

gcloud composer environments run ENVIRONMENT_NAME \
    --location LOCATION \
    connections \
    -- add 'my_prod_db' \
    --conn-type 'my-conn-type' \
    --conn-login 'login' \
    --conn-password 'password' \
    --conn-host 'host' \
    --conn-port 'port' \
    --conn-schema 'schema' \
    ...
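For a CI step such as a GitHub Action, the combined command could be assembled programmatically rather than hard-coded. A small sketch, with hypothetical environment, location, and connection values:

```python
import shlex

# Hypothetical values -- substitute your own environment and connection.
environment = "my-composer-env"
location = "us-central1"
connection_id = "my_gcp_connection"

# Build the gcloud invocation as an argument list, which can be passed
# directly to subprocess.run without shell quoting issues.
cmd = [
    "gcloud", "composer", "environments", "run", environment,
    "--location", location,
    "connections",
    "--", "add", connection_id,
    "--conn-type", "google_cloud_platform",
]
print(shlex.join(cmd))
```

Further `--conn-*` flags can be appended to the list the same way.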

You could run this in a Docker image where gcloud is already installed.
