
GCP Pub/Sub - How to retrieve state from BQ scheduled query

I have a BigQuery scheduled query that triggers a Cloud Function via Pub/Sub.

I want the function to read the "state" value from the Pub/Sub message so I can see whether the query completed successfully.

The code below always triggers the else branch. If the if statement is removed, it raises a KeyError.

import base64

def hello_pubsub(event, context):
    data = base64.b64decode(event['data']).decode('utf-8')

    if 'state' in data:
        state = data['state']
        print("returned state: " + state)
    else:
        print ("No state attribute found")

Here is the Pub/Sub message the function should receive:

{
  "data": {
    "dataSourceId": "scheduled_query",
    "destinationDatasetId": "xxxxxxxxxx",
    "emailPreferences": {},
    "endTime": "2020-03-12T20:40:13.627285Z",
    "errorStatus": {},
    "name": "xxxxxxxxxx",
    "notificationPubsubTopic": "projects/xxxxxxxxxx/topics/xxxxxxxxxx",
    "params": {
      "destination_table_name_template": "xxxxxxxxxx",
      "query": "xxxxxxxxxx",
      "write_disposition": "WRITE_TRUNCATE"
    },
    "runTime": "2020-03-05T10:00:00Z",
    "scheduleTime": "2020-03-12T20:37:13.17166Z",
    "startTime": "2020-03-12T20:37:13.328479Z",
    "state": "SUCCEEDED",
    "updateTime": "2020-03-12T20:40:13.627307Z",
    "userId": "xxxxxxxxxx"
  }
}

I have figured it out.

data = base64.b64decode(event['data']).decode('utf-8')

This returns a JSON-formatted string, not a dictionary. You need to convert it to a dict with:

data_dict = json.loads(data)

so that it can be accessed like a dictionary.
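Putting it together, a minimal sketch of the corrected function could look like this (assuming the decoded payload is the transfer-run JSON shown above, with "state" at the top level):

import base64
import json

def hello_pubsub(event, context):
    # The Pub/Sub payload arrives base64-encoded; decode it to a JSON string.
    data = base64.b64decode(event['data']).decode('utf-8')

    # Parse the JSON string into a dictionary so fields can be accessed by key.
    data_dict = json.loads(data)

    if 'state' in data_dict:
        print("returned state: " + data_dict['state'])
    else:
        print("No state attribute found")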

You can have a look at the Python client library here.

The documentation is also available.

Finally, you can check the additional fields added to the JSON notification message.
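For example, the same parsed dictionary can be used to inspect other fields from the notification. A small sketch, with field names taken from the sample message above:

# Inspect other fields of the parsed transfer-run notification.
run_time = data_dict.get('runTime')              # scheduled run time
error_status = data_dict.get('errorStatus', {})  # empty dict on success

if data_dict.get('state') != 'SUCCEEDED':
    print("Run failed: " + str(error_status))
else:
    print("Run at " + str(run_time) + " succeeded")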

