I'd really like to post from BigQuery SQL routines into Pub/Sub -- for example, to post a "Completed" message after a routine finishes, push out debug info, etc. Unfortunately, all the information I can find is about pushing data from Pub/Sub into BigQuery tables, which is the opposite of what I need.
I know there aren't any SQL commands to do that, but it seems to me that if you can write a JavaScript Cloud Function that posts to Pub/Sub (which you obviously can), then shouldn't you be able to write a persistent JS UDF that does the same thing?
(Unfortunately, I have decent Python skills, but no JavaScript...if BigQuery allowed Python UDFs, I could probably figure this out on my own.) TIA for any help.
You can use Eventarc triggers for this.
When a BigQuery job completes, a new entry is added into Cloud logging within your GCP project and this can be used to trigger a Cloud Run service.
The best approach is to refer to this article, which I find really good, and adapt it to your needs.
And you can use Python!
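To make the flow concrete, here is a minimal Python sketch of the Cloud Run side: Eventarc delivers the Cloud Audit Log entry written when a BigQuery job completes, the handler extracts the job ID, and then republishes a "Completed" message to Pub/Sub. The project and topic names are placeholders, and the payload shape assumes the `jobCompletedEvent` audit-log format — verify the exact field paths against the events your project actually emits.

```python
import json


def extract_job_id(audit_log_payload: dict) -> str:
    """Pull the BigQuery job ID out of a Cloud Audit Log payload.

    Assumes the legacy AuditData `jobCompletedEvent` shape; returns ""
    if any level of the path is missing.
    """
    return (
        audit_log_payload.get("protoPayload", {})
        .get("serviceData", {})
        .get("jobCompletedEvent", {})
        .get("job", {})
        .get("jobName", {})
        .get("jobId", "")
    )


def handle_event(event_payload: dict) -> None:
    """Called by the Cloud Run service for each Eventarc delivery."""
    job_id = extract_job_id(event_payload)

    # Publishing needs google-cloud-pubsub; imported lazily so the
    # parsing helper above stays testable without GCP credentials.
    from google.cloud import pubsub_v1

    publisher = pubsub_v1.PublisherClient()
    # "my-project" and "bq-job-events" are assumed names, not real ones.
    topic_path = publisher.topic_path("my-project", "bq-job-events")
    message = json.dumps({"job_id": job_id, "status": "Completed"})
    publisher.publish(topic_path, message.encode("utf-8"))
```

You would wrap `handle_event` in whatever HTTP framework your Cloud Run service uses (e.g. Flask or the Functions Framework), decoding the request body into a dict first.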
In addition to Eventarc, you have 2 other solutions: