
How best to handle async response from google long running operation with cloud functions

I'm using Google Cloud Functions (python) to initiate an asset inventory export from GCP by calling the exportAssets() method here. The method returns an Operations object defined here which can be used to poll the operation until it is complete. Of course, since this is a cloud function I'm limited to 540 seconds, so I cannot do that forever. The google api python client offers the add_done_callback() method where one can await an async response, but as far as I can tell it requires me to keep a thread alive within the cloud function. Is there a way to tell the Asset Inventory API executing the operation to send the async response (success or failure) to a pubsub topic where I can properly handle the response? I'm trying to avoid spinning up an appengine instance with basic_scaling to support 24 hour timeouts.

    from google.cloud import asset_v1
    # .....
    # Setup request to asset inventory API
    parent = "organizations/{}".format(GCP_ORGANIZATION)
    requested_type = 'RESOURCE'
    content_type = asset_v1.ContentType.RESOURCE
    dataset = 'projects/{}/datasets/gcp_assets_{}'.format(GCP_PROJECT, requested_type)
    partition_key = asset_v1.PartitionSpec.PartitionKey.REQUEST_TIME

    output_config = asset_v1.OutputConfig()
    output_config.bigquery_destination.dataset = dataset
    output_config.bigquery_destination.table = 'assets'
    output_config.bigquery_destination.separate_tables_per_asset_type = True
    output_config.bigquery_destination.partition_spec.partition_key = partition_key

    # Make API request to asset inventory API
    print("Creating job to load 'asset types: {}' to {}".format(
        requested_type,
        dataset
    ))
    response = ASSET_CLIENT.export_assets(
        request={
            "parent": parent,
            "content_type": content_type,
            "output_config": output_config,
        }
    )
    print(response.result())  # This blocks until the job completes
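For reference, the add_done_callback() pattern mentioned in the question can be illustrated with a plain `concurrent.futures.Future`, which exposes the same callback interface as the operation object returned by export_assets(). This is only a sketch: the submitted lambda stands in for the real export, and the point it demonstrates is that the process must stay alive until the callback fires.

```python
import threading
from concurrent.futures import ThreadPoolExecutor

done = threading.Event()
results = []

def on_done(future):
    # May run on a background thread once the future resolves; if the Cloud
    # Function instance returns before this fires, the callback never executes.
    results.append(future.result())
    done.set()

with ThreadPoolExecutor(max_workers=1) as pool:
    future = pool.submit(lambda: "export-complete")  # stand-in for export_assets(...)
    future.add_done_callback(on_done)
    done.wait(timeout=5)  # keep the main thread alive for the callback

print(results)  # ['export-complete']
```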

Cloud Asset inventory export doesn't offer a PubSub notification at the end of the export. However, in my previous company, it took about 5 minutes to export 100k+ assets; it's not so bad, even if you have more assets. I'm sure you can contact Google Cloud (use your Customer Engineer) to get this notification added to the roadmap.


Anyway, if you want to build a workaround, you can use workflows.

  • Use a Cloud Function to trigger your workflow
  • In your workflow,
    • Call the Cloud Asset API to export data to BigQuery
    • Get the response and perform a loop (test the export job status; if not OK, sleep X seconds and test again)
    • When the job is over, call the PubSub API (or directly a Cloud Function) to submit the job status and process it.
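The poll-and-sleep loop in the second step can be sketched in Python (Workflows itself is written in YAML, but the logic is the same). `poll_until_done` and `fake_status` are hypothetical names used here for illustration; in the real workflow the status check would be an `operations.get` call on the export operation, and the final step would publish the status to Pub/Sub.

```python
import time

def poll_until_done(get_status, interval_s=1.0, timeout_s=60.0):
    """Poll get_status() until it reports done, sleeping between attempts.

    get_status should return a dict shaped like the google.longrunning
    Operation resource ({"done": bool, ...}). Raises TimeoutError if the
    deadline passes first.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = get_status()
        if status.get("done"):
            return status
        time.sleep(interval_s)
    raise TimeoutError("operation did not finish in time")

# Simulated operation that completes on the third poll.
calls = {"n": 0}
def fake_status():
    calls["n"] += 1
    return {"done": calls["n"] >= 3}

result = poll_until_done(fake_status, interval_s=0.01, timeout_s=5)
print(result["done"])  # True
```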

