
How to run a GCP Cloud Function written in Golang to run a Dataflow job to import a text file to Spanner?

I have used the example in https://github.com/apache/beam/blob/master/sdks/go/examples/wordcount/wordcount.go#L82, as well as the advice from Google Cloud Support, to run a Dataflow import job with the following:

    flag.Parse()
    flag.Set("runner", "dataflow")
    flag.Set("project", "xxxx")
    flag.Set("region", "us-central1")
    flag.Set("staging_location", "gs://xxx/temp")
    flag.Set("job_name", "import-delivery_schedule")
    beam.Init()
    p := beam.NewPipeline()
    p.Root()
    if err := beamx.Run(context.Background(), p); err != nil {
        log.Fatalf("Failed to execute job: %v", err)
    }

The Cloud Function is in the same project as the database. The Dataflow import job is in the same project as the database. The import job runs successfully from the console.

However, I'm unable to get this to work.

I'm getting this error: "Function execution took 18 ms, finished with status: 'connection error'"

I'm open to other ways to import a text file into a Spanner table. What would you suggest?

If the import Dataflow job is being created and runs successfully, then there seems to be no problem with the GCP Cloud Function itself.

Ensure the Dataflow workers have sufficient permissions to access the database: https://cloud.google.com/spanner/docs/import#iam. These permissions are required for a Dataflow job to access and write to Spanner (or a subset of them, depending on the type of modifications the import performs). Grant these roles to the service account that the Dataflow workers run as: https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#worker-service-account

