
PERMISSION_DENIED for BigQuery Storage API on Apache Beam 2.39.0 and DataFlow runner

I have the following error for one of my DataFlow Jobs:

    2022-06-15T16:12:27.365182607Z Error message from worker: java.lang.RuntimeException: org.apache.beam.sdk.util.UserCodeException: java.lang.RuntimeException: java.lang.RuntimeException: java.lang.RuntimeException: com.google.api.gax.rpc.PermissionDeniedException: io.grpc.StatusRuntimeException: PERMISSION_DENIED: BigQuery Storage API has not been used in project 770406736630 before or it is disabled. Enable it by visiting https://console.developers.google.com/apis/api/bigquerystorage.googleapis.com/overview?project=770406736630 then retry. If you enabled this API recently, wait a few minutes for the action to propagate to our systems and retry.

The same code works fine with Apache Beam 2.38.0. I tested multiple times and this is not a temporary issue. The project number mentioned in the error (770406736630) is not mine.

Any idea why I get this error?

I had the same issue. I'm using Spring Cloud GCP and hadn't set the spring.cloud.gcp.project-id property, which I'm guessing makes the SDK or API use some default value.

I don't know how you've set up your environment, because you haven't specified, but look into how you can explicitly set the project ID. You can get it from the dialog for selecting a project in the GCP Console.
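For a Dataflow job, one common place to set it explicitly is in the pipeline options passed at launch. A rough sketch, assuming a Maven-based Beam Java project; `com.example.MyPipeline`, `my-project-id`, and the region are placeholders:

```shell
# Launch the Dataflow job with the project set explicitly rather than
# relying on whatever default the environment resolves.
mvn compile exec:java \
  -Dexec.mainClass=com.example.MyPipeline \
  -Dexec.args="--runner=DataflowRunner \
               --project=my-project-id \
               --region=us-central1"
```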

I just ran into this, and simply needed to re-authenticate with the gcloud CLI by running gcloud auth application-default login.
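For reference, the full re-auth sequence might look like the following; the quota-project step is optional and `my-project-id` is a placeholder:

```shell
# Refresh the application-default credentials picked up by GCP client libraries
gcloud auth application-default login

# Optionally pin the quota project that API calls are attributed to, so
# requests are not counted against a default project that isn't yours
gcloud auth application-default set-quota-project my-project-id
```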

The error happens with the latest Apache Beam SDK (2.41.0) when BigQueryIO.Write.Method.STORAGE_WRITE_API is used and the destination does not specify the project name, for example dataset.table instead of project-id:dataset.table.

This is the solution that worked for me:

    BigQueryIO.writeTableRows()
        .to("project-id:dataset.table")
        .withMethod(BigQueryIO.Write.Method.STORAGE_WRITE_API)

For some reason the Apache Beam implementation of the BigQuery Storage Write API does not handle this situation, even though it works fine with the FILE_LOADS method.
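If you want to fail fast locally, you can sanity-check the destination spec yourself before building the pipeline. This is a rough sketch, not Beam's own validation; the regex is an assumption based on the `project:dataset.table` and `project.dataset.table` formats:

```java
public class TableSpecCheck {
    // Returns true when the spec contains a project segment, i.e. it has the
    // form "project:dataset.table" or "project.dataset.table". A bare
    // "dataset.table" (which triggers the PERMISSION_DENIED above under
    // STORAGE_WRITE_API) is reported as missing the project.
    static boolean hasProject(String spec) {
        return spec.matches("[\\w-]+(\\.[\\w-]+)*[:.][\\w$]+\\.[\\w$]+");
    }

    public static void main(String[] args) {
        System.out.println(hasProject("dataset.table"));            // prints false
        System.out.println(hasProject("project-id:dataset.table")); // prints true
    }
}
```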

You may also receive a slightly different error with the latest Beam SDK:

Exception in thread "main" org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.lang.RuntimeException: 
java.lang.RuntimeException: 
java.lang.RuntimeException: com.google.api.gax.rpc.PermissionDeniedException:
io.grpc.StatusRuntimeException: 
PERMISSION_DENIED: Permission denied: Consumer 'project:null' has been suspended.
