
How to set GCP credential in Eclipse to run Dataflow pipeline

I have a pipeline that was developed in Java using Eclipse.

After installing the Cloud SDK for Eclipse, I can run the pipeline locally (with the direct runner) using the Dataflow run configuration.


I would also like to run it locally using a Maven run configuration.

The problem when I try to execute it with Maven is a credential error when the pipeline tries to create a Pub/Sub subscription.
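For context, a direct-runner Maven invocation would look something like the sketch below; the main class, project ID, and arguments are placeholders, not values from the original post.

```shell
# Hypothetical Maven invocation that runs a Beam pipeline locally with
# the DirectRunner. Replace the main class and project with your own.
mvn compile exec:java \
  -Dexec.mainClass=com.example.MyPipeline \
  -Dexec.args="--runner=DirectRunner --project=my-gcp-project"
```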

Exception: org.apache.beam.sdk.Pipeline$PipelineExecutionException: java.lang.RuntimeException: Failed to create subscription to topic projects/xxx/topics/test on project projects/xxx: 403 Forbidden

{
  "code" : 403,
  "errors" : [ {
    "domain" : "global",
    "message" : "The request is missing a valid API key.",
    "reason" : "forbidden"
  } ],
  "message" : "The request is missing a valid API key.",
  "status" : "PERMISSION_DENIED"
}
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time:  7.154 s
[INFO] Finished at: 2019-12-12T11:19:36+01:00
[INFO] ------------------------------------------------------------------------

I have the gcloud CLI installed and configured. I have also set the environment variable GOOGLE_APPLICATION_CREDENTIALS.
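For anyone checking the same setup, exporting the variable looks like this; the key file path is a placeholder. Note that the variable must be set in the same shell session (or environment) from which mvn is launched, or the JVM will not see it.

```shell
# Point Application Default Credentials at a service-account key file
# (placeholder path -- substitute your own key's location).
export GOOGLE_APPLICATION_CREDENTIALS="/path/to/service-account-key.json"

# Confirm the variable is visible to child processes such as mvn.
echo "$GOOGLE_APPLICATION_CREDENTIALS"
```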


I also get the same error if I execute the mvn command from my command prompt (and also from the Cloud Shell).

You can run the command below in a shell and then try running the job again:

gcloud auth application-default login
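Assuming the login succeeds, you can check that Application Default Credentials now resolve before retrying the job:

```shell
# If ADC is set up correctly, this prints an access token; an error
# here means the credentials are still not being picked up.
gcloud auth application-default print-access-token
```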

Alternatively, see the guide on setting up Eclipse for GCP: https://cloud.google.com/dataflow/docs/quickstarts/quickstart-java-eclipse
