
Google dataflow job with Apache Beam 2.9.0 Java SDK stuck

I'm using Beam Java SDK 2.9.0. My job reads from Kafka in its first step and works just fine on the Direct runner. When I deploy it on Dataflow, however, the job is stuck and I don't see any progress. The Dataflow monitoring UI shows:

Output collections: EventKafkaReadTransform/Values/Values/Map.out0
Elements added: –
Estimated size: –
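
For context, the Kafka read step looks roughly like the following (a minimal sketch: the broker address, topic name, and String deserializers are placeholders, not my exact configuration):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Values;
    import org.apache.kafka.common.serialization.StringDeserializer;

    public class EventIngestJob {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

        p.apply("EventKafkaReadTransform",
                KafkaIO.<String, String>read()
                    .withBootstrapServers("kafka-broker:9092")     // placeholder
                    .withTopic("events")                           // placeholder
                    .withKeyDeserializer(StringDeserializer.class)
                    .withValueDeserializer(StringDeserializer.class)
                    .withoutMetadata())                            // -> PCollection<KV<String, String>>
            .apply(Values.<String>create());                       // the Values step shown in the UI

        p.run();
      }
    }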

The Stackdriver logs seem to be looping with the message below:

Error syncing pod 75bf4f18ce7d4d30a2b7de627656b517 ("dataflow-eventingestjob-xxx-0-02062225-wxsc-harness-r3kq_default(75bf4f18ce7d4d30a2b7de627656b517)"), skipping: failed to "StartContainer" for "java-streaming" with CrashLoopBackOff: "Back-off 5m0s restarting failed container=java-streaming pod=dataflow-eventingestjob-xxx-0-02062225-wxsc-harness-r3kq_default(75bf4f18ce7d4d30a2b7de627656b517)

I cannot figure out what else to look for.

Any help is appreciated.

We had something similar and found that it was an inability to start the workers (for us it was due to an slf4j issue, but it could be anything).

If you look at the Stackdriver logs (view Logs in the UI, then click the link to go to Stackdriver), you should be able to view the worker_startup logs.
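
One quick sanity check for that kind of startup failure is to print which SLF4J binding your bundled jar actually resolves, run against the same classpath you ship to Dataflow (a sketch, assuming the slf4j scenario; the class name here is hypothetical):

    import org.slf4j.LoggerFactory;

    public class Slf4jBindingCheck {
      public static void main(String[] args) {
        // With exactly one binding on the classpath this prints that binding's
        // logger factory class; with zero or several bindings, SLF4J itself
        // prints a warning to stderr before this line runs.
        System.out.println(LoggerFactory.getILoggerFactory().getClass().getName());
      }
    }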
