Why are logs not visible for a Dataflow job pipeline written with the Java SDK on GCP?
I have enabled the Logging API with the following command:
gcloud services enable logging
I followed the steps at https://cloud.google.com/dataflow/docs/quickstarts/create-pipeline-java to create the pipeline.
I am using the SLF4J logger library for logging in the jobs. Still, no logs are visible in the GCP Dataflow console.
Creating a new sink in the Logs Router with an inclusion filter of resource.type="dataflow_step" solved the issue, as Dataflow logs were excluded from the default sink present in the Log Router under Logging.
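Such a sink can also be created from the command line; a sketch with gcloud, where the sink name and destination bucket are placeholders:

```shell
# Create a Logs Router sink that routes Dataflow step logs to a
# Cloud Storage bucket (sink name and bucket are placeholders).
gcloud logging sinks create dataflow-logs-sink \
  storage.googleapis.com/my-dataflow-logs-bucket \
  --log-filter='resource.type="dataflow_step"'
```

After creating the sink, remember to grant its writer identity access to the destination, as gcloud will remind you in its output.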
Are you using a service account? Besides enabling the Logging service, you may not have permission to write the logs.
If that is the case, please double-check that you are following all the instructions from Security and permissions for pipelines on Google Cloud.
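If the worker service account lacks log-write permission, granting it the roles/logging.logWriter role is one way to fix that; a sketch where the project ID and service-account email are placeholders:

```shell
# Grant the Logs Writer role to the Dataflow worker service account
# (project ID and service-account email are placeholders).
gcloud projects add-iam-policy-binding my-project \
  --member="serviceAccount:my-worker-sa@my-project.iam.gserviceaccount.com" \
  --role="roles/logging.logWriter"
```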