
Apache Beam Counter/Metrics not available in Flink WebUI

I'm using Flink 1.4.1 and Beam 2.3.0, and would like to know whether it is possible to have metrics available in the Flink WebUI (or anywhere at all), as in the Dataflow WebUI?

I've used a counter like:

import org.apache.beam.sdk.metrics.Counter;
import org.apache.beam.sdk.metrics.Metrics;
...
Counter elementsRead = Metrics.counter(getClass(), "elements_read");
...
elementsRead.inc();

but I can't find the "elements_read" counts anywhere (Task Metrics or Accumulators) in the Flink WebUI. I thought this would be straightforward after BEAM-773.

Once you have selected a job in your dashboard, you will see the DAG for that job, and below the DAG there is a list of tabs.

  1. Click on the "Task Metrics" tab
  2. Click on a box of your DAG
  3. Click on the "Add Metric" button to show that operator's metric

[Screenshot: Flink dashboard showing the metric]

If your pipeline is running in detached mode, querying metrics is not supported. See the FlinkDetachedRunnerResult source:

import java.io.IOException;

import org.apache.beam.sdk.PipelineResult;
import org.apache.beam.sdk.metrics.MetricResults;
import org.joda.time.Duration;

public class FlinkDetachedRunnerResult implements PipelineResult {

  FlinkDetachedRunnerResult() {}

  @Override
  public State getState() {
    return State.UNKNOWN;
  }

  @Override
  public MetricResults metrics() {
    throw new UnsupportedOperationException("The FlinkRunner does not currently support metrics.");
  }

  @Override
  public State cancel() throws IOException {
    throw new UnsupportedOperationException("Cancelling is not yet supported.");
  }

  @Override
  public State waitUntilFinish() {
    return State.UNKNOWN;
  }

  @Override
  public State waitUntilFinish(Duration duration) {
    return State.UNKNOWN;
  }

  @Override
  public String toString() {
    return "FlinkDetachedRunnerResult{}";
  }
}
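The behavior of the Java class above can be mirrored in a short, self-contained Python sketch (the classes below are hypothetical stand-ins for illustration, not the actual Beam SDK): a detached result simply raises when metrics are requested, so any metrics query has to be guarded, and the real fix is to run the pipeline attached and wait for it to finish before querying.

```python
class DetachedResult:
    """Mimics FlinkDetachedRunnerResult: metrics are unsupported in detached mode."""

    def metrics(self):
        raise NotImplementedError(
            "The FlinkRunner does not currently support metrics in detached mode.")


class AttachedResult:
    """Mimics an attached runner result with queryable counters."""

    def __init__(self, counters):
        self._counters = counters

    def metrics(self):
        return self._counters


def read_counter(result, name):
    # Guard the query so a detached run degrades gracefully instead of crashing.
    try:
        return result.metrics().get(name)
    except NotImplementedError:
        return None


print(read_counter(AttachedResult({'elements_read': 42}), 'elements_read'))  # 42
print(read_counter(DetachedResult(), 'elements_read'))  # None
```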

However, I was able to view the metrics using the slf4j reporter:

from apache_beam.metrics.metric import Metrics
from apache_beam.metrics.metric import MetricsFilter
from apache_beam.options.pipeline_options import PipelineOptions
import apache_beam as beam
import csv
import logging

GAME_DATA = [
'user1_team1,team1,18,1447686663000,2015-11-16 15:11:03.921',
'user1_team1,team1,18,1447690263000,2015-11-16 16:11:03.921',
'user2_team2,team2,2,1447690263000,2015-11-16 16:11:03.955',
'user3_team3,team3,8,1447690263000,2015-11-16 16:11:03.955',
'user4_team3,team3,5,1447690263000,2015-11-16 16:11:03.959',
'user1_team1,team1,14,1447697463000,2015-11-16 18:11:03.955',
'robot1_team1,team1,9000,1447697463000,2015-11-16 18:11:03.955',
'robot2_team2,team2,1,1447697463000,2015-11-16 20:11:03.955',
'robot2_team2,team2,9000,1447697463000,2015-11-16 21:11:03.955',
'robot1_team1,1000,2447697463000,2915-11-16 21:11:03.955',
'robot2_team2,9000,1447697463000,2015-11-16 21:11:03.955']

class ParseGameEventFn(beam.DoFn):
    def __init__(self):
        super(ParseGameEventFn, self).__init__()
        self.game_events = Metrics.counter(self.__class__, 'game_events')

    def process(self, element, *args, **kwargs):
        try:
            self.game_events.inc()
            row = list(csv.reader([element]))[0]
            if int(row[2]) < 5:
                return
            yield {
                'user': row[0],
                'team': row[1],
                'score': int(row[2]),
                'timestamp': int(row[3]) / 1000.0,
            }
        except Exception as ex:
            logging.error('Parse error on {}: {}'.format(element, ex))

pipeline_options = PipelineOptions()

with beam.Pipeline(options=pipeline_options) as pipeline:
    results = (
        pipeline
        | "Create" >> beam.Create(GAME_DATA)
        | "Parsing" >> beam.ParDo(ParseGameEventFn())
        | "AddEventTimestamps" >> beam.Map(
             lambda elem: beam.window.TimestampedValue(elem, elem['timestamp']))
        | "Print" >> beam.Map(print))

# After the with-block exits, the pipeline has run and its result is available.
metric_results = pipeline.result.metrics().query(MetricsFilter().with_name('game_events'))
outputs_user_counter = metric_results['counters'][0]
print(outputs_user_counter.committed)

Flink configuration for Prometheus in conf/flink-conf.yaml:

metrics.reporters: prom
metrics.reporter.prom.class: org.apache.flink.metrics.prometheus.PrometheusReporter
metrics.reporter.prom.port: 9250-9260
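With that reporter enabled, each TaskManager serves its metrics in Prometheus exposition format on a port from the configured range. As a minimal sketch of pulling a Beam counter out of such a scrape (the metric name below is an assumption for illustration; real Flink metric names carry job/task/operator scopes and labels):

```python
def parse_prometheus_counter(scrape_text, metric_name):
    """Extract a counter value from Prometheus exposition-format text."""
    for line in scrape_text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue  # skip blanks and HELP/TYPE comment lines
        name_and_labels, _, value = line.rpartition(' ')
        base_name = name_and_labels.split('{', 1)[0]
        if base_name == metric_name:
            return float(value)
    return None


# Hypothetical scrape excerpt; actual names depend on Flink's metric scope config.
sample = """
# TYPE flink_taskmanager_job_task_operator_elements_read counter
flink_taskmanager_job_task_operator_elements_read{job_name="beam_job"} 42.0
"""
print(parse_prometheus_counter(
    sample, 'flink_taskmanager_job_task_operator_elements_read'))  # 42.0
```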

I can see the metrics in the Accumulators tab but not in the Metrics tab. I'm using Flink version 1.12.0 with the latest Apache Beam master branch code.
