

Sending Metrics: Spark to Graphite

We managed to get Spark (2.x) to send metrics to Graphite by changing the metrics.properties file as below:

# Enable Graphite
*.sink.graphite.class=org.apache.spark.metrics.sink.GraphiteSink
*.sink.graphite.host=graphite-host
*.sink.graphite.port=2003
*.sink.graphite.period=5
*.sink.graphite.prefix=my-app
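For context, GraphiteSink writes metrics over TCP using Graphite's plaintext protocol: one `path value timestamp` line per reading. A minimal sketch of that line format (the `graphiteLine` helper and the sample application id are illustrative, not part of Spark's API):

```scala
// GraphiteSink speaks Graphite's plaintext protocol: each metric reading is
// sent as a single "<path> <value> <unix-timestamp>" line.
// `graphiteLine` is an illustrative helper, not a Spark function.
def graphiteLine(prefix: String, metric: String, value: Long, epochSeconds: Long): String =
  s"$prefix.$metric $value $epochSeconds"

// With the properties above, every path starts with the configured prefix
// ("my-app"), followed by the application id and the metric name.
val line = graphiteLine(
  "my-app.app-20200101000000-0001.driver",
  "BlockManager.memory.memUsed_MB",
  42L,
  1700000000L)
// "my-app.app-20200101000000-0001.driver.BlockManager.memory.memUsed_MB 42 1700000000"
```

Tuning `*.sink.graphite.prefix` is how you keep metrics from different deployments under separate subtrees in Graphite.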

However, I noticed that we are getting only a subset of the metrics in Graphite compared to what we see in the monitoring web UI (http://localhost:4040). Is there any setting to get all the metrics (including accumulators) into Graphite?

I use this library to sink user-defined metrics from user code into Graphite: spark-metrics

Initialise the metrics system on the driver side:

UserMetricsSystem.initialize(sc, "test_metric_namespace")

Then use Counter, Gauge, Histogram, and Meter much like Spark Accumulators:

UserMetricsSystem.counter("test_metric_name").inc(1L)

For Spark 2.0, you can specify --conf spark.app.id=job_name so that in Grafana, metrics from different runs of the same job (each with its own application id) share the same metric name. For example, without setting spark.app.id, the metric name may include the application id like this:

job_name.application_id_1.metric_namespace.metric_name

But with spark.app.id set, it looks like:

job_name.unique_id.metric_namespace.metric_name
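The effect on the metric path can be sketched as follows (a toy model of the path construction; the helper name and sample ids are illustrative):

```scala
// Graphite metric paths embed an application-id segment. By default Spark
// generates a fresh id per run, scattering the same logical metric across
// many Graphite series; pinning spark.app.id keeps the segment constant.
def metricPath(appId: String, namespace: String, name: String): String =
  s"$appId.$namespace.$name"

// Default: a new application id on every submission.
val perRun = metricPath("application_1518517433000_0001",
                        "test_metric_namespace", "test_metric_name")

// With --conf spark.app.id=job_name: every run writes to the same series,
// so a Grafana query can target one stable path.
val pinned = metricPath("job_name",
                        "test_metric_namespace", "test_metric_name")
```

Note that pinning spark.app.id trades per-run isolation for stable dashboards: concurrent runs of the same job will write into the same series.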
