
Serilog logs collected by Fluentbit to Elasticsearch in Kubernetes don't get JSON-parsed correctly

Using the EFK stack on Kubernetes (Minikube). I have an ASP.NET Core app that uses Serilog to write JSON to the console. Logs DO ship to Elasticsearch, but they arrive as unparsed strings in the "log" field; this is the problem.

This is the console output:

{
    "@timestamp": "2019-03-22T22:08:24.6499272+01:00",
    "level": "Fatal",
    "messageTemplate": "Text: {Message}",
    "message": "Text: \"aaaa\"",
    "exception": {
        "Depth": 0,
        "ClassName": "",
        "Message": "Boom!",
        "Source": null,
        "StackTraceString": null,
        "RemoteStackTraceString": "",
        "RemoteStackIndex": -1,
        "HResult": -2146232832,
        "HelpURL": null
    },
    "fields": {
        "Message": "aaaa",
        "SourceContext": "frontend.values.web.Controllers.HomeController",
        "ActionId": "0a0967e8-be30-4658-8663-2a1fd7d9eb53",
        "ActionName": "frontend.values.web.Controllers.HomeController.WriteTrace (frontend.values.web)",
        "RequestId": "0HLLF1A02IS16:00000005",
        "RequestPath": "/Home/WriteTrace",
        "CorrelationId": null,
        "ConnectionId": "0HLLF1A02IS16",
        "ExceptionDetail": {
            "HResult": -2146232832,
            "Message": "Boom!",
            "Source": null,
            "Type": "System.ApplicationException"
        }
    }
}

This is Program.cs, the Serilog part of the configuration (ExceptionAsObjectJsonFormatter inherits from ElasticsearchJsonFormatter):

.UseSerilog((ctx, config) =>
{
    var shouldFormatElastic = ctx.Configuration.GetValue<bool>("LOG_ELASTICFORMAT", false);
    config
        .ReadFrom.Configuration(ctx.Configuration) // Read from appsettings and env, cmdline
        .Enrich.FromLogContext()
        .Enrich.WithExceptionDetails();

    var logFormatter = new ExceptionAsObjectJsonFormatter(renderMessage: true);
    var logMessageTemplate = "[{Timestamp:HH:mm:ss} {Level:u3}] {Message:lj}{NewLine}{Exception}";

    if (shouldFormatElastic)
        config.WriteTo.Console(logFormatter, standardErrorFromLevel: LogEventLevel.Error);
    else
        config.WriteTo.Console(standardErrorFromLevel: LogEventLevel.Error, outputTemplate: logMessageTemplate);

})
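
Since the config also calls ReadFrom.Configuration, Serilog.Settings.Configuration will pick up a "Serilog" section from appsettings.json or the environment. A minimal sketch of such a section (hypothetical values, not taken from the question):

```json
{
  "Serilog": {
    "MinimumLevel": {
      "Default": "Information",
      "Override": { "Microsoft": "Warning" }
    }
  }
}
```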

Using these NuGet packages:

  • Serilog.AspNetCore
  • Serilog.Exceptions
  • Serilog.Formatting.Elasticsearch
  • Serilog.Settings.Configuration
  • Serilog.Sinks.Console

This is how it looks in Kibana (screenshot from the original post not reproduced here).

And this is the configmap for fluent-bit:

fluent-bit-filter.conf:
[FILTER]
    Name                kubernetes
    Match               kube.*
    Kube_URL            https://kubernetes.default.svc:443
    Kube_CA_File        /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
    Kube_Token_File     /var/run/secrets/kubernetes.io/serviceaccount/token
    Merge_Log           On
    K8S-Logging.Parser  On
    K8S-Logging.Exclude On

fluent-bit-input.conf:
[INPUT]
    Name             tail
    Path             /var/log/containers/*.log
    Parser           docker
    Tag              kube.*
    Refresh_Interval 5
    Mem_Buf_Limit    5MB
    Skip_Long_Lines  On

fluent-bit-output.conf:

[OUTPUT]
    Name  es
    Match *
    Host  elasticsearch
    Port  9200
    Logstash_Format On
    Retry_Limit False
    Type  flb_type
    Time_Key @timestamp
    Replace_Dots On
    Logstash_Prefix kubernetes_cluster




fluent-bit-service.conf:
[SERVICE]
    Flush        1 
    Daemon       Off
    Log_Level    info
    Parsers_File parsers.conf
fluent-bit.conf:
@INCLUDE fluent-bit-service.conf
@INCLUDE fluent-bit-input.conf
@INCLUDE fluent-bit-filter.conf
@INCLUDE fluent-bit-output.conf
parsers.conf:
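
The key option above is Merge_Log On in the kubernetes filter: when the "log" field contains valid JSON, the filter is supposed to parse it and lift its keys into the record (or nest them under Merge_JSON_Key when that is set). A rough Python model of that behavior (an illustrative simplification; the real filter is C code inside fluent-bit's kubernetes filter plugin):

```python
import json

def merge_log(record, merge_key=None):
    """Simplified model of fluent-bit's Merge_Log / Merge_JSON_Key options."""
    try:
        parsed = json.loads(record.get("log", ""))
    except (json.JSONDecodeError, TypeError):
        return record                      # "log" is not JSON: record left as-is
    if not isinstance(parsed, dict):
        return record
    merged = dict(record)
    if merge_key:                          # e.g. Merge_JSON_Key k8s
        merged[merge_key] = parsed         # nest parsed fields under one key
    else:
        merged.update(parsed)              # lift parsed fields to the top level
    return merged

rec = {"log": '{"level": "Fatal", "message": "Boom!"}'}
print(merge_log(rec)["level"])                   # -> Fatal
print(merge_log(rec, "k8s")["k8s"]["message"])   # -> Boom!
```

If the record instead stays as an opaque string in the "log" field (the symptom described above), either the merge did not trigger or the line was not valid JSON when it reached the filter.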

But I also tried https://raw.githubusercontent.com/fluent/fluent-bit-kubernetes-logging/master/output/elasticsearch/fluent-bit-configmap.yaml with my modifications.

I used Helm to install fluent-bit with: helm install stable/fluent-bit --name=fluent-bit --namespace=logging --set backend.type=es --set backend.es.host=elasticsearch --set on_minikube=true

I also get a lot of the following errors:

log:{"took":0,"errors":true,"items":[{"index":{"_index":"kubernetes_cluster-2019.03.22","_type":"flb_type","_id":"YWCOp2kB4wEngjaDvxNB","status":400,"error":{"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"json_parse_exception","reason":"Duplicate field '@timestamp' at [Source: org.elasticsearch.common.bytes.BytesReference$MarkSupportingStreamInputWrapper@432f75a7; line: 1, column: 1248]"}}}}]}

and

log:[2019/03/22 22:38:57] [error] [out_es] could not pack/validate JSON response stream:stderr

as I can see in Kibana.
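
A plausible reconstruction of the "Duplicate field '@timestamp'" error (an assumption from the configs above, not confirmed in the original post): the Serilog record already carries "@timestamp", Merge_Log On lifts its keys to the top level, and the es output then appends its own Time_Key field of the same name, so the serialized bulk payload holds the key twice and Elasticsearch's JSON parser rejects it:

```python
import json

# The app's own fields after Merge_Log, plus the es output's Time_Key field.
app_fields = [("@timestamp", "2019-03-22T22:08:24.649+01:00"), ("level", "Fatal")]
time_key = "@timestamp"                 # default Time_Key in the broken config

pairs = app_fields + [(time_key, "2019-03-22T21:08:24.649Z")]
bulk_doc = "{" + ", ".join(f"{json.dumps(k)}: {json.dumps(v)}" for k, v in pairs) + "}"
print(bulk_doc.count('"@timestamp"'))   # -> 2: the duplicate field ES complains about

# Renaming Time_Key (as the working configmap below does) avoids the collision:
time_key = "@timestamp_es"
fixed = dict(app_fields)
fixed[time_key] = "2019-03-22T21:08:24.649Z"
print(sorted(fixed))                    # -> ['@timestamp', '@timestamp_es', 'level']
```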

The problem was a bad fluent-bit configmap. This works:

apiVersion: v1
kind: ConfigMap
metadata:
  name: fluent-bit-config
  namespace: logging
  labels:
    k8s-app: fluent-bit
data:
  # Configuration files: server, input, filters and output
  # ======================================================
  fluent-bit.conf: |
    [SERVICE]
        Flush         1
        Log_Level     info
        Daemon        off
        Parsers_File  parsers.conf
        HTTP_Server   On
        HTTP_Listen   0.0.0.0
        HTTP_Port     2020        
    @INCLUDE input-kubernetes.conf
    @INCLUDE filter-kubernetes.conf
    @INCLUDE output-elasticsearch.conf
  input-kubernetes.conf: |
    [INPUT]
        Name              tail
        Tag               kube.*
        Path              /var/log/containers/*.log
        Parser            docker
        DB                /var/log/flb_kube.db
        Mem_Buf_Limit     5MB
        Skip_Long_Lines   On
        Refresh_Interval  10
  filter-kubernetes.conf: |
    [FILTER]
        Name                kubernetes
        Match               kube.*
        Kube_URL            https://kubernetes.default.svc:443
        # These two may fix some duplicate field exception
        Merge_Log           On
        Merge_JSON_Key      k8s
        K8S-Logging.Parser  On
        K8S-Logging.exclude True
  output-elasticsearch.conf: |
    [OUTPUT]
        Name            es
        Match           *
        Host            ${FLUENT_ELASTICSEARCH_HOST}
        Port            ${FLUENT_ELASTICSEARCH_PORT}
        Logstash_Format On
        # This fixes errors where kubernetes.apps.name must be an object
        Replace_Dots    On 
        Retry_Limit     False
        Type            flb_type
        # This may fix some duplicate field exception
        Time_Key        @timestamp_es
        # The Index Prefix:
        Logstash_Prefix logstash_07
  parsers.conf: |
    [PARSER]
        Name   apache
        Format regex
        Regex  ^(?<host>[^ ]*) [^ ]* (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^\"]*?)(?: +\S*)?)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>[^\"]*)" "(?<agent>[^\"]*)")?$
        Time_Key time
        Time_Format %d/%b/%Y:%H:%M:%S %z
    [PARSER]
        Name   apache2
        Format regex
        Regex  ^(?<host>[^ ]*) [^ ]* (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^ ]*) +\S*)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>[^\"]*)" "(?<agent>[^\"]*)")?$
        Time_Key time
        Time_Format %d/%b/%Y:%H:%M:%S %z
    [PARSER]
        Name   apache_error
        Format regex
        Regex  ^\[[^ ]* (?<time>[^\]]*)\] \[(?<level>[^\]]*)\](?: \[pid (?<pid>[^\]]*)\])?( \[client (?<client>[^\]]*)\])? (?<message>.*)$
    [PARSER]
        Name   nginx
        Format regex
        Regex ^(?<remote>[^ ]*) (?<host>[^ ]*) (?<user>[^ ]*) \[(?<time>[^\]]*)\] "(?<method>\S+)(?: +(?<path>[^\"]*?)(?: +\S*)?)?" (?<code>[^ ]*) (?<size>[^ ]*)(?: "(?<referer>[^\"]*)" "(?<agent>[^\"]*)")?$
        Time_Key time
        Time_Format %d/%b/%Y:%H:%M:%S %z
    [PARSER]
        Name   json
        Format json
        Time_Key time
        Time_Format %d/%b/%Y:%H:%M:%S %z
    [PARSER]
        Name        docker
        Format      json
        #Time_Key    time
        Time_Key    @timestamp
        Time_Format %Y-%m-%dT%H:%M:%S.%L
        Time_Keep   Off # on
        # See: https://fluentbit.io/documentation/0.14/parser/decoder.html
        # Command      |  Decoder | Field | Optional Action
        # =============|==================|=================
        # Decode_Field_As   escaped    log
        # Decode_Field_As   escaped    log    do_next
        # Decode_Field_As   json       log     
    [PARSER]
        Name        syslog
        Format      regex
        Regex       ^\<(?<pri>[0-9]+)\>(?<time>[^ ]* {1,2}[^ ]* [^ ]*) (?<host>[^ ]*) (?<ident>[a-zA-Z0-9_\/\.\-]*)(?:\[(?<pid>[0-9]+)\])?(?:[^\:]*\:)? *(?<message>.*)$
        Time_Key    time
        Time_Format %b %d %H:%M:%S
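
If the Serilog JSON were still to arrive as a string in the "log" field, the decoders commented out in the docker parser above could be enabled. A sketch (not verified against this cluster) following the decoder documentation linked in the parser comments:

```
[PARSER]
    Name        docker
    Format      json
    Time_Key    @timestamp
    Time_Format %Y-%m-%dT%H:%M:%S.%L
    Time_Keep   Off
    # Unescape the "log" payload, then parse it as JSON
    Decode_Field_As escaped log do_next
    Decode_Field_As json    log
```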
