
JSON in Logstash with TCP input and Elasticsearch

I have been searching for a long time, and have also consulted this site, for a way to get Logstash to index my JSON into Elasticsearch as parsed fields. Here is my config:

input {
  tcp {
    port => 9000
  }
}
filter{
    json{
        source => "message"
        target => "doc"
    }
}
output {
  elasticsearch {
   hosts => ["localhost:9200"] 
   index => logstash-%{+YYYY.MM.dd}
  }
}

But Elasticsearch still stores the message as a string instead of parsed JSON. The document looks like this:

{
  "_index": "logstash-2017.05.12",
  "_type": "logs",
  "_id": "AVv8C4O4qok70-ifTOnm",
  "_score": null,
  "_source": {
    "message": "{\"name\":\"abc\",\"id\":1494582167248}",
    "@version": "1",
    "@timestamp": "2017-05-12T09:42:47.263Z",
    "host": "172.0.0.1",
    "port": 53763
  },
  "fields": {
    "@timestamp": [
      1494582167263
    ]
  },
  "sort": [
    1494582167263
  ]
}

Can anyone help me fix this so that the name and id fields become top-level properties of _source? I expect the logged document to look like:

"_source": {
    "name":"abc",
    "id": 1494582167248
    "@version": "1",
    "@timestamp": "2017-05-12T09:42:47.263Z",
    "host": "192.168.2.251",
    "port": 53763
  }

If you want those fields at the root of the parsed message (which will be the root level of _source in Elasticsearch), you must remove the target setting from the json filter. That setting specifies a parent key under which the fields extracted by the json filter are nested:

{
  "@timestamp": "2017-05-12T11:58:40.897Z",
  "port": 61981,
  "@version": "1",
  "host": "10.0.2.2",
  "doc": {
    "name": "abc",
    "id": 1494582167248
  },
  "message": "{\"name\":\"abc\",\"id\":1494582167248}"
}
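A minimal Python sketch (the function name is mine, not part of Logstash) of what the json filter does to each event: with target set, the parsed fields are nested under that key, as in the document above; without it, they are merged into the root of the event:

```python
import json

def apply_json_filter(event, source="message", target=None):
    """Rough model of Logstash's json filter: parse the JSON string in
    event[source], then either nest the result under `target` or merge
    it into the root of the event."""
    parsed = json.loads(event[source])
    if target is not None:
        event[target] = parsed   # fields end up nested, e.g. under "doc"
    else:
        event.update(parsed)     # fields land at the root of _source
    return event

event = {"message": '{"name":"abc","id":1494582167248}'}
with_target = apply_json_filter(dict(event), target="doc")
# → {"message": ..., "doc": {"name": "abc", "id": 1494582167248}}
without_target = apply_json_filter(dict(event))
# → {"message": ..., "name": "abc", "id": 1494582167248}
```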

So remove the target setting, and make sure your index setting is wrapped in quotes:

input {
    tcp {
        port => 9000
    }
}

filter{
    json {
        source => "message"
    }
}

output {
    stdout { codec => "rubydebug" }
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "logstash-%{+YYYY.MM.dd}"
    }
}
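To feed a test event into this pipeline, you can send a newline-terminated JSON line to the TCP input (the tcp input's default line codec treats each newline-delimited line as one event). A small Python client, assuming the port 9000 from the config above:

```python
import json
import socket

def send_event(event, host="localhost", port=9000):
    """Send one event as a newline-terminated JSON line over TCP.

    The newline matters: Logstash's tcp input uses the line codec by
    default, so an event is only emitted once a full line arrives.
    """
    payload = json.dumps(event).encode("utf-8") + b"\n"
    with socket.create_connection((host, port)) as sock:
        sock.sendall(payload)

if __name__ == "__main__":
    # Sample event matching the document in the question.
    send_event({"name": "abc", "id": 1494582167248})
```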

Which results in:

{
  "@timestamp": "2017-05-12T11:56:51.187Z",
  "port": 61970,
  "@version": "1",
  "host": "10.0.2.2",
  "name": "abc",
  "id": 1494582167248,
  "message": "{\"name\":\"abc\",\"id\":1494582167248}"
}

To validate against Elasticsearch, curl -XGET localhost:9200/logstash-2017.05.12/_search | jq . returns:

{
  "_index": "logstash-2017.05.12",
  "_type": "logs",
  "_id": "AVv8hMTCjLo8wwWpi9R6",
  "_score": 1,
  "_source": {
    "@timestamp": "2017-05-12T11:56:51.187Z",
    "port": 61970,
    "@version": "1",
    "host": "10.0.2.2",
    "name": "abc",
    "id": 1494582167248,
    "message": "{\"name\":\"abc\",\"id\":1494582167248}"
  }
}
