
Writing logs on App Engine Java to Stackdriver Logging

I have a Java app running on App Engine.

I'm logging my messages as a JSON structure, and I can then see my logs in Stackdriver (as in the docs):

package com.foo.bar;

import java.util.logging.Logger;

public class MyClass {

    private static final Logger log = Logger.getLogger(MyClass.class.getName());

    public void myFunc() {
        log.info("{msg: 'hello', corId: '123'}");
    }
}

Here is the message I get in Stackdriver Logging:

com.foo.bar.MyClass myFunc: {msg: 'hello', corId: '123'}

And in the log request object:

protoPayload.line[].logMessage = "com.foo.bar.MyClass myFunc: {msg: 'hello', corId: '123'}"

How can I make the log message be only the message I am logging, without the class prefix: {msg: 'hello', corId: '123'}

protoPayload.line[].logMessage = "{msg: 'hello', corId: '123'}"
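For context, the prefix comes from the java.util.logging handler, which formats each LogRecord as sourceClassName sourceMethodName: message. In a plain JVM, a custom Formatter that returns only the message would drop the prefix at the source; this is a minimal sketch (the class name is made up, and App Engine's managed log handler may not let you swap the formatter at all):

import java.util.logging.Formatter;
import java.util.logging.LogRecord;

// Illustrative only: keep just the raw message, dropping the
// "com.foo.bar.MyClass myFunc: " prefix built from the record's
// source class and method names.
public class MessageOnlyFormatter extends Formatter {
    @Override
    public String format(LogRecord record) {
        // formatMessage() resolves any {0}-style parameters in the message.
        return formatMessage(record) + System.lineSeparator();
    }
}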

I ended up shipping the logs from Stackdriver to Elasticsearch via Logstash.

In Logstash I parsed my logs. I also split them so that each log is its own record rather than a nested array; see: How to ship logs from pods on Kubernetes running on top of GCP to elasticsearch/logstash?

My Logstash config for parsing the logs:

filter {
  if [resource][type] == "gae_app" {
    # split the protoPayload.line array, so each log message is a separate entry in Elasticsearch
    split {
      field => "[protoPayload][line]"
      target => "line"
      remove_field => [  "httpRequest", "operation", "protoPayload"]
    }
    # extract `line.logMessage` and `line.severity` fields
    mutate {
      add_field => {"logMessage" => "%{[line][logMessage]}"}
      replace => {"severity" => "%{[line][severity]}"}
      remove_field => ["line"]
    }
    # remove the `com.example.MyClass myFunc: ` prefix from log
    grok {
      match => { "logMessage" => "^%{DATA}: %{GREEDYDATA:parsedMessage}"}
    }
    # parse the log message into json, json fields will be located in root
    json {
      source => "parsedMessage"
      target => "jsonPayload"
      add_field => {"[jsonPayload][level]" => "%{severity}"}
      remove_field => ["parsedMessage", "logMessage"]
    }

    # uniform GAE logs to the structure of GKE logs
    grok {
      match => {  # check..
        "[resource][labels][version_id]" =>
        "^%{DATA:[resource][labels][container_name]}-%{GREEDYDATA:[resource][labels][namespace_id]}"}
    }


  }
}
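One thing to watch with the json filter: it expects strict JSON, so a payload like {msg: 'hello', corId: '123'} (unquoted keys, single quotes) will fail to parse and the event gets tagged with _jsonparsefailure. Below is a minimal sketch of the logging call with double-quoted JSON, assuming plain java.util.logging and a hand-built string (the field values are just the example from the question); a JSON library would also take care of escaping the values:

import java.util.logging.Logger;

public class MyClass {

    private static final Logger log = Logger.getLogger(MyClass.class.getName());

    public void myFunc() {
        // Strict JSON: double-quoted keys and values, so the Logstash json
        // filter can parse the payload into fields under jsonPayload.
        log.info(String.format("{\"msg\": \"%s\", \"corId\": \"%s\"}", "hello", "123"));
    }
}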
