Spring Boot Filter not filtering all my logs

I have a Filter set up in my Spring Boot app to add some information to the MDC:

@Component
public class LogFilter implements Filter {

    @Override
    public void init(FilterConfig var1) throws ServletException {}

    @Override
    public void doFilter(ServletRequest request, ServletResponse response, FilterChain chain) throws IOException, ServletException {

        try {
            MDC.put("tag", "Some information);
            chain.doFilter(request, response);
        } finally {
            MDC.clear();
        }
    }
    @Override
    public void destroy() { }
}

This works fine for most of my application, but for certain operations where a thread is spawned, this filter is not picking up those messages.

For example, in the block below, the callback methods occur in a separate thread, so the first log.info call is getting picked up by my LogFilter, but the log.info and log.error in my callbacks aren't.

private void publishMessage(String message) {
    log.info("Message received. Sending to Kafka topic");
    CompletableFuture<ListenableFuture<SendResult<String, String>>> future = CompletableFuture.supplyAsync(() -> kafkaTemplate.send("myTopic", message));

    try {
        future.get().addCallback(new ListenableFutureCallback<SendResult<String, String>>() {

            @Override
            public void onSuccess(SendResult<String, String> result) {
                log.info("Kafka topic " + myTopic + " published to successfully");
            }

            @Override
            public void onFailure(Throwable ex) {
                log.error("Kafka error: " + ex.getMessage());
            }
        });
    } catch (Exception e) {
        log.error("Kafka has failed you for the last time");
    }
}

In general, it seems like any log event that doesn't occur in one of the http-nio-8080-exec-X threads bypasses LogFilter. What am I doing wrong?

Things I've tried that haven't worked:

  1. Having LogFilter extend GenericFilterBean, using @Bean instead of @Component, and then registering that bean with a FilterRegistrationBean in my main Application class (see the sketch after this list)
  2. Using @WebFilter(urlPatterns = {"/*"}, description = "MDC Filter") and/or @ServletComponentScan
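
For reference, a minimal sketch of the registration described in item 1, assuming a standard Spring Boot configuration class (the class and bean names below are illustrative, not from the original code):

import org.springframework.boot.web.servlet.FilterRegistrationBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FilterConfig {

    // Register the filter explicitly instead of relying on @Component scanning
    @Bean
    public FilterRegistrationBean<LogFilter> logFilterRegistration() {
        FilterRegistrationBean<LogFilter> registration = new FilterRegistrationBean<>(new LogFilter());
        registration.addUrlPatterns("/*");
        return registration;
    }
}

Note that however the filter is registered, it still only runs on the servlet request thread, which is why none of these variations change the behaviour described above.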

The MDC context is only available to the currently running thread, but your callback will be invoked on a different thread.

One way to deal with this is to implement ListenableFutureCallback:

private static class MyListenableFutureCallback
        implements ListenableFutureCallback<SendResult<String, String>> {

    // Capture the MDC map of the thread that constructs this callback
    private Map<String, String> contextMap = MDC.getCopyOfContextMap();

    @Override
    public void onSuccess(SendResult<String, String> result) {
        MDC.setContextMap(contextMap); // restore the captured MDC context on the callback thread
        log.info("Kafka topic " + myTopic + " published to successfully");
    }

    @Override
    public void onFailure(Throwable ex) {
        MDC.setContextMap(contextMap); // restore the captured MDC context on the callback thread
        log.error("Kafka error: " + ex.getMessage());
    }
}

And finally:

future.get().addCallback(new MyListenableFutureCallback());

A more consistent way of doing this is described here.
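
The linked answer isn't reproduced here, but one commonly used pattern for propagating the MDC to asynchronous work in Spring, offered only as an assumed sketch rather than as what the link describes, is to apply a TaskDecorator to the executor that runs the async tasks, so every submitted Runnable carries the submitter's MDC map:

import java.util.Map;

import org.slf4j.MDC;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.core.task.TaskExecutor;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

@Configuration
public class MdcExecutorConfig { // hypothetical configuration class, names are illustrative

    @Bean
    public TaskExecutor mdcTaskExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        // Decorate every task: capture the MDC map on the submitting thread,
        // restore it on the worker thread, and clear it when the task finishes.
        executor.setTaskDecorator(runnable -> {
            Map<String, String> contextMap = MDC.getCopyOfContextMap();
            return () -> {
                try {
                    if (contextMap != null) {
                        MDC.setContextMap(contextMap);
                    }
                    runnable.run();
                } finally {
                    MDC.clear();
                }
            };
        });
        return executor;
    }
}

Passing such an executor to CompletableFuture.supplyAsync (or using it wherever the async work is scheduled) keeps the MDC entries set by LogFilter visible in logs written on the worker threads, without copying the map by hand in every callback.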
