
How to perform logging in Flink jobs?

I am working on a very simple use case where I want to inspect the data in a DataStream. I would like to know if there is a better way of logging, because the approach below looks ugly and adds an extra stage to the pipeline.

DataStream<Conversation> stream = env.addSource(kafkaConsumer.getKafkaConsumer());
stream.map(conversation -> {
    logger.info("Read from Kafka source conversationId: {} and time: {}",
            conversation.id, conversation.time);
    return conversation;
});
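One note on the inline approach: Flink serializes the lambda and ships it to the task managers, so the logger it references should be a static field (SLF4J Logger instances are not serializable). A minimal sketch of the usual pattern, assuming SLF4J is on the classpath; the ConversationJob class name is illustrative:

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class ConversationJob {
    // Static logger: resolved in the worker JVM instead of being
    // serialized along with the lambda that references it.
    private static final Logger logger = LoggerFactory.getLogger(ConversationJob.class);
}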

Maybe you can implement a MapFunction class and log inside that class:

DataStream<Conversation> stream = env.addSource(kafkaConsumer.getKafkaConsumer());
stream.map(new MyMapFunction());

import org.apache.flink.api.common.functions.RichMapFunction;
import org.apache.flink.configuration.Configuration;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class MyMapFunction extends RichMapFunction<Conversation, Conversation> {
    private static final Logger logger = LoggerFactory.getLogger(MyMapFunction.class);

    @Override
    public void open(Configuration parameters) throws Exception {
        // Called once per parallel instance before any records are
        // processed; useful for one-time setup logging.
    }

    @Override
    public Conversation map(Conversation conversation) throws Exception {
        logger.info("Read from Kafka source conversationId: {}", conversation.id);
        return conversation;
    }
}
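A design note: RichMapFunction also exposes getRuntimeContext(), so you can include getRuntimeContext().getIndexOfThisSubtask() in log messages to tell parallel instances apart.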

Also, you can use the print sink directly:

DataStream<Conversation> stream = env.addSource(kafkaConsumer.getKafkaConsumer());
stream.print();
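Note that print() writes each record's toString() to the TaskManager's standard out (the *.out file), not to the configured logger. If several print sinks run in the same job, you can pass an identifier to tell their output apart; a small sketch, with an illustrative identifier:

// The identifier is prefixed to every printed record, which makes the
// output easy to find in the TaskManager's .out file.
stream.print("kafka-source");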
