
How to set up ELK with node.js

I want to log errors from my node.js server to another server. I'm thinking of using Elasticsearch, Logstash and Kibana. I want to know how to set up ELK with my node server.

I had exactly this use case at my previous organization. A basic tutorial to get started with Beats + ELK: https://www.elastic.co/guide/en/beats/libbeat/current/getting-started.html

So basically this is how it works: your Node.js app writes its logs to files (you can use bunyan for this) at different levels such as error/warning/info. Filebeat tails these log files and sends the messages to the Logstash server. The Logstash input.conf will have some input filters (in your case, error filters). When a log message passes these filters, Logstash forwards it to the endpoint defined in the output.conf file.
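For the first step (the app writing structured log files that Filebeat can tail), a minimal bunyan sketch might look like this; the app name and log path are placeholders:

    // minimal bunyan setup: write error-level JSON logs to a file that
    // Filebeat tails; the logger name and file path below are placeholders
    const bunyan = require('bunyan');

    const log = bunyan.createLogger({
      name: 'my-node-app',
      streams: [
        { level: 'error', path: '/var/log/my-node-app/error.log' }, // tailed by Filebeat
        { level: 'info', stream: process.stdout },
      ],
    });

    log.error({ err: new Error('db timeout') }, 'failed to save order');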

Here is what we did:

Initial architecture - install the Filebeat client (earlier we used logstash-forwarder) to tail the logs on the Node.js server and forward them to the Logstash machine. Logstash does some processing on the incoming logs and sends them to the ES cluster (which can be on the same machine as Logstash). Kibana is just a visualization layer on top of this ES.

Final architecture - the initial setup was fine for low traffic, but we realized that Logstash could be a single point of failure and could cause log loss when traffic increased. So we integrated Kafka with Logstash so that the system scales smoothly. Here is an article: https://www.elastic.co/blog/logstash-kafka-intro

Hope this helps!

It is possible to use Logstash without running agents to collect logs from the application. Logstash has input plugins ( https://www.elastic.co/guide/en/logstash/current/input-plugins.html ), which can be configured in the pipeline. One basic setup is to configure the TCP ( https://www.elastic.co/guide/en/logstash/current/plugins-inputs-tcp.html ) or UDP ( https://www.elastic.co/guide/en/logstash/current/plugins-inputs-udp.html ) input plugin. Logstash listens on the port configured in the plugin, and the application sends its logs directly to the Logstash server. The pipeline can then transform the events and forward them to ES. By configuring the Logstash pipeline to be durable, data loss can be avoided. This approach is better when the application servers are ephemeral (as in containers).

For Node.js, https://www.npmjs.com/package/winston-logstash is a package which is quite active. This gist, https://gist.github.com/jgoodall/6323951 , provides a good example of the overall approach in other languages.
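A minimal sketch of wiring winston-logstash into an app, assuming the winston 2.x-style transport API described in the package's README (the host, port and node_name below are placeholders):

    // register the Logstash transport and send logs straight to Logstash;
    // host/port must match the tcp input configured on the Logstash side
    const winston = require('winston');
    require('winston-logstash'); // adds winston.transports.Logstash

    winston.add(winston.transports.Logstash, {
      host: '127.0.0.1',
      port: 28777,
      node_name: 'my-node-app',
    });

    winston.error('something went wrong', { requestId: 'abc123' });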

This is a sample (minimal) TCP input plugin configuration:

input {
  tcp {
    port => 9563
  }
}
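For an application that sends directly to this tcp input, a rough sketch using Node's core net module could look like the following; the Logstash host is an assumption, and pairing the input with a json_lines codec would turn each line into a structured event:

    // send one JSON log line to the tcp input above (port 9563)
    const net = require('net');

    const client = net.createConnection({ host: 'logstash.example.com', port: 9563 }, () => {
      const event = { level: 'error', message: 'payment failed', service: 'checkout' };
      client.write(JSON.stringify(event) + '\n'); // one event per line
      client.end();
    });

    client.on('error', (err) => console.error('could not reach logstash', err));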

You can install Logstash on the Node.js server, and then create a configuration file that defines the input (the location of the log file(s)) and the output to your Elasticsearch host.

Below is a sample configuration file (custom.conf) which has to be created in your Logstash directory.

    input {
      file {
        path => "/path to log"
        start_position => "beginning"
      }
    }
    output {
      stdout { }
      elasticsearch {
        hosts => ["192.168.0.23:9200"]
      }
    }

Execute Logstash with this configuration:

    logstash -f custom.conf

Reference: https://www.elastic.co/guide/en/logstash/current/config-examples.html

If you are planning to customize a Node.js application to send error logs, then you can install an ELK stack Node.js wrapper and post the error logs from within your application. ELK stack Node.js wrapper - https://www.npmjs.com/package/elksdk
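The elksdk package's own API is not shown here, but the underlying idea is simply to post an error document to Elasticsearch from the application. A rough sketch using Elasticsearch's REST index API and Node 18+'s global fetch (the host and index name are assumptions):

    // post one error document straight to Elasticsearch
    async function logErrorToES(err) {
      await fetch('http://192.168.0.23:9200/node-errors/_doc', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          '@timestamp': new Date().toISOString(),
          level: 'error',
          message: err.message,
          stack: err.stack,
        }),
      });
    }

    logErrorToES(new Error('boom')).catch(console.error);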
