
How to read logs directly from elasticsearch without using Kibana

I have an ASP.NET Core Web API written in C#, with docker-compose, Elasticsearch, Serilog, and Kibana. I plan on removing Kibana from the docker-compose.yml file. Serilog generates the log files, and a sink is configured to Elasticsearch so the logs are written where Elasticsearch can read them. How do I go about reading the logs that are now in Elasticsearch without having to go to Kibana to view and read them?

Are there any recommendations for documentation and/or a package for this, or is this something that needs to be programmed from scratch?


Suggestion attempt:

I went to download Kafka, then opened PowerShell as an admin and did a wget (url). After it downloaded, I ran tar -xzf kafka_2.13-2.8.0.tgz & cd kafka_2.13-2.8.0. I then followed what you advised to activate the Zookeeper broker and Kafka and then create the topic. However, for each step you said to do, nothing happened. When I tried to activate Zookeeper, it would ask me how I wanted to open the file, so I would just hit ESC and then run the other commands, but the same thing would come up. Should it be doing that?


Make use of log4net as the log provider together with its Kafka_Appender. The appender will produce your operation logs at every level to a topic, consumers read from that topic, and Logstash then ingests the logs into your Elasticsearch index as output.

There are many advantages to this roadmap: you get a powerful stream processor in Apache Kafka, whose queue-based messaging helps you trace every log that is produced, and Logstash, where you can add more stream processing and filters such as grok, have multiple outputs, and even store your logs as CSV or on the file system.
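The exact configuration of the Kafka appender depends on the package you pick, so as a rough illustration of what such an appender does under the hood, here is a minimal C# sketch that publishes one JSON log event to the topic created in the steps below. The Confluent.Kafka client, topic name, and event shape are assumptions for illustration, not part of the original setup:

using System;
using System.Text.Json;
using System.Threading.Tasks;
using Confluent.Kafka;

// Minimal sketch: serialize a log event and publish it to Kafka,
// which is essentially what a log4net Kafka appender does per log call.
var config = new ProducerConfig { BootstrapServers = "localhost:9092" };
using var producer = new ProducerBuilder<Null, string>(config).Build();

var logEvent = new
{
    timestamp = DateTime.UtcNow,
    level = "Information",
    message = "Sample log line produced to Kafka"
};

// "test-topic" matches the topic created with kafka-topics.sh below.
await producer.ProduceAsync("test-topic",
    new Message<Null, string> { Value = JsonSerializer.Serialize(logEvent) });

// Make sure any buffered messages are delivered before exiting.
producer.Flush(TimeSpan.FromSeconds(5));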

First activate Zookeeper and the Kafka broker, and create a consumer with some topic name, all from the bin directory of the downloaded Kafka files:

Activating the Zookeeper broker

./zookeeper-server-start.sh ../config/zookeeper.properties

Activating the Kafka broker

./kafka-server-start.sh ../config/server.properties

Create the topic

./kafka-topics.sh --create --topic test-topic --zookeeper localhost:2181 --replication-factor 1 --partitions 4

Activate a consumer of the created topic

./kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test-topic

Then add the log appender for the created topic so the logs are produced to it (this part is up to you), and after that create a Logstash pipeline such as the configuration below:

input {
  kafka {
    group_id => "35834"
    topics => ["yourtopicname"]
    bootstrap_servers => "localhost:9092"
    codec => json
  }
}
filter {
}
output {
  file {
    path => "C:\somedirectory"
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    document_type => "_doc"
    index => "yourindexname"
  }
  stdout {
    codec => rubydebug
  }
}

And run it with the usual command from the bin directory of Logstash:

./logstash -f yourconfigurationfile.conf

Please note that you should create the index before activating Logstash. Moreover, you do not need to design a mapping for your output index: as soon as the first document is inserted, Elasticsearch will create a mapping for all the relevant fields in your index.
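If you want to create that index from the same .NET code base, a minimal sketch with the high-level client (NEST) could look like the following; the index name matches the pipeline above, and the settings are only illustrative:

using System;
using Nest;

var settings = new ConnectionSettings(new Uri("http://localhost:9200"));
var client = new ElasticClient(settings);

// Create the index up front; no explicit mapping is needed because
// Elasticsearch infers one from the first document Logstash inserts.
var createResponse = client.Indices.Create("yourindexname", c => c
    .Settings(s => s
        .NumberOfShards(1)
        .NumberOfReplicas(0)));

if (!createResponse.IsValid)
{
    Console.WriteLine(createResponse.DebugInformation);
}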

You can use one of the two official Elasticsearch clients for .NET.

There is a low-level and a high-level client; you can read more about the difference and how to use each one in the official documentation.
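For the original question of reading the logs back without Kibana, a minimal sketch with the high-level client (NEST) might look like this; the index name and the field names ("@timestamp", "level", "message") are assumptions based on a typical Serilog/Logstash index and may differ in your setup:

using System;
using Nest;

var settings = new ConnectionSettings(new Uri("http://localhost:9200"))
    .DefaultIndex("yourindexname");
var client = new ElasticClient(settings);

// Fetch the 50 most recent log entries, newest first.
var response = client.Search<LogDocument>(s => s
    .Size(50)
    .Sort(so => so.Descending("@timestamp"))
    .Query(q => q.MatchAll()));

foreach (var doc in response.Documents)
{
    Console.WriteLine($"{doc.Timestamp:u} [{doc.Level}] {doc.Message}");
}

// POCO matching the assumed shape of a log document; adjust the property
// names to whatever your Serilog/Logstash pipeline actually writes.
public class LogDocument
{
    [PropertyName("@timestamp")]
    public DateTime Timestamp { get; set; }
    public string Level { get; set; }
    public string Message { get; set; }
}

If you prefer full control over the query body, the low-level client (Elasticsearch.Net) exposes the same search operation with raw JSON instead of the typed query DSL.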
