
How to parse data from S3 using Logstash and push to Elasticsearch and then to Kibana

I have a log file created in an S3 bucket every minute. The data is "\x01"-delimited. One of the columns is a timestamp field.

I want to load this data into Elasticsearch.

I tried using the following Logstash conf, but it doesn't seem to work: I don't see any output. I took some reference from http://brewhouse.io/blog/2014/11/04/big-data-with-elk-stack.html

The Logstash config file is as follows:

input {
  s3 {
    bucket => "mybucketname"
    credentials => [ "accesskey", "secretkey" ]
  }
}
filter {
  csv {
   columns => [ "col1", "col2", "@timestamp" ]
   separator => "\x01"
  }
}
output {
  stdout { } 
}
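One thing worth checking (an assumption about why there is no output, not something confirmed in the question): the csv filter's `separator` is taken as a literal string, so writing `"\x01"` there may match the four characters `\`, `x`, `0`, `1` rather than the SOH control byte. The mutate filter's `gsub`, by contrast, treats its pattern as a regular expression, where `\x01` does match the control byte. A hedged workaround sketch, rewriting the delimiter before the csv filter runs:

filter {
  mutate {
    # gsub patterns are regexes, so "\x01" here matches the actual
    # 0x01 byte; replace it with a comma for the csv filter.
    # Assumption: the field values themselves never contain commas.
    gsub => [ "message", "\x01", "," ]
  }
  csv {
    columns   => [ "col1", "col2", "@timestamp" ]
    separator => ","
  }
}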

How do I modify this file to take in the new file coming in every minute?
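On picking up new files: the s3 input polls the bucket on a schedule, so new objects should be detected without extra configuration. A sketch assuming the logstash-input-s3 plugin's `interval` option (polling period in seconds, 60 by default) and its `access_key_id`/`secret_access_key` credential options, which newer plugin versions use in place of the `credentials` array:

input {
  s3 {
    bucket            => "mybucketname"
    # Newer plugin versions take separate credential options instead of
    # credentials => [ "accesskey", "secretkey" ].
    access_key_id     => "accesskey"
    secret_access_key => "secretkey"
    # Poll the bucket for new objects every 60 seconds.
    interval          => 60
  }
}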

I would then eventually want to connect Kibana to ES to visualize the changes.

Just use logstash-forwarder to send the files from S3; you will have to generate certificates for authorization.

There is a really nice tutorial: https://www.digitalocean.com/community/tutorials/how-to-use-logstash-and-kibana-to-centralize-logs-on-centos-7

If you are getting I/O errors, you may be able to solve them by setting the cluster name:

Inside logstash.conf:

output {
    elasticsearch {
        host => "127.0.0.1"
        cluster => "CLUSTER_NAME"
    }
}

Inside elasticsearch.yml:

cluster.name: CLUSTER_NAME

If you are getting problems generating certificates, you can generate them using this: https://raw.githubusercontent.com/driskell/log-courier/develop/src/lc-tlscert/lc-tlscert.go

I also found a better init.d script for logstash-forwarder on CentOS: http://smuth.me/posts/centos-6-logstash-forwarder-init-script.html

