
Logging to Amazon S3

Has anyone built, or does anyone know how to build, a logging framework that deposits the log files on Amazon S3?

We're building a SaaS app, and, naturally, we plan to have a zillion servers and customers. I'd like to simplify the entire logging structure. At the moment we use SLF4J for logging and Logback as the actual logging implementation.

I'm thinking that we could drop in another implementation that would accumulate log messages in memory, compress them, and then upload them to S3. If the S3 connection were down or slow, the implementation would queue the files up on disk. Kind of like Facebook's Scribe.

My guess is that it would take me four or five days to write and test this. If there's another implementation out there, I'd love to know about it.

There is a plugin for fluentd that stores files to S3. (Fluentd is a nice "log file collector".)

Read more about it here: https://docs.fluentd.org/output/s3

If the S3 connection is down or slow, it will buffer the output for you.
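For reference, a minimal configuration sketch for the out_s3 plugin looks roughly like this (the tag pattern, bucket, region and paths are placeholders; see the docs linked above for the full option list):

```
<match app.**>
  @type s3

  # Credentials can also come from an instance profile / IAM role.
  aws_key_id YOUR_AWS_KEY_ID
  aws_sec_key YOUR_AWS_SECRET_KEY
  s3_bucket my-log-bucket
  s3_region us-east-1
  path logs/%Y/%m/%d/

  # Chunks are buffered on local disk first, so a slow or unreachable
  # S3 endpoint does not lose events; they are flushed per time key.
  <buffer time>
    @type file
    path /var/log/fluent/s3-buffer
    timekey 3600        # start a new chunk every hour
    timekey_wait 10m
    chunk_limit_size 256m
  </buffer>
</match>
```

The plugin also compresses chunks (gzip by default) before uploading, which lines up with the "compress and upload" idea in the question.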

An answer from 8 years in the future. There is no need to write a whole framework for this. Instead, you can try to write a custom "appender" for logback or log4j2 and use this appender in the respective configuration. This way, you don't have to write an entire logging framework, but only the part you need, and use the rest from a working framework.

There are also a few of them on GitHub; this is the first one that popped up when I googled: https://github.com/shuwada/logback-s3
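To make the appender idea concrete, here is a rough sketch of a Logback appender that buffers formatted log lines in memory and flushes them to S3 with the AWS SDK v2. The bucket, key prefix and flush threshold are made-up placeholders, and a real implementation would add compression, retries and the on-disk queue described in the question:

```java
import java.nio.charset.StandardCharsets;
import java.time.Instant;
import java.util.UUID;

import ch.qos.logback.classic.spi.ILoggingEvent;
import ch.qos.logback.core.AppenderBase;
import software.amazon.awssdk.core.sync.RequestBody;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.PutObjectRequest;

/**
 * Sketch: accumulate log lines in memory and upload a chunk to S3
 * once the buffer grows past a threshold.
 */
public class S3Appender extends AppenderBase<ILoggingEvent> {

    // Configurable via logback.xml <appender> properties (setters below).
    private String bucket = "my-log-bucket";        // placeholder
    private String keyPrefix = "app-logs/";         // placeholder
    private int flushThresholdBytes = 1024 * 1024;  // flush every ~1 MB

    private S3Client s3;
    private final StringBuilder buffer = new StringBuilder();

    @Override
    public void start() {
        s3 = S3Client.create();  // region/credentials from the default provider chain
        super.start();
    }

    @Override
    protected void append(ILoggingEvent event) {
        // AppenderBase.doAppend() is synchronized, so this runs single-threaded.
        buffer.append(Instant.ofEpochMilli(event.getTimeStamp()))
              .append(' ')
              .append(event.getLevel())
              .append(' ')
              .append(event.getFormattedMessage())
              .append('\n');
        if (buffer.length() >= flushThresholdBytes) {
            flush();
        }
    }

    private void flush() {
        if (buffer.length() == 0) {
            return;
        }
        String key = keyPrefix + Instant.now() + "-" + UUID.randomUUID() + ".log";
        try {
            s3.putObject(
                PutObjectRequest.builder().bucket(bucket).key(key).build(),
                RequestBody.fromString(buffer.toString(), StandardCharsets.UTF_8));
            buffer.setLength(0);
        } catch (RuntimeException e) {
            // A real implementation would spill the buffer to disk here
            // and retry later instead of just reporting the error.
            addError("Failed to upload log chunk to S3", e);
        }
    }

    @Override
    public void stop() {
        flush();
        if (s3 != null) {
            s3.close();
        }
        super.stop();
    }

    public void setBucket(String bucket) { this.bucket = bucket; }
    public void setKeyPrefix(String keyPrefix) { this.keyPrefix = keyPrefix; }
    public void setFlushThresholdBytes(int n) { this.flushThresholdBytes = n; }
}
```

It would then be registered in logback.xml like any other appender and attached to the root logger (or only to the loggers whose output should go to S3).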

I was looking for something similar. I'm using winston.js for logging purposes already, but I found this plugin that lets you save your logs to AWS S3 (Winston-S3). I haven't tried it yet but I will shortly.

It shouldn't be too difficult to show those logs in some admin page.

You could log to Datadog and then configure log archiving. For example, you could have containers output logs to stdout as JSON and then have the Datadog agent or fluentd/fluentbit forward those logs to Datadog. Datadog would automatically parse and index the logs for rapid searching. Different logging formats also work if you write your own parsing rules. At the end of the retention period, if you have log archiving set up, it will automatically upload the logs to S3 for you.
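As a sketch of the "JSON to stdout" part with Logback, assuming the logstash-logback-encoder library is on the classpath (the appender name is arbitrary):

```xml
<configuration>
  <!-- Write each log event to stdout as one JSON line, which the
       Datadog agent or a fluentd/fluent-bit sidecar can then pick up. -->
  <appender name="JSON_STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder class="net.logstash.logback.encoder.LogstashEncoder"/>
  </appender>

  <root level="INFO">
    <appender-ref ref="JSON_STDOUT"/>
  </root>
</configuration>
```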

The disadvantage is being locked into Datadog and the price. The advantage is you can easily re-hydrate the logs back into Datadog for fast searching, and you don't have to worry about maintaining a self-hosted solution.

If you want an open-source alternative you could try out Loki. It has an S3 storage backend.

I was searching Google for the same question. But apparently I am slightly better off. I know how to log to S3. There is no out-of-the-box solution.

I suggest something like FUSE for S3 (s3fs) to mount your syslog directory: https://github.com/s3fs-fuse/s3fs-fuse
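For example, something along these lines (bucket name and mount point are placeholders; credentials could also come from an IAM role instead of the password file):

```
# s3fs expects credentials in a file of the form ACCESS_KEY_ID:SECRET_ACCESS_KEY
echo "ACCESS_KEY_ID:SECRET_ACCESS_KEY" > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs

# mount the bucket where syslog (or your log parser) can read and write it
mkdir -p /mnt/s3-logs
s3fs my-log-bucket /mnt/s3-logs -o passwd_file=${HOME}/.passwd-s3fs
```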

Now all you need is to also mount it in your log-parser system, which can be any off-the-shelf system that can read logs off of a directory.

This is what I came up with.

What I am still researching before implementing is the performance of logging to S3 this way, since AWS has its own issues. There was a discussion on setting block/file size to improve performance and lower read/write cost. Hope it helps another lost soul.
