
How to get PHP API logs into Kafka in binary format

Let's assume we have a PHP API running under the Apache web server. I'd like to ask how logs could be shipped from PHP to Kafka in a structured format (say Avro or Protobuf, since the messages would carry specific business contracts, not just plain Apache access logs). PHP is not my area of expertise, so this may be naive, but one solution could simply be to send messages to Kafka from PHP itself. Given that it runs under a web server, though, it would most likely have to open a new connection to Kafka on every request. Another possible solution would be to log to a file and then use something like Filebeat to ship the logs to Kafka. But I'm not sure how that would work with a structured format, as I've already mentioned: I doubt Filebeat would be able to read a binary log file. Please let me know if it can. So my question is: what is the best way to get PHP-specific logs into Kafka in a rigid and scalable manner? Thanks in advance.
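To make the file-based option concrete, here is a minimal sketch of writing structured (JSON-lines) logs from PHP using the Monolog library, one event per line, which a shipper can then pick up. This assumes Monolog is installed via Composer; the log path and event fields are hypothetical.

```php
<?php
// Sketch: structured JSON-lines logging with Monolog
// (assumed installed via `composer require monolog/monolog`).
require 'vendor/autoload.php';

use Monolog\Logger;
use Monolog\Handler\StreamHandler;
use Monolog\Formatter\JsonFormatter;

// Hypothetical log path; JsonFormatter emits one JSON object per line
$handler = new StreamHandler('/var/log/app/events.json.log');
$handler->setFormatter(new JsonFormatter());

$log = new Logger('api');
$log->pushHandler($handler);

// A business event with structured context, not just an access-log line
$log->info('order_created', ['order_id' => 123, 'amount_cents' => 4990]);
```

JSON lines sidestep the "binary log file" problem: the file stays plaintext and restart-safe, and conversion to Avro/Protobuf can happen downstream.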

Filebeat would not read binary logs. It would be used to parse your plaintext messages into some more structured/binary format before sending them to Kafka (I'm not certain Filebeat's Kafka serialization is pluggable, though).
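As a sketch of that pipeline, a Filebeat configuration can tail JSON-lines files and produce each decoded event to Kafka as a compact JSON document (note that Filebeat's Kafka output serializes to JSON or plain text, not Avro/Protobuf). The paths, broker address, and topic below are hypothetical.

```yaml
# Sketch of a filebeat.yml; paths, hosts, and topic are hypothetical.
filebeat.inputs:
  - type: filestream
    paths:
      - /var/log/app/*.json.log
    parsers:
      - ndjson:
          target: ""        # merge decoded JSON fields into the event root

output.kafka:
  hosts: ["kafka1:9092"]
  topic: "php-api-logs"
  codec.json:
    pretty: false           # one compact JSON document per Kafka message
```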

On the other hand, yes, there's a php-librdkafka library that you can try, but as you said, it would require some persistent connection to Kafka. Compare that with a kind of "two-commit" system built on files, where every line of the file tracks progress, so the pipeline can withstand process restarts due to Kafka connection errors, for example.
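For reference, producing directly from PHP via the php-rdkafka extension (a wrapper around librdkafka) looks roughly like this; the broker address, topic, and payload are hypothetical, and in a mod_php/FPM worker the producer would ideally be reused across requests rather than rebuilt per call.

```php
<?php
// Sketch using the php-rdkafka extension (wraps librdkafka).
// Broker and topic names are hypothetical.
$conf = new RdKafka\Conf();
$conf->set('bootstrap.servers', 'kafka1:9092');

$producer = new RdKafka\Producer($conf);
$topic = $producer->newTopic('php-api-logs');

// The payload could equally be Avro/Protobuf-encoded bytes
$payload = json_encode(['event' => 'order_created', 'order_id' => 123]);
$topic->produce(RD_KAFKA_PARTITION_UA, 0, $payload);

$producer->poll(0);        // serve delivery-report callbacks
$producer->flush(10000);   // wait up to 10 s for delivery before exiting
```

The `flush()` call matters under a web server: without it, a short-lived request process may exit before librdkafka's background thread has actually delivered the message.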

Fluentbit or rsyslog, or (ideally, if you are truly doing log processing) the Splunk forwarder, would also work...

Disclaimer: the technical posts on this site are licensed under CC BY-SA 4.0. If you need to repost, please credit this site or the original source. For any questions, contact: yoyou2525@163.com.
