
Not able to get the events from Log4J into Flume

I am trying to get events from Log4J 1.x into HDFS through Flume, using the Log4J Flume appender. I created two appenders, FILE and flume. It works for the FILE appender, but with the flume appender the program just hangs in Eclipse. Flume itself works fine: I can send messages to the Avro source using the avro client and see them in HDFS. However, it is not integrating with Log4J 1.x.

I don't see any exceptions, apart from the following in log.out:

Batch size string = null
Using Netty bootstrap options: {tcpNoDelay=true, connectTimeoutMillis=20000}
Connecting to localhost/127.0.0.1:41414
[id: 0x52a00770] OPEN

and from the Flume console:

2013-10-23 14:32:32,145 (pool-5-thread-1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x577cf6e4, /127.0.0.1:46037 => /127.0.0.1:41414] OPEN
2013-10-23 14:32:32,148 (pool-6-thread-1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x577cf6e4, /127.0.0.1:46037 => /127.0.0.1:41414] BOUND: /127.0.0.1:41414
2013-10-23 14:32:32,148 (pool-6-thread-1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x577cf6e4, /127.0.0.1:46037 => /127.0.0.1:41414] CONNECTED: /127.0.0.1:46037
2013-10-23 14:32:43,086 (pool-6-thread-1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x577cf6e4, /127.0.0.1:46037 :> /127.0.0.1:41414] DISCONNECTED
2013-10-23 14:32:43,096 (pool-6-thread-1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x577cf6e4, /127.0.0.1:46037 :> /127.0.0.1:41414] UNBOUND
2013-10-23 14:32:43,096 (pool-6-thread-1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x577cf6e4, /127.0.0.1:46037 :> /127.0.0.1:41414] CLOSED
2013-10-23 14:32:43,097 (pool-6-thread-1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.channelClosed(NettyServer.java:209)] Connection to /127.0.0.1:46037 disconnected.

In case it helps, I ran the program in debug mode, and when it hung I suspended it and took a stack trace. I tried to look through the code, but I am not sure why the program hangs with the flume appender.

Daemon Thread [Avro NettyTransceiver I/O Worker 1] (Suspended)  
Logger(Category).callAppenders(LoggingEvent) line: 205  
Logger(Category).forcedLog(String, Priority, Object, Throwable) line: 391  
Logger(Category).log(String, Priority, Object, Throwable) line: 856  
Log4jLoggerAdapter.debug(String) line: 209  
NettyTransceiver$NettyClientAvroHandler.handleUpstream(ChannelHandlerContext, ChannelEvent) line: 491  
DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline$DefaultChannelHandlerContext, ChannelEvent) line: 564  
DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(ChannelEvent) line: 792  
NettyTransportCodec$NettyFrameDecoder(SimpleChannelUpstreamHandler).channelBound(ChannelHandlerContext, ChannelStateEvent) line: 166  
NettyTransportCodec$NettyFrameDecoder(SimpleChannelUpstreamHandler).handleUpstream(ChannelHandlerContext, ChannelEvent) line: 98  
DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline$DefaultChannelHandlerContext, ChannelEvent) line: 564  
DefaultChannelPipeline.sendUpstream(ChannelEvent) line: 559  
Channels.fireChannelBound(Channel, SocketAddress) line: 199  
NioWorker$RegisterTask.run() line: 191  
NioWorker(AbstractNioWorker).processRegisterTaskQueue() line: 329  
NioWorker(AbstractNioWorker).run() line: 235  
NioWorker.run() line: 38  
DeadLockProofWorker$1.run() line: 42  
ThreadPoolExecutor.runWorker(ThreadPoolExecutor$Worker) line: 1145  
ThreadPoolExecutor$Worker.run() line: 615  
Thread.run() line: 744

Here is the Java program:

import java.io.IOException;
import java.sql.SQLException;
import org.apache.log4j.Logger;
public class log4jExample {
    static Logger log = Logger.getRootLogger();

    public static void main(String[] args) throws IOException, SQLException {
        log.debug("Hello, this is a debug message");
    }
}

Here is log4j.properties:

# Define the root logger with appender file
log = /home/vm4learning/WorkSpace/BigData/Log4J-Example/log
log4j.rootLogger = DEBUG, FILE, flume

# Define the file appender
log4j.appender.FILE=org.apache.log4j.FileAppender
log4j.appender.FILE.File=${log}/log.out
log4j.appender.FILE.layout=org.apache.log4j.PatternLayout
log4j.appender.FILE.layout.ConversionPattern=%m%n

# Define the flume appender
log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname = localhost
log4j.appender.flume.Port = 41414
log4j.appender.flume.UnsafeMode = false
log4j.appender.flume.layout=org.apache.log4j.PatternLayout
log4j.appender.flume.layout.ConversionPattern=%m%n

Here are the dependencies in Eclipse:

flume-ng-log4jappender-1.4.0.jar
log4j-1.2.17.jar
flume-ng-sdk-1.4.0.jar
avro-1.7.3.jar
netty-3.4.0.Final.jar
avro-ipc-1.7.3.jar
slf4j-api-1.6.1.jar
slf4j-log4j12-1.6.1.jar

Here are the contents of flume.conf:

# Tell agent1 which ones we want to activate.
agent1.channels = ch1
agent1.sources = avro-source1
agent1.sinks = hdfs-sink1

# Define a memory channel called ch1 on agent1
agent1.channels.ch1.type = memory

# Define an Avro source called avro-source1 on agent1 and tell it
# to bind to 0.0.0.0:41414. Connect it to channel ch1.
agent1.sources.avro-source1.type = avro
agent1.sources.avro-source1.bind = 0.0.0.0
agent1.sources.avro-source1.port = 41414

# Define an HDFS sink called hdfs-sink1 and connect it to the other end
# of the same channel.
agent1.sinks.hdfs-sink1.type = hdfs
agent1.sinks.hdfs-sink1.hdfs.path = hdfs://localhost:9000/flume/events/

agent1.sinks.hdfs-sink1.channel = ch1
agent1.sources.avro-source1.channels = ch1

How can I fix this issue?

My guess is that you are trying to log Flume's own events through Flume. I have seen this issue with other appenders, but not with Log4j 1.x.

I would consider modifying log4j.properties to exclude the Flume, Netty and Avro events and see whether that fixes the problem.
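For example, a minimal sketch of such an exclusion in log4j.properties, assuming the logger names that correspond to the packages visible in the stack trace above (org.apache.flume, org.apache.avro, and Netty 3's org.jboss.netty):

```properties
# Route Flume/Avro/Netty internal logging to the FILE appender only, so the
# flume appender never tries to ship its own transport events over Avro.
log4j.logger.org.apache.flume = INFO, FILE
log4j.additivity.org.apache.flume = false
log4j.logger.org.apache.avro = INFO, FILE
log4j.additivity.org.apache.avro = false
log4j.logger.org.jboss.netty = INFO, FILE
log4j.additivity.org.jboss.netty = false
```

With additivity disabled, events from these packages stop propagating to the root logger, so they never reach the flume appender.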

I once had a similar problem using the Flume appender with log4j: the program would hang whenever I tried to instantiate a Logger object. I remember the problem was that I did not have all the required libraries on the classpath; once I added them, it worked fine.

I suggest you start with Mike Percy's simple example. Although the pom.xml there builds a JAR with all the dependencies bundled, editing it to copy the dependency jar files into a separate directory instead gives the following list:

avro-1.7.4.jar
avro-ipc-1.7.4.jar
commons-codec-1.3.jar
commons-collections-3.2.1.jar
commons-compress-1.4.1.jar
commons-lang-2.5.jar
commons-logging-1.1.1.jar
flume-ng-log4jappender-1.4.0-cdh4.5.0.jar
flume-ng-sdk-1.4.0-cdh4.5.0.jar
hamcrest-core-1.1.jar
httpclient-4.0.1.jar
httpcore-4.0.1.jar
jackson-core-asl-1.8.8.jar
jackson-mapper-asl-1.8.8.jar
jetty-6.1.26.jar
jetty-util-6.1.26.jar
junit-4.10.jar
libthrift-0.7.0.jar
log4j-1.2.16.jar
netty-3.5.0.Final.jar
paranamer-2.3.jar
slf4j-api-1.7.2.jar
slf4j-jdk14-1.7.2.jar
snappy-java-1.0.4.1.jar
velocity-1.7.jar
xz-1.0.jar

Some of these libraries (junit, for example) may not really be needed, but I suggest starting with all of them to see whether you can get the example working, and then trying to determine the minimal set.

I had a similar problem, and the solution was:

  • Change the log4j.properties root logger from DEBUG to INFO level.

But I don't know what is happening inside flume-ng. I am trying to debug it. If anyone knows, please tell me ~~
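Applied to the log4j.properties posted in the question, that bullet amounts to a one-line change:

```properties
log4j.rootLogger = INFO, FILE, flume
```

At INFO, the DEBUG-level transport messages that the Avro/Netty client emits (such as the Log4jLoggerAdapter.debug call visible in the stack trace) are no longer routed back through the flume appender, which would be consistent with the recursion theory in the first answer.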


Note: the technical posts on this site follow the CC BY-SA 4.0 license; if you need to republish, please credit this site or the original source. For any questions contact: yoyou2525@163.com.

 