
Solr logs not being pushed directly to kafka, Solr cannot connect to ZK

I am trying to send logs from Solr directly to Kafka using log4j. While the logs are printed to stdout, no data arrives in Kafka. I am able to push data to Kafka with the command-line producer.

The warning and error I am getting:

WARN  - 2015-01-19 12:09:25.545; org.apache.solr.cloud.Overseer$ClusterStateUpdater; Solr cannot talk to ZK, exiting Overseer main queue loop                                                                                                                             
INFO  - 2015-01-19 12:09:25.552; org.apache.solr.cloud.Overseer$ClusterStateUpdater; Overseer Loop exiting : 10.254.120.50:8900_solr 
WARN  - 2015-01-19 12:09:25.554; org.apache.solr.common.cloud.ZkStateReader$2; ZooKeeper watch triggered, but Solr cannot talk to ZK 
ERROR - 2015-01-19 12:09:25.560; org.apache.solr.cloud.Overseer$ClusterStateUpdater; could not read the data                         
org.apache.zookeeper.KeeperException$SessionExpiredException: KeeperErrorCode = Session expired for /overseer_elect/leader     

My log4j.properties file:

    solr.log=/home/solradmin/solr/latest/logs/
    log4j.rootLogger=INFO, file, KAFKA                                                                                                   
    log4j.logger.KAFKA=INFO, file                                                                                                        
    log4j.logger.solr=INFO, KAFKA  

    log4j.appender.stdout=org.apache.log4j.ConsoleAppender                                                       
    log4j.appender.stdout.layout=org.apache.log4j.PatternLayout                                                  
    log4j.appender.stdout.layout.ConversionPattern=%5p [%t] (%F:%L) - %m%n 

    log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender                                                                       
    log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout                                                   
    log4j.appender.KAFKA.layout.ConversionPattern=%-5p: %c - %m%n                                                                        
    log4j.appender.KAFKA.BrokerList=localhost:9092                                                                                       
    log4j.appender.KAFKA.Topic=herpderp                                                                               

    log4j.appender.file=org.apache.log4j.RollingFileAppender                                                                             
    log4j.appender.file.MaxFileSize=100MB                                                                                                
    log4j.appender.file.MaxBackupIndex=9   
    log4j.appender.file.File=${solr.log}/solr.log
    log4j.appender.file.layout=org.apache.log4j.PatternLayout                                                                            
    log4j.appender.file.layout.ConversionPattern=%-5p - %d{yyyy-MM-dd HH:mm:ss.SSS}; %C; %m\n         
    log4j.logger.org.apache.solr=DEBUG                                                                                                   
    log4j.logger.org.apache.zookeeper=WARN                                                                                       
    log4j.logger.org.apache.hadoop=WARN    

The log4j documentation does not list Kafka as a supported appender, yet the Kafka documentation shows that log4j is easy to configure.

Does log4j require some sort of plugin to support Kafka?

I tried different configurations using the following sources: http://kafka.apache.org/07/quickstart.html and "KafkLog4JAppender not pushing application logs to kafka topic".

Make sure the root logger doesn't log to itself: the Kafka producer behind KafkaLog4jAppender logs through log4j as well, so attaching the KAFKA appender to the root logger routes the producer's own messages back into the appender.

#bad
log4j.rootLogger=INFO, file, KAFKA                                                                                                   
#good
log4j.rootLogger=INFO, file 

My full config:

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%5p [%t] (%F:%L) - %m%n

log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=%-5p: %c - %m%n
log4j.appender.KAFKA.BrokerList=kafka1.example.com:6667,kafka2.example.com:6667,kafka3.example.com:6667
log4j.appender.KAFKA.Topic=foobar
log4j.rootLogger=DEBUG,file

log4j.appender.file=org.apache.log4j.RollingFileAppender
log4j.appender.file.MaxFileSize=100MB
log4j.appender.file.MaxBackupIndex=9

solr.log=/home/solradmin/solr/latest/logs
#- File to log to and log format
log4j.appender.file.File=${solr.log}/solr.log
log4j.appender.file.layout=org.apache.log4j.PatternLayout
log4j.appender.file.layout.ConversionPattern=%-5p - %d{yyyy-MM-dd HH:mm:ss.SSS}; %C; %m\n
log4j.logger.org.apache.solr=DEBUG,KAFKA
log4j.logger.org.apache.zookeeper=WARN,KAFKA
log4j.logger.org.apache.hadoop=WARN

# set to INFO to enable infostream log messages
log4j.logger.org.apache.solr.update.LoggingInfoStream=OFF 
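If you want to double-check that this wiring took effect, a throwaway class along these lines (AppenderCheck is my own name, not anything from Solr or Kafka) will print which appenders log4j actually attached to each logger, assuming log4j 1.2 and the same log4j.properties are on the classpath:

import java.util.Enumeration;

import org.apache.log4j.Appender;
import org.apache.log4j.Logger;

public class AppenderCheck
{
    public static void main( String[] args )
    {
        // The root logger should only report the "file" appender.
        printAppenders(Logger.getRootLogger());
        // The org.apache.solr logger should report "KAFKA" attached directly
        // (getAllAppenders does not list inherited appenders).
        printAppenders(Logger.getLogger("org.apache.solr"));
    }

    private static void printAppenders(Logger logger)
    {
        System.out.println("Appenders on '" + logger.getName() + "':");
        for (Enumeration<?> e = logger.getAllAppenders(); e.hasMoreElements();)
        {
            Appender a = (Appender) e.nextElement();
            System.out.println("  " + a.getName() + " (" + a.getClass().getName() + ")");
        }
    }
}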

Java application:

package nd.KafkaTest;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

/**
 * Simple test app: logs 100 debug/info messages through slf4j so the
 * KAFKA appender has traffic to push.
 */
public class App
{
    public static void main( String[] args )
    {
        Logger logger = LoggerFactory.getLogger(App.class.getName());
        System.out.println( "Hello World!" );
        int i = 100;
        while (i > 0)
        {
            logger.debug("Debugging!." + i);
            logger.info("Exiting application." + i);
            i--;
        }
        System.out.println("here you go");
    }
}
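To confirm the messages actually land in the topic, you can tail it with the old high-level consumer that comes with the kafka_2.9.2 0.8.2.1 dependency in the pom below. This is only a sketch of mine (the class name, group id, and the localhost ZooKeeper address are assumptions); point zookeeper.connect and the topic name at your environment:

package nd.KafkaTest;

import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;

import kafka.consumer.Consumer;
import kafka.consumer.ConsumerConfig;
import kafka.consumer.ConsumerIterator;
import kafka.consumer.KafkaStream;
import kafka.javaapi.consumer.ConsumerConnector;

public class TopicTail
{
    public static void main( String[] args )
    {
        Properties props = new Properties();
        props.put("zookeeper.connect", "localhost:2181"); // assumption: local ZooKeeper
        props.put("group.id", "log4j-check");             // arbitrary consumer group
        props.put("auto.offset.reset", "smallest");       // read the topic from the beginning

        ConsumerConnector connector =
                Consumer.createJavaConsumerConnector(new ConsumerConfig(props));

        // One stream for the topic the appender writes to ("foobar" in the config above).
        Map<String, Integer> topicCount = new HashMap<String, Integer>();
        topicCount.put("foobar", 1);
        Map<String, List<KafkaStream<byte[], byte[]>>> streams =
                connector.createMessageStreams(topicCount);

        ConsumerIterator<byte[], byte[]> it = streams.get("foobar").get(0).iterator();
        while (it.hasNext())
        {
            System.out.println(new String(it.next().message()));
        }
    }
}

If the file appender is writing but nothing shows up here, the problem is between the appender and the broker rather than in the logger configuration.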

pom.xml:

<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>

  <groupId>nd</groupId>
  <artifactId>KafkaTest</artifactId>
  <version>0.0.1-SNAPSHOT</version>
  <packaging>jar</packaging>

  <name>KafkaTest</name>
   <url>http://maven.apache.org</url>
<build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-jar-plugin</artifactId>
        <configuration>
          <archive>
            <manifest>
              <addClasspath>true</addClasspath>
              <mainClass>nd.KafkaTest.App</mainClass>
            </manifest>
          </archive>
        </configuration>
      </plugin>
    </plugins>
  </build>
  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>

  <dependencies>
   <dependency>
              <groupId>org.apache.kafka</groupId>
              <artifactId>kafka_2.9.2</artifactId>
              <version>0.8.2.1</version>
       </dependency>

       <dependency>
              <groupId>org.slf4j</groupId>
              <artifactId>slf4j-api</artifactId>
              <version>1.7.6</version>
       </dependency>
<!--        <dependency> -->
<!--               <groupId>org.slf4j</groupId> -->
<!--               <artifactId>slf4j-log4j12</artifactId> -->
<!--               <version>1.7.12</version> -->
<!--        </dependency>  -->
       <dependency>
              <groupId>log4j</groupId>
              <artifactId>log4j</artifactId>
              <version>1.2.17</version>
       </dependency>


  </dependencies>
</project>

Note that Solr doesn't call log4j directly; it logs through slf4j, a facade that is bound to log4j here (slf4j-log4j12 in the jar list below).

./bin
jcl-over-slf4j-1.6.6.jar  jul-to-slf4j-1.6.6.jar  kafka_2.10-0.8.2.1.jar     log4j-1.2.16.jar  scala-library-2.9.2.jar  slf4j-log4j12-1.7.6.jar
jcl-over-slf4j-1.7.6.jar  jul-to-slf4j-1.7.6.jar  kafka-clients-0.8.2.1.jar  log4j.properties  slf4j-api-1.7.6.jar

Credit: Meet Rajdev and I are coworkers and worked on this together.
