
I am using Spark in my application, but I am getting unnecessary logs. How can I disable this logging in a Spark Java application?

Below are the logs I am getting in my console.

.spark.executor.Executor       : Finished task 185.0 in stage 189.0 (TID 4477). 11508 bytes result sent to driver
2017-05-06 10:00:18.767  INFO 3336 --- [er-event-loop-2] o.apache.spark.scheduler.TaskSetManager  : Starting task 188.0 in stage 189.0 (TID 4480, localhost, executor driver, partition 188, ANY, 6317 bytes)
2017-05-06 10:00:18.769  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.769  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.769  INFO 3336 --- [result-getter-1] o.apache.spark.scheduler.TaskSetManager  : Finished task 185.0 in stage 189.0 (TID 4477) in 75 ms on localhost (executor driver) (185/200)
2017-05-06 10:00:18.769  INFO 3336 --- [launch worker-5] org.apache.spark.executor.Executor       : Running task 188.0 in stage 189.0 (TID 4480)
2017-05-06 10:00:18.770  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.770  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.771  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 2 non-empty blocks out of 201 blocks
2017-05-06 10:00:18.771  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.773  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 3 non-empty blocks out of 401 blocks
2017-05-06 10:00:18.773  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.773  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.773  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.773  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.774  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 1 ms
2017-05-06 10:00:18.775  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 2 non-empty blocks out of 201 blocks
2017-05-06 10:00:18.775  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.777  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 3 non-empty blocks out of 401 blocks
2017-05-06 10:00:18.777  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.786  INFO 3336 --- [launch worker-6] org.apache.spark.executor.Executor       : Finished task 182.0 in stage 189.0 (TID 4474). 11508 bytes result sent to driver
2017-05-06 10:00:18.786  INFO 3336 --- [er-event-loop-1] o.apache.spark.scheduler.TaskSetManager  : Starting task 189.0 in stage 189.0 (TID 4481, localhost, executor driver, partition 189, ANY, 6317 bytes)
2017-05-06 10:00:18.787  INFO 3336 --- [result-getter-2] o.apache.spark.scheduler.TaskSetManager  : Finished task 182.0 in stage 189.0 (TID 4474) in 132 ms on localhost (executor driver) (186/200)
2017-05-06 10:00:18.787  INFO 3336 --- [launch worker-6] org.apache.spark.executor.Executor       : Running task 189.0 in stage 189.0 (TID 4481)
2017-05-06 10:00:18.790  INFO 3336 --- [launch worker-5] org.apache.spark.executor.Executor       : Finished task 188.0 in stage 189.0 (TID 4480). 11356 bytes result sent to driver
2017-05-06 10:00:18.790  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.790  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.791  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.791  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.792  INFO 3336 --- [er-event-loop-2] o.apache.spark.scheduler.TaskSetManager  : Starting task 190.0 in stage 189.0 (TID 4482, localhost, executor driver, partition 190, ANY, 6317 bytes)
2017-05-06 10:00:18.792  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 2 non-empty blocks out of 201 blocks
2017-05-06 10:00:18.792  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.794  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 2 non-empty blocks out of 401 blocks
2017-05-06 10:00:18.794  INFO 3336 --- [launch worker-6] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.796  INFO 3336 --- [launch worker-4] org.apache.spark.executor.Executor       : Finished task 187.0 in stage 189.0 (TID 4479). 11356 bytes result sent to driver
2017-05-06 10:00:18.798  INFO 3336 --- [er-event-loop-0] o.apache.spark.scheduler.TaskSetManager  : Starting task 191.0 in stage 189.0 (TID 4483, localhost, executor driver, partition 191, ANY, 6317 bytes)
2017-05-06 10:00:18.798  INFO 3336 --- [launch worker-5] org.apache.spark.executor.Executor       : Running task 190.0 in stage 189.0 (TID 4482)
2017-05-06 10:00:18.798  INFO 3336 --- [result-getter-3] o.apache.spark.scheduler.TaskSetManager  : Finished task 188.0 in stage 189.0 (TID 4480) in 31 ms on localhost (executor driver) (187/200)
2017-05-06 10:00:18.798  INFO 3336 --- [result-getter-3] o.apache.spark.scheduler.TaskSetManager  : Finished task 187.0 in stage 189.0 (TID 4479) in 35 ms on localhost (executor driver) (188/200)
2017-05-06 10:00:18.800  INFO 3336 --- [launch worker-4] org.apache.spark.executor.Executor       : Running task 191.0 in stage 189.0 (TID 4483)
2017-05-06 10:00:18.801  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.801  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.802  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.802  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.803  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.803  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.803  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 201 blocks
2017-05-06 10:00:18.803  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.804  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 1 blocks
2017-05-06 10:00:18.804  INFO 3336 --- [launch worker-4] o.a.s.s.ShuffleBlockFetcherIterator      : Started 0 remote fetches in 0 ms
2017-05-06 10:00:18.804  INFO 3336 --- [launch worker-5] o.a.s.s.ShuffleBlockFetcherIterator      : Getting 1 non-empty blocks out of 401 blocks

Below is my POM file.

<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-data-rest</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web-services</artifactId>
    </dependency>
    <dependency>
        <groupId>com.fasterxml.jackson.core</groupId>
        <artifactId>jackson-databind</artifactId>
    </dependency>
    <dependency>
        <groupId>info.debatty</groupId>
        <artifactId>java-string-similarity</artifactId>
        <version>RELEASE</version>
    </dependency>



    <dependency>
        <groupId>com.univocity</groupId>
        <artifactId>univocity-parsers</artifactId>
        <version>2.3.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>
    <dependency>
        <groupId>org.codehaus.janino</groupId>
        <artifactId>commons-compiler</artifactId>
        <version>2.6.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.1.0</version>
    </dependency>
    <dependency>
        <groupId>com.oracle</groupId>
        <artifactId>ojdbc6</artifactId>
        <version>11.2.0.3</version>
    </dependency>


    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-network-common_2.10</artifactId>
        <version>1.4.0</version>
    </dependency>
    <dependency>
        <groupId>org.codehaus.janino</groupId>
        <artifactId>commons-compiler</artifactId>
        <version>2.7.5</version>
    </dependency>


    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>
</dependencies>

I think you can change the log level with:

sparkContext.setLogLevel("WARN")

You can choose the log level from among:

ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, WARN
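In a Java application that call goes on the JavaSparkContext (or SparkSession) right after you create it. A minimal sketch, assuming a local master and a placeholder app name:

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class QuietSparkApp {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
                .setAppName("QuietSparkApp")  // placeholder app name
                .setMaster("local[*]");       // placeholder master
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Suppress Spark's INFO/DEBUG chatter; only WARN and above remain.
        sc.setLogLevel("WARN");

        // ... your Spark job here ...

        sc.stop();
    }
}
```

Note this only takes effect once the context exists, so the startup lines logged before it are unaffected.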

If the logs appear in spark-shell, you can change the log level in the config file located at conf/log4j.properties (rename conf/log4j.properties.template to create it), then set the log level you want:

log4j.rootCategory=WARN, console

Reopen the shell and you will see less output.
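A minimal conf/log4j.properties along these lines quiets Spark's own chatter while keeping your application's output; the per-logger lines below are a sketch based on the names that appear in the logs above:

```properties
# Only WARN and above go to the console by default
log4j.rootCategory=WARN, console

# Further quiet specific noisy loggers if needed
log4j.logger.org.apache.spark=WARN
log4j.logger.org.apache.spark.scheduler.TaskSetManager=ERROR
log4j.logger.org.apache.spark.executor.Executor=ERROR
```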

Set spark.history.fs.cleaner.enabled to true in spark-defaults.conf. This purges event logs from HDFS after 7 days by default; the retention period can be changed with spark.history.fs.cleaner.maxAge. (Note this affects the Spark History Server's stored event logs, not console output.)
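In conf/spark-defaults.conf this would look like the fragment below (the values shown are the defaults, given here only as an example):

```properties
spark.history.fs.cleaner.enabled   true
# How often the cleaner checks for old event logs
spark.history.fs.cleaner.interval  1d
# Event logs older than this are deleted
spark.history.fs.cleaner.maxAge    7d
```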

I use the code below in Scala:

Logger.getLogger("org").setLevel(Level.OFF)
Logger.getLogger("akka").setLevel(Level.WARN)

You can do something similar in Java.
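A sketch of the Java equivalent, using the log4j 1.x API that Spark 2.1 ships with (call it before creating the context; the class and method names are placeholders):

```java
import org.apache.log4j.Level;
import org.apache.log4j.Logger;

public class LogSilencer {
    public static void silence() {
        // Turn off all output from org.* loggers (covers org.apache.spark)
        Logger.getLogger("org").setLevel(Level.OFF);
        // Keep warnings from Akka, mirroring the Scala snippet
        Logger.getLogger("akka").setLevel(Level.WARN);
    }
}
```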
