
elasticsearch high cpu usage

I have a cluster of 5 nodes, each with 1 replica. The total data size is 216 MB in 853,000 documents. My CPU usage is too high: around 60%~80%, both every hour on the hour and every early morning from about 05:00 to 09:00. Only elasticsearch runs on this server.

I think something is wrong with the es process, but there are only a few server requests during the CPU peaks, and there aren't even any cron jobs running.

Every hour, and every early morning around 05:00~09:00. I don't know what elasticsearch is doing at those times! Can someone help me and tell me what is going on there? Please..

$ ./elasticsearch -v 
Version: 1.1.1, Build: f1585f0/2014-04-16T14:27:12Z, JVM: 1.7.0_55 

$ java -version 
java version "1.7.0_55" 
Java(TM) SE Runtime Environment (build 1.7.0_55-b13) 
Java HotSpot(TM) 64-Bit Server VM (build 24.55-b03, mixed mode) 

I have installed these plugins on elasticsearch: HQ, bigdesk, head, kopf, sense

es logs at CPU peak time:

[2014-07-03 08:01:00,045][DEBUG][action.search.type       ] [node1] [search][4], node[GJjzCrLvQQ-ZRRoqL13MrQ], [P], s[STARTED]: Failed to execute [org.elasticsearch.action.search.SearchRequest@451f9e7c] lastShard [true] 
org.elasticsearch.common.util.concurrent.EsRejectedExecutionException: rejected execution (queue capacity 300) on org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction$4@68ab486b 
    at org.elasticsearch.common.util.concurrent.EsAbortPolicy.rejectedExecution(EsAbortPolicy.java:62) 
    at java.util.concurrent.ThreadPoolExecutor.reject(Unknown Source) 
    at java.util.concurrent.ThreadPoolExecutor.execute(Unknown Source) 
    at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.onFirstPhaseResult(TransportSearchTypeAction.java:293) 
    at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.onFirstPhaseResult(TransportSearchTypeAction.java:300) 
    at org.elasticsearch.action.search.type.TransportSearchTypeAction$BaseAsyncAction.start(TransportSearchTypeAction.java:190) 
    at org.elasticsearch.action.search.type.TransportSearchQueryThenFetchAction.doExecute(TransportSearchQueryThenFetchAction.java:59) 
    at org.elasticsearch.action.search.type.TransportSearchQueryThenFetchAction.doExecute(TransportSearchQueryThenFetchAction.java:49) 
    at org.elasticsearch.action.support.TransportAction.execute(TransportAction.java:63) 
    at org.elasticsearch.action.search.TransportSearchAction.doExecute(TransportSearchAction.java:108) 
    at org.elasticsearch.action.search.TransportSearchAction.doExecute(TransportSearchAction.java:43) 
    at org.elasticsearch.action.support.TransportAction.execute(TransportAction.java:63) 
    at org.elasticsearch.client.node.NodeClient.execute(NodeClient.java:92) 
    at org.elasticsearch.client.support.AbstractClient.search(AbstractClient.java:212) 
    at org.elasticsearch.rest.action.search.RestSearchAction.handleRequest(RestSearchAction.java:98) 
    at org.elasticsearch.rest.RestController.executeHandler(RestController.java:159) 
    at org.elasticsearch.rest.RestController.dispatchRequest(RestController.java:142) 
    at org.elasticsearch.http.HttpServer.internalDispatchRequest(HttpServer.java:121) 
    at org.elasticsearch.http.HttpServer$Dispatcher.dispatchRequest(HttpServer.java:83) 
    at org.elasticsearch.http.netty.NettyHttpServerTransport.dispatchRequest(NettyHttpServerTransport.java:291) 
    at org.elasticsearch.http.netty.HttpRequestHandler.messageReceived(HttpRequestHandler.java:43) 
    at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) 
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) 
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) 
    at org.elasticsearch.common.netty.handler.codec.http.HttpChunkAggregator.messageReceived(HttpChunkAggregator.java:145) 
    at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) 
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) 
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) 
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:296) 
    at org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(FrameDecoder.java:459) 
    at org.elasticsearch.common.netty.handler.codec.replay.ReplayingDecoder.callDecode(ReplayingDecoder.java:536) 
    at org.elasticsearch.common.netty.handler.codec.replay.ReplayingDecoder.messageReceived(ReplayingDecoder.java:435) 
    at org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(SimpleChannelUpstreamHandler.java:70) 
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) 
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(DefaultChannelPipeline.java:791) 
    at org.elasticsearch.common.netty.OpenChannelsHandler.handleUpstream(OpenChannelsHandler.java:74) 
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:564) 
    at org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(DefaultChannelPipeline.java:559) 
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:268) 
    at org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(Channels.java:255) 
    at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.read(NioWorker.java:88) 
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.process(AbstractNioWorker.java:108) 
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:318) 
    at org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(AbstractNioWorker.java:89) 
    at org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(NioWorker.java:178) 
    at org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(ThreadRenamingRunnable.java:108) 
    at org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(DeadLockProofWorker.java:42) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source) 
    at java.lang.Thread.run(Unknown Source)

Are you 100% sure that only a few requests were executing when this happened?

The log indicates that so many queries were running that new ones were being rejected, and I would expect bigdesk to show that large volume of queries.
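A quick way to confirm this is to watch the search thread pool's rejection counters on each node while a peak is happening. On a 1.x cluster the _cat API should expose them; the column names below follow the 1.x docs, so verify them against your version:

$ curl 'localhost:9200/_cat/thread_pool?v&h=host,search.active,search.queue,search.rejected'

If search.rejected keeps climbing during the 05:00~09:00 window, the queue capacity of 300 mentioned in the log is being exhausted. Raising threadpool.search.queue_size in elasticsearch.yml would only hide the symptom; finding the source of the queries is the better fix.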

There must be some kind of batch/automated process flooding it with queries. Been there, done that a few times.
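One rough way to see where such a flood is coming from (assuming elasticsearch's default HTTP port 9200) is to count the client addresses connected to a node during a peak:

$ # established connections to port 9200, grouped by client IP
$ netstat -tn | grep ':9200' | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -rn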

You should check the index slow log, and possibly tune the thresholds down so that most queries (even short ones) get logged. See here for more details: http://www.elasticsearch.org/guide/en/elasticsearch/reference/current/index-modules-slowlog.html
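As a sketch, the thresholds can be set per index in elasticsearch.yml or changed at runtime through the index settings API. The 100ms value below is just an illustrative choice to catch nearly every query during the peak window, and 'myindex' is a placeholder for your index name:

$ # temporarily log every query slower than 100ms at INFO level
$ curl -XPUT 'localhost:9200/myindex/_settings' -d '{
    "index.search.slowlog.threshold.query.info": "100ms",
    "index.search.slowlog.threshold.fetch.info": "200ms"
  }'

Remember to raise the thresholds back once you have identified the queries, or the slow log itself will add I/O load.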
