How to solve the 'Lock obtain timed out' error when using Solr?
I have two cores in our Solr system (Solr version 3.6.1). When I invoke the following command line on our dedicated Solr server to add and then index a file:
java -Durl=http://solrprod:8080/solr/original/update -jar /home/solr/solr3/biomina/solr/post.jar /home/solr/tmp/2008/c2m-dump-01.noDEID_clean.xml
I get an exception in the /usr/share/tomcat7/logs/solr.2013-12-11.log file (after about 6 minutes of waiting):
SEVERE: org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out: NativeFSLock@/home/solr/solr3/biomina/solr/original/data/index/write.lock
(You can see the detailed output at the end of this message.)
I tried to increase the time-out for locks (by setting writeLockTimeout to 300000), but this did not solve the problem. I'm not using any custom script, just the post.jar that comes with Solr 3.6.1, to add and index.
Any ideas about what needs to be changed to get rid of this error and successfully add the XML file above to Solr and index it?
Contents of /home/solr/solr3/biomina/solr/solr.xml:
<?xml version="1.0" encoding="UTF-8" ?>
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<!--
All (relative) paths are relative to the installation path
persistent: Save changes made via the API to this file
sharedLib: path to a lib directory that will be shared across all cores
-->
<solr persistent="true">
<!--
adminPath: RequestHandler path to manage cores.
If 'null' (or absent), cores will not be manageable via request handler
-->
<cores adminPath="/admin/cores">
<core name="original" instanceDir="original" />
<core name="deidentified" instanceDir="deidentified" />
</cores>
</solr>
Relevant part of solrconfig.xml (for the core named original):
<indexConfig>
<!-- maxFieldLength specifies max number of *tokens* indexed per
field. Default: 10000 -->
<!-- <maxFieldLength>10000</maxFieldLength> -->
<!-- Maximum time to wait for a write lock (ms) for an IndexWriter.
Default: 1000 -->
<writeLockTimeout>300000</writeLockTimeout>
</indexConfig>
Relevant part of solrconfig.xml (for the core named deidentified):
<indexConfig>
<!-- maxFieldLength specifies max number of *tokens* indexed per
field. Default: 10000 -->
<!-- <maxFieldLength>10000</maxFieldLength> -->
<!-- Maximum time to wait for a write lock (ms) for an IndexWriter.
Default: 1000 -->
<writeLockTimeout>300000</writeLockTimeout>
</indexConfig>
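For reference, post.jar simply sends the file to the core's /update handler over HTTP, so the same request can be made with curl. This is a sketch using the host, core, and file path from the question; adjust them to your deployment:

```shell
# Host/core/file taken from the question; change to match your setup.
SOLR_URL='http://solrprod:8080/solr/original/update?commit=true'
XML_FILE=/home/solr/tmp/2008/c2m-dump-01.noDEID_clean.xml

# POST the XML to the update handler; commit=true makes the documents
# searchable immediately. --fail turns HTTP errors into a non-zero exit.
if curl --fail -H 'Content-Type: text/xml' \
        --data-binary "@$XML_FILE" "$SOLR_URL"; then
    echo "indexed $XML_FILE"
else
    echo "update failed for $XML_FILE" >&2
fi
```

Using curl this way can help rule out post.jar itself when diagnosing the timeout.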
Detailed output of the exception:
Dec 11, 2013 11:27:25 AM org.apache.solr.core.SolrCore execute
INFO: [original] webapp=/solr path=/update params={} status=500 QTime=300070
Dec 11, 2013 11:32:25 AM org.apache.solr.common.SolrException log
SEVERE: org.apache.lucene.store.LockObtainFailedException: Lock obtain timed out: NativeFSLock@/home/solr/solr3/biomina/solr/original/data/index/write.lock
at org.apache.lucene.store.Lock.obtain(Lock.java:84)
at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:1098)
at org.apache.solr.update.SolrIndexWriter.<init>(SolrIndexWriter.java:84)
at org.apache.solr.update.UpdateHandler.createMainIndexWriter(UpdateHandler.java:101)
at org.apache.solr.update.DirectUpdateHandler2.openWriter(DirectUpdateHandler2.java:171)
at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:219)
at org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:61)
at org.apache.solr.update.processor.LogUpdateProcessor.processAdd(LogUpdateProcessorFactory.java:115)
at org.apache.solr.handler.XMLLoader.processUpdate(XMLLoader.java:157)
at org.apache.solr.handler.XMLLoader.load(XMLLoader.java:79)
at org.apache.solr.handler.ContentStreamHandlerBase.handleRequestBody(ContentStreamHandlerBase.java:58)
at org.apache.solr.handler.RequestHandlerBase.handleRequest(RequestHandlerBase.java:129)
at org.apache.solr.core.SolrCore.execute(SolrCore.java:1376)
at org.apache.solr.servlet.SolrDispatchFilter.execute(SolrDispatchFilter.java:365)
at org.apache.solr.servlet.SolrDispatchFilter.doFilter(SolrDispatchFilter.java:260)
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:243)
at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:210)
at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:222)
at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:123)
at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502)
at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:171)
at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:99)
at org.apache.catalina.valves.AccessLogValve.invoke(AccessLogValve.java:953)
at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:118)
at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:408)
at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1023)
at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:589)
at org.apache.tomcat.util.net.JIoEndpoint$SocketProcessor.run(JIoEndpoint.java:310)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1156)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:626)
at java.lang.Thread.run(Thread.java:804)
Dec 11, 2013 11:32:25 AM org.apache.solr.core.SolrCore execute
INFO: [original] webapp=/solr path=/update params={} status=500 QTime=556916
System details:
uname -a
Linux solrprod 3.0.93-0.8-default #1 SMP Tue Aug 27 08:44:18 UTC 2013 (70ed288) x86_64 x86_64 x86_64 GNU/Linux
java -version
java version "1.7.0"
Java(TM) SE Runtime Environment (build pxa6470sr6-20131015_01(SR6))
IBM J9 VM (build 2.6, JRE 1.7.0 Linux amd64-64 Compressed References 20131013_170512 (JIT enabled, AOT enabled)
J9VM - R26_Java726_SR6_20131013_1510_B170512
JIT - r11.b05_20131003_47443
GC - R26_Java726_SR6_20131013_1510_B170512_CMPRSS
J9CL - 20131013_170512)
JCL - 20131011_01 based on Oracle 7u45-b18
The following modifications solved the issue:
Applied the changes described at https://stackoverflow.com/a/3035916/236007
Switched to the Oracle Java runtime (it was the IBM Java runtime).
Put ulimit -v unlimited in /etc/init.d/tomcat7.
Modified the /usr/share/tomcat7/bin/setenv.sh file as follows (giving it about 4 GB of memory):
export JAVA_OPTS="$JAVA_OPTS -Xmx4000m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/mnt/data/tomcat_dump"
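As an alternative to editing /etc/init.d/tomcat7, both Tomcat-side changes can live together in setenv.sh. This is a sketch assuming the Tomcat 7 layout from the question; the heap-dump flags are HotSpot (Oracle JVM) options, which matches the switch to the Oracle runtime above:

```shell
# /usr/share/tomcat7/bin/setenv.sh
# Raise the virtual-memory limit so the JVM can map the index files.
# (Raising a hard limit may require root; ignore failure if unprivileged.)
ulimit -v unlimited 2>/dev/null || true

# Give the heap ~4 GB and dump it on OutOfMemoryError for later analysis.
export JAVA_OPTS="$JAVA_OPTS -Xmx4000m \
  -XX:+HeapDumpOnOutOfMemoryError \
  -XX:HeapDumpPath=/mnt/data/tomcat_dump"
```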
I got Lock obtain timed out issues for the write.lock file. It occurred after a pretty hard reset, and it turned out that after the restart I had two processes running, since the first one had been killed ungracefully. Running ps aux | grep solr, killing the stale process, and letting the other one start up solved the issue.
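A sketch of that check, assuming the index path from the question; the rm is left commented out because the lock must only be removed once no writer still holds it:

```shell
# Index directory of the affected core (from the question; adjust as needed).
INDEX_DIR=/home/solr/solr3/biomina/solr/original/data/index

# 1. Look for leftover Solr/Tomcat processes that may still hold the lock.
if pgrep -af solr; then
    echo "kill the stale process first (kill <pid>), then retry" >&2
else
    echo "no solr process found"
fi

# 2. Once no writer is running, a leftover write.lock can be removed safely.
if [ -e "$INDEX_DIR/write.lock" ]; then
    echo "stale lock: $INDEX_DIR/write.lock"
    # rm "$INDEX_DIR/write.lock"   # uncomment only after step 1 is clean
else
    echo "no write.lock in $INDEX_DIR"
fi
```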
I had these errors with general Lucene library usage, and the problem was file system errors: the reproducible error disappeared after running fsck with repair. I'm adding this answer to this question, as I found this question first.