
Is this a Chrome bug in XPath evaluation?

Open this page: http://sunnah.com/abudawud/2

And run this simple XPath search query in the console. Then the browser tab crashes:

for(var k=0, kl=2000; k < kl; k++){
    console.log(k);
    var xpathResult = document.evaluate("//div[@class='hello']", document, null, XPathResult.ANY_TYPE, null);
}

On Chrome Version 46.0.2490.80 (64-bit) on a MacBook Pro, OS X 10.10.5.

Unfortunately, I have to run XPath on this page a couple of thousand times to search for different elements, so I can't avoid making that many calls to evaluate().

The crash depends on the XPath expression: for some expressions it crashes and for others it does not.

It fails consistently at the same iteration count, which makes me think it is not a timing or garbage collection issue.

I am not getting any error codes, so I am not sure where else to look.


Update: After further investigation, we believe this is a legitimate Chrome bug, or at least not a very good way of releasing memory. What happens is that if your XPath starts with / or //, the search context expands to the whole DOM, and for some reason Chrome keeps the DOM or some other intermediary object in memory. If the XPath starts with a relative path like div/p and the search scope (the second argument) is set to a portion of the DOM, memory consumption is much more reasonable and there is no crash. Thanks to @JLRishe for several hints that were very helpful in reaching this conclusion.
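
For reference, here is a minimal sketch of that workaround. The #content selector and the div/p path are just placeholders for whatever sub-tree and relative expression apply to your own page:

var container = document.querySelector('#content'); // hypothetical context node
var result = document.evaluate(
    "div/p",                                // relative path, no leading "/" or "//"
    container,                              // search scope limited to this sub-tree
    null,
    XPathResult.ORDERED_NODE_SNAPSHOT_TYPE, // snapshot so nodes can be read out right away
    null
);
for (var i = 0; i < result.snapshotLength; i++) {
    console.log(result.snapshotItem(i));
}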

Update 2: I filed a bug on Chromium, but after a few months they rejected it as won't-fix. I managed to work around it for the time being.

If I run your code on that page and watch Task Manager, I can see Chrome's working set increase to about 3.3 GB before it eventually crashes after about 1300 iterations.

Each XPath query is causing Chrome to allocate memory for the results and any operations involved in obtaining them, but it seems like it is not releasing any of the allocated memory because you are not releasing control of the thread.

I have found that the working set levels out at 1.65 GB and the operation finishes without crashing if I do this:

var k = 0;
// run each query in its own timer callback so control returns to the event
// loop between iterations, giving Chrome a chance to release memory
var intv = setInterval(function () {
    console.log(k);
    var xpathResult = document.evaluate("//div[@class='hello']", document, null, XPathResult.ANY_TYPE, null);
    k += 1;
    if (k >= 2000) {
        clearInterval(intv);
    }
}, 0);

So something like that might be a possible solution.

This is still using considerable system resources, and that isn't even counting any values you might be storing in the course of your operation. I encourage you to seek out a smarter approach that doesn't require running quite so many XPath queries.
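
For example, if many of those queries target the same set of elements, one possibility (just a sketch, assuming your queries overlap) is to run a single snapshot query up front and reuse the cached nodes instead of calling evaluate() thousands of times:

var snapshot = document.evaluate(
    "//div[@class='hello']",
    document,
    null,
    XPathResult.ORDERED_NODE_SNAPSHOT_TYPE,
    null
);
var nodes = [];
for (var i = 0; i < snapshot.snapshotLength; i++) {
    nodes.push(snapshot.snapshotItem(i));
}
// work with the cached nodes array from here on instead of re-querying the DOM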
