While running Stanford's CoreNLP 3.7.0, I'm occasionally seeing this:
WARNING: Parsing of sentence failed, possibly because of out of memory.
Will ignore and continue: Just a year ago , the public outrage was over
Sovaldi , a new hepatitis C drug being sold by Gilead Sciences for
$ 1,000 a pill , or $ 84,000 for a course of treatment .
I've seen this before, but only when sentence splitting went wrong and produced a very long sentence. The cases I'm seeing now, like the one above, are reasonably sized, correctly split sentences.
Why might this happen, and what should I do to fix it?
Whilst this isn't exactly the answer to why this happens, I worked around it by using the 64-bit JRE and increasing the available heap. I'm not sure what environment and IDE you're using, but to do this in Eclipse on Windows you have to do three things: install a 64-bit JRE, select it for your project under Window → Preferences → Java → Installed JREs, and add a VM argument to your run configuration (Run → Run Configurations → Arguments), for example:
-Xmx30G
This sets the maximum heap to 30 GB (I'm not sure it's restricted by physical memory, as I don't have that much) and your project should run again. Be aware that the sentences it was falling over on before will likely be parsed rather slowly.
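To confirm the larger heap actually took effect (Eclipse run configurations are easy to misconfigure), you can print the JVM's view of its own maximum heap. A minimal sketch; `HeapCheck` is a hypothetical helper class name, not part of CoreNLP:

```java
public class HeapCheck {
    public static void main(String[] args) {
        // maxMemory() reports the heap ceiling the JVM was started with,
        // i.e. roughly the value passed via -Xmx.
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.printf("Max heap: %.1f GB%n",
                maxBytes / (1024.0 * 1024.0 * 1024.0));
    }
}
```

Run it with the same VM arguments as your project (e.g. `java -Xmx30G HeapCheck`); if it reports far less than what you set, the `-Xmx` argument isn't reaching the JVM that runs your code.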