
Java OutOfMemory error in filter

I get a java.lang.OutOfMemoryError while writing a big file to the ServletOutputStream. Every response is wrapped by default in an ehcache.constructs.web.filter object for GZIP compression, and according to the logs the exception is thrown inside that filter. Is there a way to increase the available memory so that the OutOfMemoryError does not occur?

You need to add the option "-Xmx1024m" to the java command which runs your servlet container. (Replace 1024m with whatever heap size you like; it means 1024 megabytes.) If you're using, say, Tomcat, you set this through the CATALINA_OPTS environment variable. Other servers have their own configuration mechanism.
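
For example, on a typical Tomcat installation you could put something like the following in bin/setenv.sh (create the file if it does not exist; the exact path and heap values are just an illustration, adjust them for your setup):

# bin/setenv.sh
CATALINA_OPTS="-Xms256m -Xmx1024m"
export CATALINA_OPTS

On Windows the equivalent is a setenv.bat containing set CATALINA_OPTS=-Xms256m -Xmx1024m.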

However, the real problem is that the filter buffers the entire file in memory before compressing it, which is wasteful. Try this compression filter instead, which doesn't buffer the whole response:

http://sourceforge.net/projects/pjl-comp-filter/
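
A rough sketch of how the web.xml registration might look; the filter class name below is what I recall from the pjl-comp-filter project, so verify it against the project's own documentation:

<filter>
    <filter-name>CompressingFilter</filter-name>
    <filter-class>com.planetj.servlet.filter.compression.CompressingFilter</filter-class>
</filter>
<filter-mapping>
    <filter-name>CompressingFilter</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>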

Use the -Xmx Java command-line option, as shown below:

java -Xms256m -Xmx512m com.test.MyMain

-Xms sets the initial (minimum) heap size and -Xmx sets the maximum heap size.
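
If you want to confirm at runtime which heap limit actually took effect, a quick sketch (the class name is just illustrative) is:

public class HeapInfo {
    public static void main(String[] args) {
        // Maximum heap the JVM will attempt to use, as set by -Xmx
        long maxBytes = Runtime.getRuntime().maxMemory();
        System.out.println("Max heap: " + (maxBytes / (1024 * 1024)) + " MB");
    }
}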

Set the following JVM options for your servlet container: -Xmx256m -Xms128m (in Tomcat, this goes in catalina.sh / catalina.bat).

Don't forget about possibly needing to increase your PermGen size:

-XX:PermSize=64m -XX:MaxPermSize=256m

Also, make sure you are streaming the file out efficiently. There may be unnecessarily large buffering in the OutputStream or InputStream pipeline.
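
As a rough illustration (the servlet class name and file path are made up for the example), copying through a small fixed-size buffer keeps memory use roughly constant regardless of file size:

import java.io.FileInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import javax.servlet.ServletException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class FileDownloadServlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest req, HttpServletResponse resp)
            throws ServletException, IOException {
        resp.setContentType("application/octet-stream");
        // Copy in 8 KB chunks instead of loading the whole file into memory
        try (InputStream in = new FileInputStream("/path/to/big-file.bin");
             OutputStream out = resp.getOutputStream()) {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
        }
    }
}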
