
Limits of JVM JIT-Compiler Optimizations

I am currently working on a compiler from a custom DSL to Java and performing some rudimentary performance optimizations in the process. My biggest problem is that I cannot find any academic resources about which optimization passes JIT compilers perform or to what extent they perform them (e.g. complex dead-code elimination, see the example below). There are many blog posts saying that JIT compilers won't do all the optimizations an AOT compiler would because of time constraints, but none of them explains what this actually means. Is there a general rule of thumb? Do I need to dive into, e.g., the OpenJDK C++ source to understand this? Is there any research on this? And if there isn't, is there at least a credible resource describing which optimizations the JVM JIT performs? The most recent resource I have found covers Java 5, which is quite outdated ( http://www.oracle.com/technetwork/java/5-136747.html )

Here is a simplified example of a "complex dead-code elimination" scenario which I have found the JVM JIT unable to eliminate, given that the variable cells_S_S_S is not used anywhere (bear in mind this is auto-generated code):

List<List<List<Cell>>> cells_S_S_S = new ArrayList<>(pla_S.size());
...
for (int pla_S_itr_45 = 0; pla_S_itr_45 < pla_S_size_45; ++pla_S_itr_45) {
    ...
    List<List<Cell>> cells_S_S = new ArrayList<>(tmpVarIf20_S.size());
    for (int tmpVarIf20_S_itr_44 = 0; tmpVarIf20_S_itr_44 < tmpVarIf20_S_size_44; ++tmpVarIf20_S_itr_44) {
        ...
        List<Cell> cells_S = _state.getCells();
        ...
        cells_S_S.add(cells_S);
    }
    ...
    cells_S_S_S.add(cells_S_S);
}

This sort of "nested dead code" was not eliminated, which forced me to perform these optimizations myself.
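
For contrast, here is a minimal, hypothetical sketch (the Point class and method names are mine, not from the generated code) of the kind of dead allocation HotSpot's C2 compiler can typically remove: the object never escapes the method, so escape analysis and scalar replacement can usually eliminate the allocation. In the generated code above, however, cells_S_S is filled with lists obtained from _state.getCells() and is itself stored into cells_S_S_S, so the allocations escape their enclosing scope and the JIT has little chance of proving them dead.

// Hypothetical micro-example, not taken from the DSL output.
public class DeadAllocationSketch {

    static final class Point {
        final int x, y;
        Point(int x, int y) { this.x = x; this.y = y; }
    }

    static long sumWithDeadAllocation(int n) {
        long sum = 0;
        for (int i = 0; i < n; i++) {
            // 'p' never escapes and is never read, so once this method is
            // JIT-compiled the allocation can usually be eliminated.
            Point p = new Point(i, i + 1);
            sum += i;
        }
        return sum;
    }

    public static void main(String[] args) {
        long total = 0;
        for (int i = 0; i < 200_000; i++) {   // warm up so the method gets compiled
            total += sumWithDeadAllocation(64);
        }
        System.out.println(total);
    }
}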

In short: I want to know what the JVM JIT is capable of so that I can focus my own optimization passes on the right areas.

"I want to know what the JVM JIT is capable of so that I can focus my own optimization passes on the right areas."

Simple answer: Don't.

You have to consider two things:

  1. Yes, the Oracle HotSpot JVM JIT engine performs a wide range of optimizations across multiple passes, including some of those you listed (dead-code elimination, inlining, devirtualization, etc.) and many more.
    It's important to note that the behavior of the JIT engine is not standardized, and JVMs from other vendors behave in different ways. I've never seen a document that comprehensively describes how HotSpot makes its decisions internally or lists the optimizations it supports, and I highly doubt such a document exists (neither from Oracle nor from the community). You could dive into the source of the HotSpot VM, but:
  2. HotSpot continuously tries to identify the hot spots in your application and, in a non-deterministic way, decides what needs to be JIT-compiled, how to do it in the current context (for hotter methods it makes sense to apply costlier optimizations), and which compiled methods need to be discarded and possibly recompiled. One way to observe these decisions is shown in the sketch after this list.
    The state of your application is not stable, and the JIT engine constantly decides what to do with it, choosing which set of optimizations to apply depending on the current environment.
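
As a rough, hypothetical illustration (the class name and loop count below are my own choices, not part of the answer), you can watch HotSpot's compilation decisions with its standard logging flags; the exact output differs between runs, which is the non-determinism described above.

// Hypothetical example for observing HotSpot's JIT decisions.
public class JitLogDemo {
    static long work(long x) {
        return x * 31 + 7;
    }

    public static void main(String[] args) {
        long acc = 0;
        for (int i = 0; i < 5_000_000; i++) {
            acc += work(i);   // becomes hot, so HotSpot eventually compiles it
        }
        System.out.println(acc);
    }
}

Compile it and run it with, for example, java -XX:+PrintCompilation JitLogDemo to see each method as it is compiled (including the compilation tier and entries such as "made not entrant" when compiled code is discarded), or with java -XX:+UnlockDiagnosticVMOptions -XX:+PrintInlining JitLogDemo to see inlining decisions. Running the same program twice rarely produces identical logs.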

You are trying to optimize the code that you are transpiling from your DSL for a specific JIT behavior, but every assumption you make could be valid for one specific run and not for another. Or it could stop being valid after a while, when the JIT engine decides to drop the compiled version of your method to free memory, or to compile it again with different results.

The only difference between JIT and AOT is that for the latter there are no time constraints, so you try to produce the best code you can, for some measure of quality.

JIT optimization is actually far more powerful than whole-program optimization. JIT optimization adapts the code temporarily for the best performance and makes optimizations that would otherwise be unsafe, based on assumptions about the code. For example, the JIT optimizer can precompute something and then roll it back if the assumption turns out to be wrong. The JIT inliner helps the JIT optimizer by inlining methods so that they can be adapted to their specific caller.
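
A hypothetical sketch of such a speculative optimization (the interface and class names are invented for illustration): while only one implementation of an interface has ever been loaded, HotSpot can devirtualize and inline the call at a hot call site; when a second implementation shows up, that assumption is invalidated and the compiled code is deoptimized and eventually recompiled.

// Hypothetical sketch of speculative devirtualization and deoptimization.
interface Op {
    long apply(long x);
}

final class AddOne implements Op {
    public long apply(long x) { return x + 1; }
}

final class TimesTwo implements Op {
    public long apply(long x) { return x * 2; }
}

public class DeoptSketch {
    static long run(Op op, int n) {
        long acc = 0;
        for (int i = 0; i < n; i++) {
            // While AddOne is the only Op implementation the VM has seen,
            // this virtual call can be devirtualized and inlined speculatively.
            acc += op.apply(i);
        }
        return acc;
    }

    public static void main(String[] args) {
        long acc = run(new AddOne(), 5_000_000);  // warm-up: JIT speculates on a single receiver type
        acc += run(new TimesTwo(), 5_000_000);    // new type appears: the speculation no longer holds,
                                                  // so the compiled code may be deoptimized and recompiled
        System.out.println(acc);
    }
}

Running this with -XX:+PrintCompilation will usually show run being compiled during the first call and later invalidated (for example, logged as "made not entrant") once TimesTwo is loaded.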
