Continuous Delivery makes the JVM JIT an antipattern

The JVM's JIT compiler has long been sold as the way Java competes with the performance of compiled languages like C/C++. Every Java benchmarking guide tells you to run your code many times for 'warmup' before you measure its performance, so that it has been JIT-compiled and optimized by the JVM's C2 compiler.
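
For context, this warmup convention is baked into Java's standard benchmarking tooling. Here is a minimal JMH-style sketch (the annotations are real JMH; the iteration counts and benchmark body are just illustrative) — the harness deliberately discards the early, unoptimized runs:

    import java.util.concurrent.TimeUnit;
    import org.openjdk.jmh.annotations.*;

    @BenchmarkMode(Mode.AverageTime)
    @OutputTimeUnit(TimeUnit.MICROSECONDS)
    @Warmup(iterations = 5)      // run repeatedly first so C2 can optimize...
    @Measurement(iterations = 5) // ...and only record numbers after that
    @Fork(1)
    public class SignupBenchmark {
        @Benchmark
        public String signup() {
            return "user-" + System.nanoTime(); // stand-in for real work
        }
    }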

In the real world, though, the calls your application serves before its code is 'warmed up' are very much a part of your users' experience.

JVM JIT and our Signup page

Astradot has a Kotlin microservice that handles auth activities like signup and login. If you try to sign up right after the service has been redeployed, the signup page appears to freeze after you click the 'Sign up' button; the page can take seconds to respond. That's because the JVM is loading tons of Kotlin/Spring classes for the first time and running their code through the interpreter with no optimizations. The response time improves the more you click the button, but the user who was signing up that first time may well have concluded our system was frozen and gone away.

Since we run multiple instances of each microservice, it's also possible that your second signup attempt lands on a different JVM instance that is loading the signup code for the first time, so you hit the same freezing behavior again. From the end user's perspective, they have now tried signing up multiple times and encountered slow behavior each time, and that is the impression of our product they walk away with.
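
You can watch this cold-start effect directly. The HotSpot flags in the comment below are standard; the toy program is a sketch of the first-call-versus-warm-call gap, not our actual service:

    // Run with: java -verbose:class -XX:+PrintCompilation ColdStart
    // to print every class load and JIT compilation as it happens.
    public class ColdStart {
        public static void main(String[] args) {
            long t0 = System.nanoTime();
            handleSignup(); // cold: classes load, bytecode runs interpreted
            System.out.printf("cold call: %d us%n", (System.nanoTime() - t0) / 1_000);

            for (int i = 0; i < 100_000; i++) handleSignup(); // let C1/C2 kick in

            long t1 = System.nanoTime();
            handleSignup(); // warm: compiled and optimized by now
            System.out.printf("warm call: %d us%n", (System.nanoTime() - t1) / 1_000);
        }

        static String handleSignup() { // stand-in for real signup work
            return new StringBuilder("user-").append(System.nanoTime()).toString();
        }
    }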

Continuous Delivery kills the JIT compiler's core assumption

One of Astradot's metric collector services receives 500 requests per second per JVM. After a fresh deploy, even at that high throughput, it takes a full 2 hours before the C2 compiler has fully optimized that code path and response times drop to their lowest. To put those 2 hours in context, here is a result from Sysdig's latest container usage survey:

74% of containers have lifespans ≤ 1 hour. That breaks the core assumption behind the JIT compiler: that the JVM is a long-running process. Your container will be redeployed before the C2 compiler ever fully optimizes it, so your users will never experience the amazing performance all those JVM benchmarks promised.

It gets worse for low-throughput code paths. Think of the signup page I talked about earlier. Even if our auth microservice stayed deployed for days, the signup() function would still not receive enough calls to trigger full C2 optimization, so users will always experience the unoptimized version of that code.
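
To make "not enough calls" concrete: HotSpot only hands a method to the C2 compiler once its invocation count crosses a tier-4 threshold, which on stock JVMs is in the thousands. Here is a hedged sketch that queries that threshold on a running JVM (it uses the HotSpot-specific diagnostic MXBean; exact defaults vary by JVM version):

    import java.lang.management.ManagementFactory;
    import com.sun.management.HotSpotDiagnosticMXBean;

    public class JitThreshold {
        public static void main(String[] args) {
            HotSpotDiagnosticMXBean hotspot = ManagementFactory
                    .getPlatformMXBean(HotSpotDiagnosticMXBean.class);
            // Roughly how many invocations a method needs before C2 fully
            // compiles it. A signup() that handles a few hundred calls a day
            // between deploys never gets there.
            System.out.println(hotspot.getVMOption("Tier4InvocationThreshold").getValue());
        }
    }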

Rise of modern compiled languages

One of the selling points of the JVM's JIT compiler was that it has runtime information available, so it can optimize better than a static compiler. That might have been true 20 years ago, but Ahead-of-Time (AOT) compiled languages have evolved since then. Go, which is garbage-collected like Java but AOT-compiled, achieves similar or better performance, and Rust consistently beats Java in benchmarks.

This comes down to Java's fundamental design: it encourages virtual methods and heap allocation. A huge part of JIT optimization revolves around converting those virtual calls into static calls, inlining them, and performing escape analysis to turn heap allocations into stack allocations. Go and Rust default to static method calls and stack allocation everywhere, so they don't need the complexity and overhead of a massive JIT to recover that performance at runtime.
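
Here is an illustrative sketch of what that means in practice. In idiomatic Java, C2 has to earn at runtime what Go and Rust get from their compilers up front:

    interface Greeter {
        String greet(); // interface calls are virtual by default in Java
    }

    final class English implements Greeter {
        public String greet() { return "hello"; }
    }

    class HotPath {
        static int length(Greeter g) {
            // To make this fast, C2 must profile the receiver type,
            // devirtualize and inline greet(), then escape-analyze the
            // StringBuilder so its heap allocation can be eliminated.
            // Go and Rust resolve the equivalent dispatch and allocation
            // at compile time, by default.
            StringBuilder sb = new StringBuilder();
            sb.append(g.greet());
            return sb.length();
        }
    }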

AOT Compiled Java

There are signs that the Java world is recognizing the pitfalls of JIT. GraalVM ships an AOT compiler (Native Image), and frameworks like Quarkus and Micronaut have popped up to take advantage of it. Uptake has been limited, though. Java's dynamic nature means features like dynamic class loading, reflection, and runtime proxies are unavailable or available only in limited form under AOT. Production Java apps also typically run with APM tracing agents that rely on runtime bytecode instrumentation. The JVM ecosystem is simply not designed around AOT compilation. Molding a 25-year-old runtime ecosystem to AOT compilation feels like putting lipstick on a pig; it is easier to start afresh with modern compiled languages like Go and Rust.
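
As a hedged example of why, consider the reflective class loading that Java frameworks lean on everywhere. Under GraalVM Native Image, a lookup like the one below fails at runtime unless the class is registered in reflection configuration at image build time (the class name here is hypothetical):

    import java.lang.reflect.Method;

    public class DynamicDispatch {
        public static void main(String[] args) throws Exception {
            // Typical framework pattern: load a handler class named in config.
            Class<?> handler = Class.forName("com.example.SignupHandler");
            Object instance = handler.getDeclaredConstructor().newInstance();
            Method handle = handler.getMethod("handle");
            handle.invoke(instance);
        }
    }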

Conclusion

JVM vendors would rather you ignore the fact that large portions of your code may be running in the interpreter or as lightly optimized C1-compiled code. Continuous Delivery and the frequent JVM restarts it brings mean the core assumption behind the JIT compiler, that JVMs are long-running processes, no longer holds.

At Astradot, we believe the era of the JVM is coming to an end. We are writing our backend in AOT-compiled languages to give you a great experience 100% of the time. We recently converted our microservices from Kotlin to Go and found it to be a welcome change.