Java is undergoing a change that will make it much more attractive for certain purposes. Thanks to Project Valhalla, Java is going to give developers the ability to write low-level, high-performance code. Here’s what it will bring to the table.

From its very beginning, Java has been an object-oriented language – it always was, and still is, impossible to create a class that does not extend Java’s core Object class. But there are non-object components in it, too. If you are an inquisitive Java developer who wants to learn how things work, you have likely wondered about these – the primitives. There aren’t that many of them, but they are – in fact – everywhere. They are not objects, and they follow their own rules, significantly different from the rules for Java’s objects.

In this article, I’ll spin a tale of the struggle of joining the worlds of objects and primitives in Java. It’s a story of the road to Valhalla – an ongoing project that may change Java in substantial ways (assuming it ever fully comes to pass). What is project Valhalla? How will it affect your work? Read on to find out.


Primitive types vs objects

Let’s start with the topic of primitive types and objects. So… have you ever encountered the quirks of primitives in Java? For example, it is perfectly ok to do this:

Integer i = 1337;

But it is not okay to do that:

Long l = 1337;

Why is that? It seems kind of the same, doesn’t it? Well, it isn’t. It will work if you do it like this:

Long l = (long) 1337;

Or even better, like this:

Long l = 1337L;

The answer, as with all Java-related topics, can be found in the Java Language Specification: it’s the boxing conversion and unboxing conversion (JLS §5.1.7 and §5.1.8). Notice that the boxed/unboxed type pairs are explicitly listed – no other automatic conversions are performed. A number like 1337 is, in fact, an int. Hence it is possible to box it to Integer, as such a conversion exists – but there is no conversion from int to Long.
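To see those rules in action, here’s a minimal sketch (the class and method names are mine, just for illustration) showing which conversion chains the compiler accepts:

```java
public class BoxingDemo {
    static Integer boxInt() { return 1337; }          // boxing conversion: int -> Integer
    static Long boxViaCast() { return (long) 1337; }  // widen int -> long first, then box long -> Long
    static Long boxLiteral() { return 1337L; }        // a long literal boxes directly to Long

    public static void main(String[] args) {
        // Long broken = 1337;  // does not compile: there is no int -> Long conversion
        System.out.println(boxInt() + " " + boxViaCast() + " " + boxLiteral());
    }
}
```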

The important thing to notice is that there is an explicit list of boxing and unboxing conversions. There is no general rule, though one can deduce how the particular conversions were formed… So why is there no rule? I presume it’s because there is no axiomatic definition in the JLS that would allow one to form that rule. Remember that this is a language specification – just writing “convert the primitive to its object type” won’t do. And since there’s no general rule, it’s very likely that somewhere in the JVM code there’s a switch that handles exactly that conversion sequence. Drop me a line at @vaverDariush if you think I’m right or wrong – I won’t try to dig for that information for the purpose of this article.

One of the things that Valhalla would introduce is… exactly that general rule. And perhaps it could be applied to some specific “value objects”? I’ll get back to that later.

Arrays and arrays

Now, let’s take a look at arrays, as they can show quite well why Valhalla could be useful. First, let’s create two arrays:

long[] primitiveArray = new long[100];
Long[] objectArray = new Long[100];

These two are vastly different when you look at them from the perspective of Java and JVM.

First – how much memory does each take? One could check it programmatically with Instrumentation, but for this article let’s assume an 8-byte long and an 8-byte reference, so the content of each array takes 800 bytes – 100 longs in one case, 100 references to Long objects in the other. References! So, if you fill that second array with objects, and a Long takes 24 bytes on 64-bit systems, the actual data adds another 2400 bytes. That, plus the 800 bytes of references, gives us 3200 bytes in total – 4 times as much! Not to mention possible memory fragmentation. I’ll get back to that issue later, too.
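The arithmetic can be spelled out explicitly. Note that the sizes below are simplifying assumptions – real numbers depend on the JVM, compressed references, headers, and alignment:

```java
public class ArraySizeEstimate {
    // Simplifying assumptions; real sizes vary by JVM and flags like compressed oops.
    static final int N = 100;
    static final int LONG_BYTES = 8;       // one long value
    static final int REF_BYTES = 8;        // one uncompressed reference slot
    static final int LONG_OBJ_BYTES = 24;  // a java.lang.Long instance on a 64-bit JVM

    // long[]: just the values, back to back
    static int primitiveArrayContent() { return N * LONG_BYTES; }

    // Long[]: the reference slots, plus a separate heap object per element
    static int objectArrayContent() { return N * REF_BYTES + N * LONG_OBJ_BYTES; }

    public static void main(String[] args) {
        System.out.println(primitiveArrayContent()); // 800
        System.out.println(objectArrayContent());    // 3200
    }
}
```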

What are these arrays’ contents initially? Primitives are values, so they can’t be null. Java’s rules say that the default value for numeric primitives like int is 0. So primitiveArray is a zero-filled array, like a calloc()’ed memory fragment in C. There are more similarities to C here, as this array is one contiguous memory fragment. When you access the element at index 4, you simply access the array’s base address + header overhead + 4 times the element size. In other words, the 5th element is right after the 4th. And the 6th after the 5th. And so on.

This is not the case for the object array. To get the actual number at the same index 4, you read the same 4th slot of the array, but then you take the reference stored there, follow it to the referenced memory location, skip the object header bytes, and only then read the value. Now, imagine doing that in a loop. If you inspected and compared the bytecode of the two loops, you would discover that many more instructions are executed. To wrap up the entire train of thought – the object array is initially filled with nulls. So, to fill it with numbers, one has to instantiate the objects and assign all the references in the array. Lots of work.
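A short sketch (the class and method names are mine) putting both kinds of array side by side – note the different default contents and the extra dereference-and-unbox step in the boxed loop:

```java
import java.util.Arrays;

public class ArrayDefaults {
    static long sumPrimitive(long[] a) {
        long sum = 0;
        for (long v : a) sum += v;  // direct reads from one contiguous memory block
        return sum;
    }

    static long sumBoxed(Long[] a) {
        long sum = 0;
        for (Long v : a) sum += v;  // each step follows a reference and unboxes an object
        return sum;
    }

    public static void main(String[] args) {
        long[] primitiveArray = new long[100]; // zero-filled, like calloc() in C
        Long[] objectArray = new Long[100];    // filled with nulls - no Long objects exist yet

        System.out.println(sumPrimitive(primitiveArray)); // 0
        System.out.println(objectArray[0]);               // null
        Arrays.fill(objectArray, 0L);  // objects must be created and assigned before use
        System.out.println(sumBoxed(objectArray));        // 0
    }
}
```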

Let me stress one thing: the JVM handling of both types of arrays is very different. From the programmer’s point of view, there’s little difference, right? But under the hood, a lot of things are done differently.

Collections of primitives

If we can have an array of ints, how about a set? You likely already know what would happen if you wrote:

Set<int> myBagOfInts = new HashSet<>();

Yep, can’t do it. Won’t compile. You can’t use primitive types in collections – or as generic type arguments at all, for that matter. You probably knew that.

But imagine you could. Why not? What’s the problem? The problem is that… well, there are, in fact, many different problems. A primitive type is not an object. I know you knew that, but consider the implications for a moment. First, you can’t refer to it – it’s just a value. Secondly, the collection is a HashSet… and int is not an Object. How do you compute the hash code? There is no hashCode()! When you think about it, it’s a “there is no spoon” moment. And let’s not think too hard about the point of hashing an int, ok?
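Today’s workaround is autoboxing: a HashSet<Integer> works with int values only because every one of them gets wrapped in an Integer object, which does have a hashCode(). A small sketch (fill is my own example method):

```java
import java.util.HashSet;
import java.util.Set;

public class BoxedSet {
    static Set<Integer> fill(int upTo) {
        Set<Integer> set = new HashSet<>();
        for (int i = 0; i < upTo; i++) {
            set.add(i);  // each int is boxed into an Integer object right here
        }
        return set;
    }

    public static void main(String[] args) {
        Set<Integer> set = fill(1000);
        System.out.println(set.contains(42));     // true - 42 is boxed again for the lookup
        // Integer.hashCode(int) is what gets used; for an int it is simply the value itself
        System.out.println(Integer.hashCode(42)); // 42
    }
}
```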

“But what if I could!?” – you could say now. “I could write an implementation of a HashSet for ints!” Indeed you could! It wouldn’t even be that much work! Oh, by the way, have you ever seen an IntStream class? Or a LongStream? Or a DoubleStream? Or, for that matter, Stream.mapToLong() and Stream.mapToInt()? I think you see what I’m getting at. Not too convenient, is it? Support for each of the primitives has to be coded separately – there is no generic way to do it.
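Here are those hand-written specializations in use – sumSquares and totalLength are my own example methods, but IntStream and mapToLong are the real APIs:

```java
import java.util.stream.IntStream;
import java.util.stream.Stream;

public class PrimitiveStreams {
    static int sumSquares(int upTo) {
        // IntStream is a separate, hand-written class - not a Stream<int>
        return IntStream.rangeClosed(1, upTo).map(i -> i * i).sum();
    }

    static long totalLength(Stream<String> words) {
        // mapToLong exists precisely because Stream<T> cannot yield a primitive long
        return words.mapToLong(String::length).sum();
    }

    public static void main(String[] args) {
        System.out.println(sumSquares(3));                       // 14 (1 + 4 + 9)
        System.out.println(totalLength(Stream.of("ab", "cde"))); // 5
    }
}
```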

I hope that by now you see the trouble with primitives – the issues they cause for JVM programmers and language designers.

The path towards Valhalla

Imagine you could have a Point class with two doubles as fields. Imagine you could have a Polygon with a List of Points as a field. Imagine you could have a list of Polygons to use for rendering. And now imagine they are a coherent, unfragmented part of memory. A single cache load, no references, just a pointer in memory moving forward-only, byte by byte, to render the graphics. That’s Java’s Valhalla.
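The closest you can get today is a record – value-like semantics, but each instance is still a separate heap object behind a reference. The comments sketch what Valhalla’s proposed declaration (syntax still in flux in JEP 401) would change; the Point and Polygon below are my own illustration, not the AWT classes:

```java
import java.util.List;

public class Shapes {
    // Today: a record gives value-based equality, but every instance is still
    // an identity-carrying heap object reached through a reference.
    record Point(double x, double y) {}
    record Polygon(List<Point> points) {}

    // Under Valhalla, something along the lines of "primitive class Point"
    // would let the JVM store the two doubles flat inside the Polygon's
    // storage - no per-Point references, headers, or fragmentation.

    public static void main(String[] args) {
        Polygon triangle = new Polygon(List.of(
                new Point(0, 0), new Point(1, 0), new Point(0, 1)));
        System.out.println(triangle.points().size());                // 3
        System.out.println(new Point(1, 0).equals(new Point(1, 0))); // true
    }
}
```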

Still not convinced? Do you think it’s never going to make a difference? Take a look at the Polygon class in AWT. It doesn’t store a List of Point objects – it stores the coordinates in two arrays of primitive ints (xpoints and ypoints). Why? Iteration, memory access, object overhead, cache, registers, CPU instructions. In short: performance.


Reaching Valhalla isn’t easy. Java’s special handling of primitives is hardcoded into the JVM at a very low level. And, perhaps even more importantly, the handling of value types poses a lot of questions. Let’s dive into them.

Project Valhalla introduces a new idea to Java: primitive classes. They used to be called inline classes, and that name actually held a deeper significance. These classes would indeed be very strong candidates for inlining, which in this case means storing their values in registers instead of treating them like normal heap objects. However, both approaches would remain possible – it would be up to the JVM to decide. The functionality of inlining stays; only the name has changed.

Valhalla proposes adding two new restricted interfaces: IdentityObject and InlineObject. The first one would be used for plain old classes and the second for primitive classes. This would allow the programmer to efficiently differentiate code handling each of these kinds. For example, there could be collections that accept a <T extends InlineObject> generic parameter, implemented specifically for handling primitive classes.
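Since InlineObject doesn’t exist yet, here’s a hypothetical sketch using a marker interface of my own (InlineLike) to show how such a bounded API could look; the FlatList here just wraps an ArrayList, whereas a real Valhalla-aware implementation could use the bound to pick a flattened storage layout:

```java
import java.util.ArrayList;
import java.util.List;

public class MarkerBounds {
    // Hypothetical stand-in for Valhalla's proposed InlineObject interface.
    interface InlineLike {}

    record Complex(double re, double im) implements InlineLike {}

    // A collection that only accepts "inline-like" types.
    static class FlatList<T extends InlineLike> {
        private final List<T> backing = new ArrayList<>(); // plain storage in this sketch

        void add(T value) { backing.add(value); }
        T get(int index) { return backing.get(index); }
        int size() { return backing.size(); }
    }

    public static void main(String[] args) {
        FlatList<Complex> values = new FlatList<>();
        values.add(new Complex(1, 2));
        // FlatList<String> would not compile: String is not InlineLike
        System.out.println(values.size()); // 1
    }
}
```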

For primitive classes to work, many questions have to be answered. How would the equality operator work? For objects, it compares references. For primitive types, it compares values. For primitive classes, there are no references and there are no nulls, so reference comparison doesn’t make sense… But, to be somewhat consistent with current objects, this operator could simply always return false.

Seems unintuitive, but at least it’s very easy to implement. The alternative is a full value comparison. That was easy for – at most – 64-bit primitives, but what if someone creates a 4 KB primitive class? Comparing all that memory byte by byte is quite a lot of work. And what if that primitive class contains references to “old” identity objects? Do we call equals() on them, then? Another option could be requiring the primitive class to implement its own special method that would be called when comparing with ==. But that doesn’t seem like the “Java way”, does it? Perhaps a marker interface could be introduced to opt into by-value comparison, with false returned by default?
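For a taste of how tricky == already is around boxing, consider the Long cache: Long.valueOf() is documented to cache the values from -128 to 127, so a reference comparison of “equal” boxed values gives different answers depending on their magnitude (sameReference is my own example method):

```java
public class EqualityQuirks {
    static boolean sameReference(Long a, Long b) {
        return a == b;  // compares references, not values
    }

    public static void main(String[] args) {
        // 127 comes from the cache, so both arguments box to the same object:
        System.out.println(sameReference(127L, 127L));   // true
        // 1337 is outside the cache on standard JVMs - two distinct objects:
        System.out.println(sameReference(1337L, 1337L));
        // Value comparison is always what equals() does:
        System.out.println(Long.valueOf(1337L).equals(Long.valueOf(1337L))); // true
    }
}
```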

Speaking of nesting identity objects in primitive objects – is it ok? Well, imagine you have this memory structure that you can use to do high-performance operations, and in the middle of it, there’s a reference to an object located somewhere else in memory. Doesn’t seem to make a lot of sense, but on the other hand, why not? I’d prefer to have a compiler warning in such a case, though.

What about primitive classes’ instance size? Some of them, if nested in each other and using particularly large arrays, may grow very, very big. Would that be a problem for JVM? Possibly. But that problem already exists. If you try to create a large enough primitive array, Java faces the same issue. So, Valhalla may make the issue more common, but it’s already handled in JVM. If you go too far, you’ll get the plain old OutOfMemoryError.

Do we need Valhalla?


We’ve discussed what project Valhalla is, what options it gives to a Java programmer and what issues it introduces. But do we need it at all?

Java does suffer from memory fragmentation. Objects hold references to other objects that are often located in a completely different part of the heap. Accessing them is a huge bottleneck, because RAM is much, much slower than the processor. On current processor architectures, it takes a few hundred CPU cycles to access RAM, a few dozen cycles to access the L3 cache, a dozen or so for L2, and just a few for L1. With memory fragmented, it is often extremely difficult to keep all the caches populated properly.

This is where the strength of Valhalla lies. With a well-designed primitive class, memory access can be properly optimized. Less CPU time is wasted, and overall throughput is thus significantly increased. In early JVM builds using only SOME of Valhalla’s features, the results were very promising: a matrix multiplication implementation was 6 times faster with Valhalla, and loop-summing was 12 times faster. These results come from a talk by Brian Goetz delivered on July 29th, 2019. And remember that this is just the very beginning – the real optimizations will likely come after Valhalla is functionally complete, possibly years after release.

A difficult road with no guarantees

The path to Valhalla is a long one. The first two JEPs were delivered in Java 11, and two more in Java 15 and 16. The likely most important (and most difficult to implement) JEP 401 is in the Candidate state right now. It may take years until it is delivered – or it may never be delivered at all. Only time will tell.

If you want to know more about other changes made to Java in recent years, check out my previous article: Java 17 features: A comparison between versions 8 and 17. What has changed over the years?

The name of this whole initiative – “Valhalla” – comes from Norse mythology, where it was the hall of fallen heroes. The stories tell us that Vikings could reach it only by dying in battle. Considering this, wouldn’t it be ironic if the change I write about in this article never happened? If the people working on it turned out to be like those warriors of old, who fell in a glorious battle with a worthy foe?

Can they fail to deliver Valhalla? Given the complexity of the task, I fear that they will. However, seeing the potential in these changes, I sincerely hope they do not.

If you want to discuss further, or feel that something important is missing from this article, make sure to drop me a line on Twitter. Thanks for reading! Also, check out my other posts on the Pretius blog:

  1. JVM Kubernetes: Optimizing Kubernetes for Java Developers
  2. Java 17 features: A comparison between versions 8 and 17. What has changed over the years?
  3. Clean-architecture Java: How ArchUnit can help your application

Do you need Java-based software?

Pretius has a great team of Java Developers and great experience in developing enterprise-grade systems. If you need a Java development company, write us at (or use the contact form below). We’ll get back to you in 48 hours and tell you what we can do for you.

Additional reads

Below, you can find the list of the sources I used to write this article. Check them out if you want some additional information.

  1. State of Valhalla
  2. Valhalla object model
  3. Valhalla JVM model
  4. OpenJDK project Valhalla
  5. JEP 401: Primitive Classes
  6. JEP preview: Universal Generics