Value type hash code

Dan Smith daniel.smith at
Wed Apr 11 22:21:51 UTC 2018

> On Apr 11, 2018, at 2:59 PM, John Rose <john.r.rose at> wrote:
> On Apr 11, 2018, at 9:40 AM, Dan Smith <daniel.smith at> wrote:
>> This is a reasonable language design, but I don't think the JVM should hard-wire that behavior. The compiler can generate the equals/hashCode methods it wants, and my thinking is that when the compiler fails to do so, the JVM's default (that is, the behavior of Object.equals and Object.hashCode if not overridden) should be conservative.
> (As usual we are confusing ourselves by having simultaneous but distinct
> discussions about what the JVM should do and what the language should do.)
> What Dan is saying here is what the JVM should do if presented with a naked
> value class, with no contents beyond what the verifier requires.  Not even Java
> 1.0 had completely naked classes, since javac would often spin up synthetic
> methods like <init> and <clinit>.
> What others are saying is that the language should make the default be
> something such as deepEquals or maybe substitutable or even "user
> responsibility" (a la abstract inherited methods).

Yes, this is right. And to clarify my motivation:

Object.equals/hashCode have some really nice properties that apply to all objects that don't override them (both class instances and arrays), including consistency over time, guaranteed successful completion, and guaranteed substitutability. These methods achieve this by being extremely conservative.
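
To make those properties concrete, here's a small sketch (the class is illustrative, not from this thread): the default equals is reference identity, and the default (identity-based) hashCode stays stable even when the object's fields are mutated.

```java
// A class that does NOT override equals/hashCode, so it gets the
// conservative Object defaults (illustrative example).
final class Point {
    int x, y;
    Point(int x, int y) { this.x = x; this.y = y; }
}

public class DefaultSemantics {
    public static void main(String[] args) {
        Point a = new Point(1, 2);
        Point b = new Point(1, 2);

        // Structurally identical, but Object.equals is reference identity.
        System.out.println(a.equals(b));            // false
        System.out.println(a.equals(a));            // true

        // The default identity hash is consistent over time, even across
        // mutation — nothing used by equals has changed.
        int before = a.hashCode();
        a.x = 42;
        System.out.println(before == a.hashCode()); // true
    }
}
```
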

(I think a lot of people would say these properties are not very important, but I think they would be missed if Java had gone a different way and given every object a deep equals by default.)

Often, end users find this conservative behavior inconvenient, and would like to consider a wider range of instances equivalent. They are free to write some code that will have that effect, or use libraries that have written such code, and in so doing opt in to various risks if they're not careful about how they use their instances.
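
One concrete example of the risks opted into: a state-based equals/hashCode combined with mutation silently breaks hash-based collections. A hypothetical sketch (class and field names are my own, not from the discussion):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Objects;

// A mutable class that opts in to state-based equality (hypothetical).
final class Label {
    String text;
    Label(String text) { this.text = text; }
    @Override public boolean equals(Object o) {
        return o instanceof Label && Objects.equals(text, ((Label) o).text);
    }
    @Override public int hashCode() { return Objects.hashCode(text); }
}

public class HashKeyRisk {
    public static void main(String[] args) {
        Map<Label, Integer> counts = new HashMap<>();
        Label key = new Label("draft");
        counts.put(key, 1);

        key.text = "final";  // mutate the key after insertion

        // The entry is still in the map, but no longer reachable by lookup:
        // the mutated key hashes to a different bucket, and the original
        // key's stored state no longer matches a fresh "draft" probe.
        System.out.println(counts.get(key));                // null
        System.out.println(counts.get(new Label("draft"))); // null
        System.out.println(counts.size());                  // 1
    }
}
```
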

I think it would be a mistake for the JVM to force these risks on all users. The JVM has no domain knowledge: no way to determine whether a value is intended to be used as a hash key, to store a cyclical object graph, etc.
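
The cyclical-graph hazard, for instance: a naive structural equals that recurses through reference fields never terminates on a cycle. A hypothetical sketch (the class is illustrative):

```java
// A node whose hand-written structural equals recurses into its
// reference field — fine for acyclic data, fatal on a cycle.
final class Node {
    int id;
    Node next;
    Node(int id) { this.id = id; }
    @Override public boolean equals(Object o) {
        if (!(o instanceof Node)) return false;
        Node n = (Node) o;
        return id == n.id
            && (next == null ? n.next == null : next.equals(n.next));
    }
    @Override public int hashCode() { return id; } // deliberately ignores next
}

public class CyclicEquals {
    public static void main(String[] args) {
        Node a = new Node(1), b = new Node(1);
        a.next = a;  // each node points back to itself
        b.next = b;
        try {
            a.equals(b);  // recurses as a.next.equals(b.next) forever
        } catch (StackOverflowError e) {
            System.out.println("StackOverflowError: deep equals on a cycle");
        }
    }
}
```
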

It may make sense for the language to introduce these risks through certain implicit declarations, although we should consider them carefully. End users should understand the risks of using the implicit-generating constructs and have reasonable alternatives/opt-outs.

It's even more permissible for libraries to introduce these risks, although again it should be done carefully, taking into account what the library knows about its clients, and clearly advertising the risks. (If I had been designing a general-purpose mutable collection API, for example, I would not have followed the Collections path of defining an unstable equals/hashCode method. Maybe I'd prefer to introduce a "currently equal" operation under a different name.)
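
A sketch of that alternative design, with hypothetical names: leave equals/hashCode at their safe identity defaults, and expose the state-dependent comparison under a separate name that advertises its instability.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical mutable container: equals stays identity-based,
// so instances remain safe as hash keys; the state-dependent
// comparison gets its own, clearly transient name.
final class Bag<E> {
    private final List<E> elements = new ArrayList<>();

    void add(E e) { elements.add(e); }

    // Deliberately no equals/hashCode override.

    // "Currently equal": true only at this moment; may change on mutation.
    boolean currentlyEquals(Bag<E> other) {
        return elements.equals(other.elements);
    }
}

public class CurrentlyEqualDemo {
    public static void main(String[] args) {
        Bag<String> x = new Bag<>(), y = new Bag<>();
        x.add("a");
        y.add("a");
        System.out.println(x.equals(y));          // false: identity semantics
        System.out.println(x.currentlyEquals(y)); // true: same contents right now
    }
}
```
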

And of course it's fine to allow end users to accept the risks explicitly in their own classes.

More information about the valhalla-dev mailing list