A library for implementing equals and hashCode
kevinb at google.com
Thu Apr 25 21:54:20 UTC 2019
On Wed, Apr 24, 2019 at 5:39 PM Stuart Marks <stuart.marks at oracle.com> wrote:
By "quality" hash code, do you mean a cryptographic hash?
No, definitely not. I just mean things like bit dispersion/avalanche and
better (sub-crypto) collision resistance.
The root problems I'm aware of with Object.hashCode() being pulled beyond
its core purpose (providing bits for HashMaps to stir up and use) are:
* It can't be seeded
* Compositing a hash code from a tree of data has to keep collapsing down to
32 bits over and over, putting the burden on "everyone" to know how to do
that well
* You can't get away from the preponderance of mediocre functions we're
already stuck with, as noted
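To illustrate the second bullet: a sketch (hypothetical `Node` type, not from any real library) of how a tree-shaped value composes its hash. Each level collapses its children's hashes to 32 bits before the parent mixes them again, so information is discarded at every level and every class author has to get the mixing step right.

```java
import java.util.List;
import java.util.Objects;

// Hypothetical nested value type. Each level's hashCode() collapses its
// children to 32 bits, and the parent then collapses again -- the
// "over and over" reduction described above.
record Node(int value, List<Node> children) {
    @Override
    public int hashCode() {
        // List.hashCode() has already reduced each child to 32 bits;
        // Objects.hash() now reduces this level to 32 bits as well.
        return Objects.hash(value, children);
    }
}
```

A seeded, wider-state hash API could instead thread one accumulator through the whole tree and reduce to 32 bits only once, at the root.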
I'm all for implementing something better than times-31-plus. AutoValue
happens to have been using times-1000003-xor for most of its lifespan, and
I wouldn't be at all surprised if there's a better default choice. The only
thing I'm arguing against is letting users customize the function; I think
that would be a mistake.
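For concreteness, here is a sketch of the two per-field combining steps being compared (this is an illustration, not AutoValue's actual generated code): the classic times-31-plus combiner and the times-1000003-xor variant.

```java
import java.util.function.IntBinaryOperator;

// Two 32-bit hash-combining steps, applied field by field.
final class HashCombiners {
    // The classic javac/Objects.hash style step: h = 31*h + field.
    static int times31Plus(int acc, int next) {
        return 31 * acc + next;
    }

    // The times-1000003-xor step: h = (1000003 * h) ^ field.
    static int times1000003Xor(int acc, int next) {
        return 1000003 * acc ^ next;
    }

    // Fold all field hashes into one 32-bit result with the given step.
    static int combine(int seed, int[] fieldHashes, IntBinaryOperator step) {
        int h = seed;
        for (int f : fieldHashes) {
            h = step.applyAsInt(h, f);
        }
        return h;
    }
}
```

Note that in both cases the combining function is fixed by the library; the argument above is that the choice of step should not be user-customizable, only (perhaps) improved wholesale.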
> Other applications might want to use this API for convenience, or cleaner
> code, or something, but which might want to use their own hash reduce
> function in order to preserve compatibility.
Of course, by all that is sensible, either you should never have specified
your hashCode() behavior or your users should never have depended on it.
Someone did something wrong -- which isn't enough reason to tell those
people "you're screwed!", but it seems totally reasonable to me that if
you're stuck with your old hash function definition, then you're just stuck
with your old hash function implementation too. That's not being screwed,
that's just keeping on doing what you were doing.
Kevin Bourrillion | Java Librarian | Google, Inc. | kevinb at google.com