value type hygiene
john.r.rose at oracle.com
Tue May 15 21:26:59 UTC 2018
On May 15, 2018, at 5:06 AM, Maurizio Cimadamore <maurizio.cimadamore at oracle.com> wrote:
> I wonder if we shouldn't also consider something along the lines of restricting migration compatibility only for _nullable_ value types, where a nullable value type is a value whose representation is big enough that it can afford one spare value to denote null-ness. So, if you want to convert existing legacy reference classes to value types, they'd better be nullable values; this way you don't lose any value in the legacy domain - nulls will be remapped accordingly (using a logic specified in the nullable value type declaration).
> It seems like we've been somewhere along this path before (when we were exploring the Q vs. L split) - why would something like that not be workable?
We do something similar explicitly (not globally) with the combinator
MHs.explicitCastArguments, which converts null to zeroes of
primitive types. But not vice versa: Zeroes don't re-box to null.
And it's a localized thing.
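To make the existing behavior concrete, here is a minimal, self-contained sketch of that combinator (the demo class and method names are mine, not from the JDK):

```java
import java.lang.invoke.MethodHandle;
import java.lang.invoke.MethodHandles;
import java.lang.invoke.MethodType;

public class ExplicitCastDemo {
    static int identity(int x) { return x; }

    public static void main(String[] args) throws Throwable {
        MethodHandle id = MethodHandles.lookup().findStatic(
                ExplicitCastDemo.class, "identity",
                MethodType.methodType(int.class, int.class));

        // explicitCastArguments applies bytecode-style casts:
        // a null reference unboxing to a primitive becomes the zero value.
        MethodHandle lenient = MethodHandles.explicitCastArguments(
                id, MethodType.methodType(Integer.class, Integer.class));
        System.out.println(lenient.invoke((Integer) null));  // prints 0

        // The ordinary asType conversion is null-hostile: it unboxes
        // via Integer.intValue and so throws NPE on null.
        MethodHandle strict = id.asType(
                MethodType.methodType(Integer.class, Integer.class));
        try {
            strict.invoke((Integer) null);
        } catch (NullPointerException e) {
            System.out.println("strict conversion threw NPE");
        }
    }
}
```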
I think you are suggesting a def-site opt-in where a class VT somehow
nominates a special value VT.N (perhaps its default value VT.default, perhaps
not) with one or both of these behaviors:
assert( (VT)null == VT.N ); // unbox null to N
assert( (Object)VT.N == null ); // box N to null (probably not!)
For simplicity, let's say N must be VT.default, and that the conversion
is just one way (null to VT). Then the opt-in could be as simple as
mixing in an interface NullConvertsToDefault. See the P.S. of this message.
(Perhaps you are suggesting something different?)
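Neither NullConvertsToDefault nor VT.default exists today; here is one possible sketch of the one-way semantics, simulated with ordinary classes (all names here are hypothetical):

```java
/** Hypothetical opt-in marker (not a real Valhalla API): a type that
 *  implements this asks for null to unbox to its default value. */
interface NullConvertsToDefault { }

/** Stand-in for a value type; DEFAULT plays the role of VT.default. */
final class Point implements NullConvertsToDefault {
    static final Point DEFAULT = new Point(0, 0);
    final int x, y;
    Point(int x, int y) { this.x = x; this.y = y; }
    @Override public String toString() { return "Point(" + x + "," + y + ")"; }
}

public class NullToDefaultSketch {
    /** One-way conversion only: null becomes the default value iff the
     *  type opted in; boxing never turns a default back into null. */
    static <T> T unboxOrDefault(T boxed, Class<T> type, T defaultValue) {
        if (boxed != null) return boxed;
        if (NullConvertsToDefault.class.isAssignableFrom(type)) return defaultValue;
        throw new NullPointerException("null not convertible for " + type.getName());
    }

    public static void main(String[] args) {
        Point p = unboxOrDefault(null, Point.class, Point.DEFAULT);
        System.out.println(p);   // null unboxed to the default value

        String s = unboxOrDefault("hi", String.class, "");
        System.out.println(s);   // non-null values pass through unchanged
    }
}
```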
It might be workable. It's complicated, of course. It would need to inject
special logic into many paths in the JVM which are currently just error
paths. There might be collateral damage on performance. For optimized
code, throwing an NPE is always simpler than patching and continuing.
This is roughly because, in optimized code, control flow merges are harder
to optimize than control flow forks.
(Adding in the symmetric feature, of converting N to null, is probably
very expensive, since it would seem to require lots of new tests of
the form, "are you N?" And there's little value in converting N to
null and then having List.of or some other null-hostile API throw
an error. So then you have a puzzler: The seam between N and
null is not completely hidden.)
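The downstream failure can be seen with today's null-hostile APIs; a minimal check:

```java
import java.util.List;

public class NullHostileDemo {
    public static void main(String[] args) {
        // Even if the VM converted a default value N back to null at a
        // boxing boundary, the null would just fail a little later in
        // any null-hostile API, such as List.of:
        try {
            List.of((Object) null);
        } catch (NullPointerException e) {
            System.out.println("List.of threw NPE on null");
        }
    }
}
```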
AFAIK C# does something like this as a one-time deal, which cannot be
mixed in as an interface:
In C#, at the use-site of a type you can opt into it with a nullable
type like 'int?'. But we could make it opt in at the def-site, too,
if the value type has a spare code-point it's not using. I'm sure
if C# doesn't do this there are excellent reasons for them not to.
Anybody got information on this?
I am hoping to avoid playing such a card. (In case anyone didn't notice,
we are playing with a large deck here, if not a full deck. There are
lots of potential moves we can make.) I want us to win with a small
number of moves.
P.S. Other examples of moves: Adding another type descriptor, having
one array type be polymorphically boxed or flattened, having two
VM-level types per source value type, waiting for reified generics, waiting
for primitive convergence, adding large infrastructure for migration.
Maybe we will be forced to do one or all of these before we can
get anywhere. I hope not; I'm trying to sneak across a meaningful
waypoint (not finish line) simply with L-world.