State of javac support for lworld-values.
srikanth.adayapalam at oracle.com
Mon Mar 26 10:30:35 UTC 2018
On Thursday 15 March 2018 11:53 PM, Karen Kinnear wrote:
> I sent an email yesterday, and just sent out a .pdf summarizing a new
> Summary - proposing that we add a clue (annotation?) for javac that a
> value-based class
> is migrating to become a value type, and allow javac to be strict with
> new value types
> and have a choice on handling migrating types. The JVM will continue
> to be lenient
> to allow migration.
As captured here:
I am still trying to convince myself that we need such a lenient mode in
javac and that it is a good idea.
Does a lenient mode at the eventual release, for ease of migration
purposes, amount to foregoing the opportunity to slap the programmer on
the wrist, only to lead them down the path to blowing off their entire
foot (with a runtime error) ?
(Has the VM considered not NPEing for a null reference assignment to a
flattenable field/value array cell, instead coalescing null into the
default value ? Does that make sense ? After all, anewarray comes up with
default values anyway.)
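As a point of reference for the "default value" behaviour alluded to above, plain Java today already fills newly created arrays with defaults (0 for primitives, null for references). The sketch below involves no value types at all; it is only an analogy for what a null-coalescing flattenable slot might look like:

```java
// Analogy only: today's array creation (newarray / anewarray) already
// materialises default element values without any explicit stores.
public class DefaultValueDemo {
    public static void main(String[] args) {
        int[] prims = new int[3];        // newarray: elements default to 0
        Integer[] refs = new Integer[3]; // anewarray: elements default to null

        System.out.println(prims[0]);    // prints 0
        System.out.println(refs[0]);     // prints null
    }
}
```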
I will nevertheless share review comments on the proposal here:
(I do readily see the value in having the ability to discriminate
between a value-based class evolving into a __ByValue class and a brand
new value type insofar as the defaults for __Flattenable are concerned.)
(The following enumerated points are excerpts from the proposal pdf, with
my comments after each.)
1. forward migration: value-based classes that migrate to value types
note: this is a bounded problem - there are a limited number of known
value-based classes and probably a limited number of candidates for
migration
Does this mean that we are looking at only such migrations/evolutions of
the JDK classes ? Or do we need a more general mechanism that would
accommodate the migration/evolution of value-based classes in third
party land too ?
If it is only the JDK, then yes, it is a bounded problem and we can
enumerate the classes into javac for special treatment. Otherwise we
will need a more general mechanism.
a new modifier __ByPureValue (brand new value type) in addition to
__ByValue (value-based class morphing into a value class).
An annotation @PureValue ?
2. value-based class migrations to value class: e.g. FIELDC
I generally found the labels/nomenclature a bit unintuitive. Should
FIELDC have been named CLASSC ?
(What is the suffix C ?)
(3) aastore/aaload: - NPE if attempting to store null to a flattenable
value type array / if loading null from a flattenable value type array
The case of NPE on aaload cannot occur ? (as no store would write a null
value, and anewarray and its cousins would not create a value array with
null elements in the first place ?)
(4) The ideal world for javac is to make all value types flattenable by
default, whether in fields or arrays, to be able to give early warning at
compile time for nullability and to be able to insert null checks in the
generated bytecode.
Javac does not insert null checks for any bytecodes other than checkcast
as of now. I doubt that it needs to - also see (7) below.
(5) new value types javac could make all new value types flattenable by
default, whether in fields or arrays
Yes, I can see how this point calls for a way to discriminate between a
value-based class evolving into a value type and a brand new value type.
(6) because value types are final, and we disallow conversion from a
value type to an identity object:
Not sure what "we disallow conversion from a value type to an identity
object" means. One can certainly assign a value instance to a variable of
type Object or of a (super) interface type ???
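For comparison, in Java as it stands such widening assignments are certainly legal; a small sketch using Integer (final and value-based) standing in for a future value type:

```java
// Widening reference conversions that are legal in Java today.
public class WideningDemo {
    public static void main(String[] args) {
        Integer v = 42;               // Integer is final (and value-based)
        Object o = v;                 // widening to Object: fine
        Comparable<Integer> c = v;    // widening to a super-interface: fine

        System.out.println(o);                     // prints 42
        System.out.println(c.compareTo(41) > 0);   // prints true
    }
}
```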
(7) and javac can inject null checks before bytecodes when it knows it
is always dealing with a new value type - e.g. withfield, aastore
aastore: I don't think it makes sense to insert null checks for aastore,
because the VM *anyway* has to check for null stores and throw an NPE.
So wouldn't this amount to duplication of effort ? The VM can't forego
the null check at aastore because it will be required for value-based
class clients that are not recompiled ?
withfield: I can insert a null check, but the JVM spec calls for a null
check at the VM level.
checkcast: already inserted.
instanceof: I can insert - but would that be a bad idea ? Is it prudent
to just evaluate to false rather than throw an NPE for null instanceof a
value type ?
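For what it is worth, current Java already takes the "evaluate to false" route for reference types: instanceof applied to null never throws, it simply yields false:

```java
// instanceof on a null reference evaluates to false; it never NPEs.
public class InstanceofNullDemo {
    public static void main(String[] args) {
        Object o = null;
        System.out.println(o instanceof String); // prints false
    }
}
```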
(8) goal that where possible, javac issue a warning where runtime would
throw an exception, as is done today for ClassCastExceptions
If it can be established at compile time that a given construct would
result in a runtime error, then a compile time *error* is called for, I
would say - not just a warning. I think you are referring to the unchecked
warning regarding a potential CCE. But these are only potential runtime
errors, not guaranteed runtime errors, and so a warning makes sense there.
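The "potential, not guaranteed" distinction can be seen with today's unchecked-cast machinery: javac warns at the cast site, but the ClassCastException materialises only if the tainted path actually executes. A minimal sketch:

```java
import java.util.ArrayList;
import java.util.List;

// javac issues an unchecked warning at the cast below (suppressed here);
// the CCE is only a *potential* runtime error - it fires at the use site,
// and only if that path runs.
public class UncheckedDemo {
    public static void main(String[] args) {
        List<Integer> ints = new ArrayList<>();
        ints.add(1);

        @SuppressWarnings("unchecked")
        List<String> strings = (List<String>) (List<?>) ints; // warning site

        try {
            String s = strings.get(0);  // checkcast inserted here -> CCE
            System.out.println(s);
        } catch (ClassCastException cce) {
            System.out.println("CCE at use site, not at the cast");
        }
    }
}
```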
(9) value based classes are "supposed" to already not assume identity,
so we expect fewer surprises there
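The kind of identity assumption that can still surprise is illustrated by Integer, itself a value-based class: reference equality (==) is only reliable within the autobox cache (this sketch assumes the default cache bound of 127):

```java
// Identity on a value-based class is unreliable: == depends on caching.
public class IdentityDemo {
    public static void main(String[] args) {
        Integer a = 127, b = 127;   // within the default Integer cache
        Integer c = 128, d = 128;   // outside the cache: distinct boxes

        System.out.println(a == b);      // prints true  (same cached box)
        System.out.println(c == d);      // prints false (different boxes)
        System.out.println(c.equals(d)); // prints true  (value equality)
    }
}
```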
(10) javac would disallow calling java.lang.Object methods that do not
make sense for value types
(11) nullability handling for migration
I will explore with the team whether it makes sense to add a lint mode
for javac in JDK11 that will diagnose various problems with value-based
classes.
In summary I can readily see two requirements:
(a) A way to discriminate between a brand new value type and a value-based
class metamorphosing into a value type - so that the flattenable defaults
can be inverted.
(b) A lint mode in javac in JDK11, well before value types see the light
of day, that could alert users to coding patterns that would cause trouble
in a future migration to value types - though I am not sure this mode
won't be interpreted by the community as an implicit promise/guarantee of
sorts that value types will be coming soon.
With more arguments, I could convince myself that:
(c) a lenient mode is indeed called for.
(d) additional null check insertions are required.
Have I omitted anything ?