enhanced enums - back from the dead?
kevinb at google.com
Wed Dec 12 17:38:13 UTC 2018
Thanks Brian, that helps.
For example, the idea of switching on a compile-time constant has always
been conceptually 100% orthogonal to the type of that constant, and it
really will be simpler when one day we can just switch on anything. That's
a nice simplification *regardless* of whether anyone even uses the new
ability. Anyone who wanted to switch on a long would expect it to just
work, would have zero confusion about what the code meant if they
encountered it, etc. It becomes easily understood as something that *always
should have* worked -- there were just random "holes" in old versions that
eventually got filled in. It's the best case.
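[A minimal sketch of the "hole" being described; the class and method names below are mine, not from the thread. Today, dispatching on a long forces a fallback that dispatching on an int does not:]

```java
public class LongSwitchGap {
    // A long-valued code we'd like to switch on directly. Since classic
    // switch accepts int (and char/byte/short/String/enum) but not long,
    // we fall back to an if-else chain or a checked narrowing cast.
    static String describe(long code) {
        if (code == 0L) return "zero";
        else if (code == 1L) return "one";
        else return "other";
    }

    // The same logic once the value fits in int: switch just works, and
    // reads as the natural expression of "dispatch on a constant".
    static String describeInt(int code) {
        switch (code) {
            case 0: return "zero";
            case 1: return "one";
            default: return "other";
        }
    }

    public static void main(String[] args) {
        System.out.println(describe(1L));   // one
        System.out.println(describeInt(2)); // other
    }
}
```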
Thanks to your explanation I see that this case is like that one in theory.
It's not *quite* the same thing in practice. That is, I know that a person
can both read and write "switch (someLong)" with no puzzlement, but I don't
think the same is true here; enumness and genericness are orthogonal in
theory, but most people can only come to that understanding through a bit of
confusion first, and they can't necessarily guess the right syntax to use.
Secondly, it's hard to imagine there being any unforeseen consequences of
switching on a long. Here, I don't really know how to think it through yet.
I would appreciate any "here's how we know that the changes we need to
support this will have no user-visible consequences at all except for X, Y,
and Z" type of explanation we have (apologies if I missed it).
On Wed, Dec 12, 2018 at 9:01 AM Brian Goetz <brian.goetz at oracle.com> wrote:
> Yes, you could spin this feature as “it’s sort of for experts”, note the
> incremental complexity, and then raise the “why bother” flag. That’s a
> valid question.
> But, I prefer to look at this feature in a different way. This is not,
> IMO, just “let’s make enums do more.” This is about _simplifying_ the
> language model by removing gratuitous interactions between features. Enums
> are a constrained form of classes, one whose instances are singletons
> managed by the runtime. Because the user gives up instance control, we’re
> able to reward the user with things like equals/hashCode/toString,
> serialization, etc. That’s a good trade. Enums can still use most of the
> things that classes have — fields, methods, constructors, interfaces. But
> there’s no reason they can’t be generic, and allowing that would reduce the
> inessential differences between enums and classes.
> The other asymmetry is newer; since we inferred sharp types for anonymous
> classes when we did LVTI, inferring a weaker type for enum constants is now
> an asymmetry (one we were aware of when we did LVTI, but the plan all along
> was to align these.)
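[A small sketch of the asymmetry Brian mentions above; the example is mine, not from the thread. LVTI gives an anonymous class its sharp, member-complete inferred type, while an enum constant with a body is typed only as the enum itself:]

```java
public class SharpTypes {
    enum Op {
        PLUS {
            int apply(int a, int b) { return a + b; }
            // A member declared only on this constant's body:
            int arity() { return 2; }
        };
        abstract int apply(int a, int b);
    }

    public static void main(String[] args) {
        // LVTI infers the sharp anonymous-class type, so 'extra' is visible:
        var obj = new Object() {
            int extra() { return 42; }
        };
        System.out.println(obj.extra());

        // The enum constant's static type is just Op, so Op.PLUS.arity()
        // would not compile; only members declared on Op are accessible:
        System.out.println(Op.PLUS.apply(2, 3));
    }
}
```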
> I know that I personally have run into both of these limitations at least
> once in every large project I’ve ever done. You start out with an enum,
> for good reasons, then you hit the limits, and then have to refactor to
> classes, manually inflating the instance control boilerplate. It’s
> frustrating, in part, because it’s unnecessary. There’s nothing
> inconsistent about generic enums; it’s just an accidental “pick one”
> constraint that you discover when you find yourself wanting both.
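[A sketch of the refactor described above, with illustrative names of my own. Wanting enum-like constants that each carry a type parameter forces a hand-rolled class with manual instance control:]

```java
import java.util.List;

// What one might want to write as 'enum Setting<T>' has to become a final
// class with a private constructor and hand-maintained constants -- the
// instance-control boilerplate that enums normally provide for free.
public final class Setting<T> {
    public static final Setting<Integer> TIMEOUT = new Setting<>("TIMEOUT");
    public static final Setting<String>  NAME    = new Setting<>("NAME");

    private final String name;

    private Setting(String name) { this.name = name; }

    public String name() { return name; }

    // Manual stand-in for the values() method enums generate:
    public static List<Setting<?>> values() {
        return List.of(TIMEOUT, NAME);
    }

    @Override public String toString() { return name; }
}
```

With real generic enums, the constants above would keep their per-constant type arguments while equals/hashCode/serialization stayed managed by the runtime.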
> So, I prefer to look at this feature as “regularization” or “removing
> gratuitous interactions”, rather than “making enums more complicated.”
> (It’s in the same category as some other things we’re considering, such as
> allowing local methods, or refining the awful rules about static members in
> nested classes.) All of these are accidental sharp edges, that create
> unnecessary cognitive load for users to keep track of which features can be
> used in conjunction with which others.
> On Dec 12, 2018, at 11:40 AM, Kevin Bourrillion <kevinb at google.com> wrote:
> On Tue, Dec 11, 2018 at 12:25 PM Brian Goetz <brian.goetz at oracle.com> wrote:
>> This uber-conservative approach seems a pretty reasonable approach to
>> me; after all, enums are a language feature, and Enum<T> is a
>> constrained class (can't implement it directly), so it is not
>> unreasonable to define its typing wrt its supertypes specially.
>> So, let's get back to Maurizio's original question, which is: At one
>> point, we thought this feature was pretty cool, and invested some work
>> in it. Then we ran into a roadblock, and wrote it off. Now, we've got
>> a reasonable way to clear the roadblock. Which brings us back to:
>> - Do we still like the features described in JEP 301?
> What proportion of enum use cases benefit from this? Offhand, I would have
> to guess that it is less than 0.1% (and if it turned out to be *far* less
> it wouldn't *shock* me). Does anyone believe it's *likely enough* to be
> >0.1% that it's worth my effort to try to research that question in our
> codebase (which would take a bit of work)?
> If not, and we can stipulate that the need is rare, this means it will be
> a very special tool for very special people; a dark corner in the language
> feature set that most Java developers will never have need to know -- until
> they suddenly encounter code that uses it, upon which they need to invest
> an amount of effort understanding what is going on, even though they may
> have 14 years of believing they understood enums under their belts already.
> And if the need is this rare, this effort they put in might never be fully
> paid back.
> On the flip side, for a developer who does have this use case, and could
> benefit from the feature, what are the chances that developer will even
> know about it? It may be so special-purpose that we have no real reason to
> assume they'll know what to do.
> On top of this, it seems the feature apparently has a blast radius onto
> aspects of enum design that have previously been stable.
> Can a 0.1% use case ever really be worth this?
> Kevin Bourrillion | Java Librarian | Google, Inc. | kevinb at google.com