defender method syntax considered harmful

Per Bothner
Wed Sep 29 17:41:27 PDT 2010

On 09/29/2010 03:55 PM, Alex Buckley wrote:
> On 9/29/2010 11:10 AM, Per Bothner wrote:
>> So the syntax needs a lot stronger justification.  That this would
>> "violate the long-held rule that "interfaces don’t have code"" doesn't
>> mean anything - the whole point of this change is to *change* this
>> rule so that interface *can* have code.
> The goal is source-compatible interface evolution, not "method bodies in
> interfaces" per se.
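For concreteness, here is a minimal sketch of what source-compatible evolution means, using invented names and the by-value (method-body) default syntax under discussion: a method added later with a default does not break implementations compiled against the old interface.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical mini-interface; all names are invented for illustration.
interface MiniBag<E> {
    boolean add(E e);

    // Added in a later version of the interface. Because it has a default,
    // an implementation written against the old version still compiles.
    default boolean addAll(Iterable<? extends E> src) {
        boolean changed = false;
        for (E e : src) {
            changed |= add(e);
        }
        return changed;
    }
}

// An "old" implementation that only knows about add().
class ListBag<E> implements MiniBag<E> {
    final List<E> items = new ArrayList<>();
    public boolean add(E e) { return items.add(e); }
}
```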

A related but slightly different goal is to support "fat" interfaces
without requiring every implementation to implement all of the methods.
That avoids (or at least reduces) the need for the common anti-pattern of
using an interface Foo paired with an abstract class AbstractFoo.
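As a sketch of that point (names invented): with defaults, the derived operations of a "fat" interface can live in the interface itself, so implementations supply only the primitive methods and no AbstractFoo companion class is needed.

```java
// A "fat" interface with one abstract method and derived defaults,
// making a companion AbstractSequence class unnecessary. Names invented.
interface Sequence {
    int size();

    default boolean isEmpty() {
        return size() == 0;
    }

    default String describe() {
        return isEmpty() ? "empty" : "has " + size() + " element(s)";
    }
}

// An implementation supplies only the abstract method.
class Singleton implements Sequence {
    public int size() { return 1; }
}
```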

> Now, a good way to get source-compatible interface
> evolution is via defaults for abstract methods. Specifying defaults by
> value (i.e. method bodies) is plausible, but with the introduction of
> method references in the language, it is also reasonable to think of
> specifying defaults by reference.

The above also applies if you replace 'interface' by 'abstract class'.
Are you proposing to allow specifying by reference the defaults of
methods in an abstract (or even concrete but non-final) class?
That might actually have some use, I suspect.

>> So that leaves "by specifying the default by name rather than by
>> value, it is easier for the runtime to identify whether two defaults
>> are in fact the same method, which is important for conflict
>> resolution." I'm missing something there:
>> The runtime determines whether two defaults are the same method by
>> seeing if they're the same method: defined in the same interface, with the
>> same name and parameter list.  Is it because you might have two interfaces
>> that might want to have the same default method?  That would seem a
>> fairly rare use case, and can be easily solved by adding a new
>> super-interface.
> Fairly rare? Maybe, or maybe not.

Given the lack of real-world examples, I'm leaning to "fairly rare" ...
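A sketch of the "default by name" idea (all names invented; the by-reference syntax itself was never fixed, so this just shows the semantics via delegation to a named static method): if each default merely names a shared target, deciding whether two defaults "are the same" reduces to comparing targets rather than comparing method bodies.

```java
// All names invented. Two interfaces whose defaults delegate to the
// same named static method; "are these defaults the same?" becomes a
// question about the target, not about the bodies.
interface Counted { int size(); }

final class CountedDefaults {
    private CountedDefaults() {}
    static boolean isEmpty(Counted c) { return c.size() == 0; }
}

interface Bag extends Counted {
    default boolean isEmpty() { return CountedDefaults.isEmpty(this); }
}

interface Pile extends Counted {
    default boolean isEmpty() { return CountedDefaults.isEmpty(this); }
}
```

Note that with by-value defaults a class implementing both Bag and Pile would still have to override isEmpty(), since the compiler cannot see that the two bodies coincide; that is exactly the identification problem being debated.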

> In general, the interfaces implemented by a class do not all define the
> same abstract method. If they do, then the interfaces are likely to have
> a common superinterface - what I might call a "hyperinterface". A
> hyperinterface is an important high-level interface with not only many
> possible implementations but also many possible subinterfaces. A good
> example in the Java SE API is java.util.Collection.
> So, incompatible defaults are only going to be a problem where a class
> implements interfaces commonly under a hyperinterface. Because
> hyperinterfaces are few in number but rich in scope, it's certainly a
> problem to worry about. Yet, it's quite conceivable that subinterfaces
> of a hyperinterface will independently choose the same default for a
> given abstract method, especially high up in the interface hierarchy.

I don't think "conceivable" is good enough.  We need examples.
E.g. we might want to add a new method to Collection.  Perhaps Set and
List could use the same default implementation, while Map uses a different
default.  Then somebody defines a class that implements both Set and List.
We want that to be OK, since they use the same default.

The problem is that I can't think of a case where that might make sense.  Presumably
if you add a method to Collection, you need to add a default for it.
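The hypothetical above can be made concrete (invented names, by-value default syntax): two unrelated interfaces independently declare what is textually the same default, and a class implements both. With defaults specified by value, the compiler cannot tell that the defaults coincide, so the class has to disambiguate explicitly.

```java
// Invented names. ReadSide and WriteSide each supply the same-looking
// default; by value they count as distinct, so Channel must choose.
interface ReadSide {
    default String mode() { return "open"; }
}

interface WriteSide {
    default String mode() { return "open"; }
}

class Channel implements ReadSide, WriteSide {
    // Without this override the class would not compile: the two inherited
    // defaults conflict even though their bodies are identical.
    public String mode() { return ReadSide.super.mode(); }
}
```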

> We want to exploit that possibility. In finance terms, we get a good
> increase in alpha with minimal increase in beta.
> You mention adding a common superinterface where interfaces wish to
> choose the same default for an abstract method. That more or less
> assumes global recompilation.

I'm missing something.  The JLS says "Changing the direct superclass
or the set of direct superinterfaces of a class type will not break
compatibility with pre-existing binaries, provided that the total set
of superclasses or superinterfaces, respectively, of the class type
loses no members."

> But the challenge is precisely to preserve
> type soundness _under separate compilation_. That is, if a consumer
> (caller or implementer) can be compiled and run against a library, and
> then you add a default to a method in an interface in the library and
> recompile only that interface, does the consumer still compile and run?

Why not? Note that the problem we're discussing is when adding the
*same* default to *two* different interfaces, where neither extends the
other.  We still haven't seen an example of why you might want to
do this - or why (if you do find such an example) adding a
(possibly artificial) hyperinterface would break binary compatibility.
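The hyperinterface fix, as a sketch (invented names): hoisting the shared default into a common superinterface means the implementing class inherits one default from one place, so there is no conflict to resolve.

```java
// Invented names. The shared default lives in one hyperinterface;
// Left and Right merely inherit it, so Both sees a single default.
interface Hyper {
    default String mode() { return "open"; }
}

interface Left extends Hyper {}
interface Right extends Hyper {}

class Both implements Left, Right {}  // fine: one default, one source
```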

(You deal with this stuff all the time, so it may seem obvious
to you, but I think an explanation for the list would be useful.)

> You don't get to recompile every class that implements the interface or
> calls it.
	--Per Bothner

More information about the lambda-dev mailing list