Feedback: Using pattern context to make type patterns more consistent and manage nulls

Stephen Colebourne scolebourne at
Sun Jan 24 14:18:25 UTC 2021

When nulls were last discussed I expressed reservations about how they
were intended to flow through switch/patterns. It looks like the
latest approach is an improvement.

"So the alternative idea to explore is: make `case null` the _only_
nullable case, but relax fallthrough to allow falling through from
`case null` to a type pattern,"

  case Object o: // non-null
  case null, Object o:  // accepts null

This looks like it would be much better than the original proposal. It
argues that null-acceptance derives from the context, with the
proximity of `instanceof` and `case` driving null rejection. However,
I think it is possible to do better by following a slightly different
context-driven mental model.
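For concreteness, the quoted `case null` design can be written out as a
runnable sketch (the class and method names here are hypothetical, for
illustration only; the syntax assumes a compiler supporting null case
labels in pattern switches):

```java
// Sketch of the quoted design: `case null` is the only case that
// matches null, so every type pattern case sees a non-null value.
public class NullCaseDemo {
    static String describe(Object x) {
        return switch (x) {
            case null     -> "null";           // the only nullable case
            case String s -> "string: " + s;   // never sees null
            case Object o -> "other: " + o;    // never sees null
        };
    }

    public static void main(String[] args) {
        System.out.println(describe(null));   // null
        System.out.println(describe("hi"));   // string: hi
    }
}
```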

Based on recent discussions it seems like pattern contexts will become
a significant concept in the language. I think pattern contexts can be
used to make type patterns work more consistently. Indirectly this
allows developers to manage null in patterns.

It seems eminently reasonable, defensible and teachable to say that
`var o` and `Object o` behave differently in a pattern context than in
a non-pattern context. I argue that this is more consistent than
carving out a narrow rule that says a type pattern behaves differently
when prefixed by `instanceof` or `case`.

My proposal is that within a pattern context:
* `Foo o` requires the target to match the type Foo as per instanceof
* `var o` matches any type, inferring the type of the local variable

The implication for null is that:
* `Foo o` rejects null in pattern contexts
* `var o` accepts null in pattern contexts

As can be seen, this really isn't about null, but about making
type patterns actually work consistently. If the type pattern says
`Number n` it should always mean `n instanceof Number`, whereas today
it depends on whether the target of the pattern is Number or Object
(which isn't necessarily visible in the code) and rules around sealed
types, enums and totality (all very confusing). My proposed rule is
absolutely simple and consistent. It results in a huge improvement to
the readability of patterns - if a type is specified in the pattern
then it is matched, no ifs or buts. It focuses on matching the type of
the target, not the type of the method overload (which is a lot
simpler for the developer to comprehend at the use-site). The null
accept/reject outcome is secondary.
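For reference, the `instanceof` form of this rule is already observable in
shipped Java (16+), where a type pattern is exactly a type test and never
matches null (the class name below is hypothetical, for illustration):

```java
// Java 16+: a type pattern after instanceof is a pure type test.
public class TypeTestDemo {
    public static void main(String[] args) {
        Object x = 42;
        System.out.println(x instanceof Number n);  // true: 42 is a Number, n is bound
        x = null;
        System.out.println(x instanceof Number m);  // false: null never matches a type
    }
}
```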

This unifies behaviour between instanceof and switch:

 if (x instanceof Object o) ...  // o is non-null because it is in a pattern context
 switch (x) { case Object o ... }  // o is non-null because it is in a pattern context

This also unifies behaviour between the root and nested levels:

 if (x instanceof Box(Object o)) ...  // o is non-null because it is in a pattern context
 switch (x) { case Box(Object o) ... }  // o is non-null because it is in a pattern context

But this does not change any of the recent best practice code which
uses `var` when accepting "everything else":

 if (x instanceof Box(var o)) ...  // o may be null
 switch (box) {
   case Box(String s) -> ...  // s is not null
   case Box(Number n) -> ...  // n is not null
   case Box(var o) -> ...  // o may be null
 }
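A runnable version of the switch above can be sketched with Java 21 record
patterns (the `Box` record and `classify` method are hypothetical, invented
for illustration; the `var` tail case behaves the same under this proposal
as in shipped Java):

```java
// Record-pattern switch where only the `var` case accepts a null component.
public class BoxDemo {
    record Box(Object value) {}  // hypothetical record for illustration

    static String classify(Box box) {
        return switch (box) {
            case Box(String s) -> "string: " + s;   // s is not null
            case Box(Number n) -> "number: " + n;   // n is not null
            case Box(var o)    -> "other: " + o;    // o may be null
        };
    }

    public static void main(String[] args) {
        System.out.println(classify(new Box("a")));    // string: a
        System.out.println(classify(new Box(null)));   // other: null
    }
}
```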

What will actually happen is that the totality requirement of
expressions will push developers to use `var o` instead of `Object o`.
Which is a very good thing for readability as `var` will be a helpful
indicator of totality. But it also gives developers a new choice to
insert `case Box(Object o)` and guarantee a non-null object. For
example, the following expression switch would not compile because the
developer has not expressed what should happen with `Box(null)`:

 var result = switch (obj) {
   case Box(String str) -> ...
   case Box(Object o) -> ...
 };

The developer would either need to add a default clause, add a
`Box(var o)` clause at the end, or change from `Box(Object o)` to
`Box(var o)` (most likely). The advice to developers is to always use
`var` in patterns unless explicitly matching against a type.
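The `Box(var o)` fix described above can be sketched as a total expression
switch (again the `Box` record and `handle` method are hypothetical names,
and Java 21 record-pattern syntax is assumed):

```java
// Swapping `Box(Object o)` for `Box(var o)` makes the switch total,
// including over Box(null), so it compiles as an expression switch.
public class TotalBoxDemo {
    record Box(Object value) {}  // hypothetical record for illustration

    static String handle(Box obj) {
        return switch (obj) {
            case Box(String str) -> "string: " + str;
            case Box(var o)      -> "other: " + o;  // total: also covers Box(null)
        };
    }

    public static void main(String[] args) {
        System.out.println(handle(new Box(null)));  // other: null
    }
}
```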

I know that there is a desire to retcon a local variable declaration
to be a pattern. But I haven't broken that desire with anything I've
said above! A local variable declaration can still be thought of as a
pattern, but one that is *not* in a pattern context. A type pattern
outside a pattern context would not require instanceof-like matching
and thus would accept null, as today:

 Object o = someMethod();  // o may be null, because the pattern is not in a pattern context
 var o = someMethod();  // o may be null

And if a keyword like `let` were added to start a pattern context, you would have:

 let (Object o = someMethod()) {...}  // o is non-null, because it is in a pattern context
 let (var o = someMethod()) {...}  // o may be null

With a nested pattern local variable there *is* a pattern context
(implied by the nest). So the desired short syntax is still possible:

 Box(Object o) = someMethod();  // does not compile, as it is not total
 Box(var o) = someMethod();  // o may be null

In summary, I've outlined how pattern contexts can be used to define
more consistent type pattern behaviour with big readability benefits.
The type pattern `String s` would always behave like instanceof
(rejecting null), whether at the root or nested, so long as it is
within a pattern context. The `var o` pattern doesn't specify a type
to match thus it naturally accepts null, meaning that all the best
practice code examples still work. Local variables are not in a
pattern context, so work as they do today.


More information about the amber-dev mailing list