joe.darcy at oracle.com
Wed Apr 11 20:00:26 PDT 2012
On 4/11/2012 7:45 AM, Rémi Forax wrote:
> On 04/11/2012 04:18 PM, Benedict Elliott Smith wrote:
>> Looking at the source code, it doesn't appear as though there is any
>> reason to require the input be a String - only length() and charAt() are
>> used, which are both declared by CharSequence. Is there a reason for
>> this, or was it an oversight? It would be nice to widen the type for this
>> (and potentially other equivalent methods).
> Integer.parseInt (1.0 or 1.1) pre-dates CharSequence (1.4);
> that's why it uses a String and not a CharSequence.
> If you don't want to break all already compiled programs,
> you can't just replace String by CharSequence because the exact
> signature of the method
> (with the parameter types) is encoded in the bytecode.
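To make the point above concrete: the JVM links each call site by the exact method descriptor recorded in the class file, so swapping the parameter type would break every already-compiled caller. A small sketch (the descriptor comments reflect the standard JVM encoding):

```java
// Compiled callers of Integer.parseInt(String) embed the descriptor
//     java/lang/Integer.parseInt:(Ljava/lang/String;)I
// in their class files. If the parameter type were changed to
// CharSequence, the method's descriptor would become
//     (Ljava/lang/CharSequence;)I
// and existing class files would fail to link (NoSuchMethodError),
// even though the source would still compile fine.
public class DescriptorDemo {
    public static void main(String[] args) {
        // This call site is resolved against (Ljava/lang/String;)I.
        int n = Integer.parseInt("42");
        System.out.println(n);
    }
}
```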
> Joe Darcy wrote a cool blog post on that.

That is a kinder description of the blog post than I would expect :-)
FYI, a fuller exploration of that issue in a broader context is written
up in: https://blogs.oracle.com/darcy/entry/kinds_of_compatibility

> The best here is to add new methods that take a CharSequence, move the
> code that uses a String into them, and change the method that takes a
> String to delegate to the one that uses a CharSequence.
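The delegation Rémi describes could look like the following minimal sketch. The class and method names here are hypothetical (this is not the real java.lang.Integer code, and overflow handling near Integer.MIN_VALUE is elided):

```java
// Sketch of the binary-compatible delegation pattern: the existing
// String overload keeps its exact signature, so already-compiled
// callers still link, and forwards to a new CharSequence overload
// that holds the actual parsing logic.
public final class Ints {

    // Existing method: signature unchanged for binary compatibility.
    public static int parseInt(String s) {
        return parseInt((CharSequence) s);
    }

    // New method: only length() and charAt() are needed, so any
    // CharSequence will do.
    public static int parseInt(CharSequence cs) {
        if (cs == null || cs.length() == 0)
            throw new NumberFormatException("empty input");
        int i = 0;
        boolean negative = false;
        char first = cs.charAt(0);
        if (first == '-' || first == '+') {
            negative = (first == '-');
            i = 1;
            if (cs.length() == 1)
                throw new NumberFormatException("no digits");
        }
        int result = 0;
        for (; i < cs.length(); i++) {
            int digit = Character.digit(cs.charAt(i), 10);
            if (digit < 0)
                throw new NumberFormatException("bad digit in: " + cs);
            // Accumulate negated so the loop body is uniform; full
            // overflow checks are omitted in this sketch.
            result = result * 10 - digit;
        }
        return negative ? result : -result;
    }
}
```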
Remi and I have in the past had differences of opinion on the utility of
introducing CharSequence versions of such methods.
One benefit of using a String is that the object is immutable; there are
no time-of-check-versus-time-of-use conditions to worry about. Robust
code should arguably work sensibly even with mutable CharSequences, and
the easiest way to ensure that is to call the toString method of a
CharSequence passed as a parameter.
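A minimal sketch of that hazard and the defensive copy (the class and method names are made up for illustration): a mutable CharSequence such as a StringBuilder can change between the moment it is validated and the moment it is used, so robust code snapshots it to an immutable String once at entry.

```java
// Time-of-check vs. time-of-use with a mutable CharSequence, and the
// defensive snapshot that sidesteps it.
public final class SafeParse {

    // Fragile: re-reads the (possibly mutable) sequence on every access.
    // If another thread mutates a StringBuilder argument, length() and
    // charAt() can return different answers across the loop.
    static boolean fragileIsDigitsOnly(CharSequence cs) {
        for (int i = 0; i < cs.length(); i++)      // time of check...
            if (!Character.isDigit(cs.charAt(i)))  // ...may differ from time of use
                return false;
        return true;
    }

    // Robust: take one immutable copy via toString(), then work only on
    // the copy, which no other thread can change.
    static boolean isDigitsOnly(CharSequence cs) {
        String snapshot = cs.toString();
        for (int i = 0; i < snapshot.length(); i++)
            if (!Character.isDigit(snapshot.charAt(i)))
                return false;
        return true;
    }
}
```

Note that when the argument already is a String, String.toString() returns the same object, so the snapshot costs nothing in the common case.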