Shouldn't InputStream/Files::readAllBytes throw something other than OutOfMemoryError?

Christoph Engelbert
Sun Mar 12 14:53:43 UTC 2017

Hey Anthony,

The meaning is already overloaded: "Cannot create native thread" is
also an OutOfMemoryError, and in roughly 99% of cases it means
"Linux ran out of file handles". The chance that the OS really
couldn't allocate a thread because no main memory was available is
very slim :)


Am 3/12/2017 um 3:24 PM schrieb Anthony Vanelverdinghe:
> Files::readAllBytes is specified to throw an OutOfMemoryError "if
> an array of the required size cannot be allocated, for example the
> file is larger than 2G". Now in Java 9, InputStream::readAllBytes
> does the same.
> However, this overloads the meaning of OutOfMemoryError: either
> "the JVM is out of memory" or "the resultant array would require
> long-based indices".
> In my opinion, this overloading is problematic, because:
> - OutOfMemoryError has very clear semantics, and I don't see the
> link between OOME and the fact that a resultant byte[] would need
> to be >2G. If I have 5G of free heap space, and try to read a 3G
> file, I'd expect something like an UnsupportedOperationException,
> but definitely not an OutOfMemoryError.
> - the former meaning is an actual Error, whereas the latter is an
> Exception from which the application can recover.
> - developers might be tempted to catch the OOME and retry reading
> the file/input stream in chunks, no matter the cause of the OOME.
> What was the rationale for using OutOfMemoryError here? And would
> it still be possible to change this before Rampdown Phase 2?
> Kind regards,
> Anthony
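For reference, the chunked-reading alternative Anthony alludes to can be sketched roughly as below. This is not JDK code, just an illustration of how a caller can avoid ever needing a single array larger than Integer.MAX_VALUE; the class and method names are made up for the example.

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;

public class ChunkedRead {

    // Read the whole stream as a list of fixed-size chunks, so no single
    // byte[] ever needs more than Integer.MAX_VALUE elements.
    static List<byte[]> readAllChunks(InputStream in, int chunkSize) throws IOException {
        List<byte[]> chunks = new ArrayList<>();
        byte[] buf = new byte[chunkSize];
        int filled = 0;
        int n;
        while ((n = in.read(buf, filled, chunkSize - filled)) != -1) {
            filled += n;
            if (filled == chunkSize) {   // buffer full: keep it, start a fresh one
                chunks.add(buf);
                buf = new byte[chunkSize];
                filled = 0;
            }
        }
        if (filled > 0) {                // trim the partially filled last buffer
            byte[] last = new byte[filled];
            System.arraycopy(buf, 0, last, 0, filled);
            chunks.add(last);
        }
        return chunks;
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[10];
        for (int i = 0; i < data.length; i++) data[i] = (byte) i;
        List<byte[]> chunks = readAllChunks(new ByteArrayInputStream(data), 4);
        System.out.println(chunks.size()); // 10 bytes in 4-byte chunks -> 3 chunks
    }
}
```

With this shape, a file larger than 2G would simply produce more chunks, and an OutOfMemoryError would then genuinely mean the heap is exhausted rather than "the result doesn't fit in one array".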

More information about the core-libs-dev mailing list