Blur effect on live scene?

Jim Graham james.graham at
Thu Aug 13 16:46:34 UTC 2015

One more complication to note - in some cases an application might want 
to blur an entire underlying node/pane.  In other cases an application 
might want to blur a piece of an underlying node/pane that lies behind 
an overlay node.

Those two cases would be accomplished in different ways with the current 
JavaFX APIs.  The first case would simply involve adding a blur effect 
to the underlying node (and possibly turning on its cache hint for 
performance if it is relatively static).  The second case could only be 
done either by taking a snapshot of the underlying nodes, or by creating 
a duplicate copy of the underlying tree and setting a blur effect and a 
clip (and possibly cache=true) on the copy and placing it behind the 
overlay node...
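For concreteness, the two approaches might look roughly like this.  This 
is a sketch only: the class name, the node parameters, and the blur 
radius of 20 are illustrative, and the snapshot variant must be re-run 
whenever the underlying content changes.

```java
import javafx.scene.CacheHint;
import javafx.scene.Node;
import javafx.scene.SnapshotParameters;
import javafx.scene.effect.GaussianBlur;
import javafx.scene.image.ImageView;
import javafx.scene.image.WritableImage;
import javafx.scene.layout.Pane;
import javafx.scene.layout.Region;
import javafx.scene.shape.Rectangle;

public final class BlurExamples {

    // Case 1: blur the whole underlying node in place.
    static void blurWholeNode(Node underlying) {
        underlying.setEffect(new GaussianBlur(20));
        underlying.setCache(true);                // keep a cached texture
        underlying.setCacheHint(CacheHint.SPEED); // helps if the node is static
    }

    // Case 2: blur only the area behind an overlay, via snapshot.
    static void blurBehindOverlay(Pane root, Node underlying, Region overlay) {
        WritableImage snap =
                underlying.snapshot(new SnapshotParameters(), null);
        ImageView blurredCopy = new ImageView(snap);
        blurredCopy.setEffect(new GaussianBlur(20));
        // Show only the part of the blurred copy that lies under the overlay.
        blurredCopy.setClip(new Rectangle(
                overlay.getLayoutX(), overlay.getLayoutY(),
                overlay.getWidth(), overlay.getHeight()));
        // Insert the blurred copy just below the overlay in the z-order.
        root.getChildren().add(
                root.getChildren().indexOf(overlay), blurredCopy);
    }
}
```

Note that Node.snapshot copies pixels back to main memory, which is the 
cost discussed below.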


On 8/10/2015 11:29 AM, Jim Graham wrote:
> Let me understand what is going on here.
>
> I get the result you are trying to achieve - blur the scene as
> background for something.
>
> I get that Mac and iOS seem to have direct support for this technique
> which appears to work faster than what we provide via the Effect mechanism.
>
> I also get that attempts to make it appear better via snapshot will
> unfortunately involve a copy to main memory to produce the "Image" object.
>
> If we had a "snapshot to texture" mechanism then that might reduce the
> memory copying of the work around technique.  I'd argue that we sort of
> do have something like that - it is the cache flag.  If a Node is cached
> then we do have a copy of it in a texture and that can help make the
> Blur Effect work more efficiently, but there may be some additional
> copies between textures if everything isn't set up right.  Still, that
> is an avenue for someone to check to see if there isn't a better way to
> achieve this effect in the short term...
>
> There is the implication that one can add a shader to an overlay texture
> that will cause it to have a dynamic "blurred lens" effect.  I'm not
> familiar with how that would be done.  AFAIK, shaders work on inputs and
> produce an output which is transferred to the destination using the
> pixel transfer functions and you can't source the destination pixels in
> the shader in order to blur them.  I would imagine that the Mac/iOS
> technique is done by sourcing directly from the back buffer into the
> overlay texture using a blurring shader.  That gives the overlay texture
> a solid background that is a blurred copy of the back buffer.  They then
> draw the overlay contents (menu bar?) on top of that blurred background
> data and transfer the overlay texture back into the scene.  The blur
> you are seeing is not "the pixels being blurred through the overlay
> texture" but rather a very quickly managed blurred copy of the data
> in the underlying buffer.  If the scene changes, then the entire
> process would need to be repeated on the new underlying pixels to get a
> new blurred copy of them as background in the overlay texture.  I can
> also imagine that they may have more direct support for blurring (there
> is an OpenGL EXT_convolution extension which we do not use - using our
> own convolution shaders instead - which may potentially work faster than
> what we do).  Also, they may be a little more efficient at managing the
> buffers involved in that dynamic operation (our Decora Effect engine
> isn't necessarily set up to use the "back buffer" as a source and so
> sometimes we may have to render parts of a scene an extra time specially
> to make an input texture for Decora).
>
> If I'm understanding all of this correctly, then it seems that:
> - it may be time to investigate tighter integration of Decora and Prism
> texture mechanisms (they use the same underlying objects, but don't have
> a good way to share pre-existing textures with each other).
> - Node.cache may provide some additional short-term techniques for
> achieving this effect
> - EXT_convolution may help us with performance if it is available
> - if worse comes to worst, we might have to add "blurred parent" as a
> direct feature of a Node to streamline the processing of this effect,
> but hopefully we can get there with some combination of the above.
>
> Or, if someone can enlighten me on some techniques they may be using
> that I'm not familiar with - please do!
>
>              ...jim
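
For anyone who needs a usable short-term path today, the cache-flag 
avenue mentioned above can be combined with an iterated BoxBlur, which 
approximates a Gaussian and can be cheaper than a large-radius 
GaussianBlur convolution.  A minimal sketch (the class and method names 
and the blur parameters are illustrative, not an established API):

```java
import javafx.scene.CacheHint;
import javafx.scene.Node;
import javafx.scene.effect.BoxBlur;

public final class FastBlur {
    // Cache-flag avenue: keep the node's rendering in a texture and
    // blur it with an iterated box blur.  Three box-blur passes
    // approximate a Gaussian blur.
    static void apply(Node background) {
        background.setEffect(new BoxBlur(10, 10, 3)); // width, height, iterations
        background.setCache(true);                    // retain rendered texture
        background.setCacheHint(CacheHint.SPEED);     // favor reuse over quality
    }
}
```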

More information about the openjfx-dev mailing list