Embed VLC video output into a Scene Graph

John Hendrikx hjohn at xs4all.nl
Tue Jul 17 14:49:25 PDT 2012

On 17/07/2012 19:20, David DeHaven wrote:
>> I'm trying to embed a VLC player in a JavaFX 2.2 application.
>> For that I'm using the libvlc.dll library's API.
>> VLC provided two ways to render a video:
>> 1. Call API function that takes HWND: libvlc_media_player_set_hwnd (libvlc_media_player_t *p_mi, void *drawable) Set a Win32/Win64 API window handle (HWND) where the media player should render its video output.
>> 2. Register a callback that gets called every time a frame should be rendered: libvlc_video_display_cb (void *opaque, void *picture) Callback prototype to display a picture.
>> With the HWND approach, I managed to extract an HWND pointer of the Scene using the Glass framework, but this causes the video to render over the whole scene; no other nodes are visible.
>> With the callback approach, and by using the new JavaFX 2.2 javafx.scene.canvas.Canvas node, I was able to throw pixels at the canvas, but it turns out to be very inefficient in terms of memory and thread usage.
>> I would love to have a solution like AWT + JNA: the HWND can be extracted from a java.awt.Canvas component (using JNA) and given to the libvlc API function. That canvas can be placed anywhere in the frame, not occupying the entire window.
>> Hard Guidelines:
>> 1. Integrating AWT/Swing with FX application is not an option, I want it to be a pure JavaFX application.
>> 2. Using JavaFX Media API for playback is not an option, since I have a requirement to modify the media player's source code.
>> 3. The video must not occupy the entire window, allowing other components to be visible.
>> A possible solution is to create my own node or control that wraps a Glass View that has an HWND, but I don't know how to bind it and make it work with other Glass components.
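For reference, the callback route described above corresponds to libvlc's libvlc_video_set_callbacks / libvlc_video_set_format pair: VLC decodes into a buffer you supply and invokes your display callback once per frame, on its own thread. Whatever node ends up showing the pixels, the hand-off to the FX thread has to be thread-safe and must drop frames the UI was too slow to consume. Here is a minimal, pure-Java sketch of that hand-off; the FrameSink class and the assumption of 32-bit ARGB frames are mine, not part of libvlc or any particular JNA binding:

```java
import java.util.concurrent.atomic.AtomicReference;

/**
 * Hand-off between VLC's decoder thread and the FX thread.
 * The decoder publishes the newest frame; the UI claims it once
 * per pulse, and any frame it missed is simply overwritten.
 */
final class FrameSink {
    private final AtomicReference<int[]> latest = new AtomicReference<>();

    /** Called from the libvlc display callback (decoder thread). */
    void publish(int[] argbPixels) {
        latest.set(argbPixels);          // overwrite: newest frame wins
    }

    /** Called from the FX thread; returns null if nothing new arrived. */
    int[] take() {
        return latest.getAndSet(null);   // claim the frame exactly once
    }
}

public class FrameSinkDemo {
    public static void main(String[] args) {
        FrameSink sink = new FrameSink();
        sink.publish(new int[] { 0xFF0000FF });     // "decoder" pushes one blue pixel
        int[] frame = sink.take();                  // "UI" claims it
        System.out.println(frame[0] == 0xFF0000FF); // true
        System.out.println(sink.take() == null);    // true: consumed exactly once
    }
}
```

In a real binding the publish() call would sit inside the display callback registered through libvlc_video_set_callbacks, and take() would run on the FX thread (for example from an AnimationTimer).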
> If you start digging into private APIs you're dooming your app to crash and burn at the next update; this is especially true for 3.0, where a lot of under-the-hood changes are expected to be made.
If there was a solution that allowed full 1080p playback of popular 
containers like avi and mkv, even one that required a little bit of 
hacking and was guaranteed to break when 3.0 is released, I'd take it.  
As it is, 99% of videos need to be transcoded before the standard media 
framework will play them.  Hence the need for alternatives: VLC, 
GStreamer, Xuggler, and so on.

> The canvas approach is interesting, but wasteful as it allocates far more resources than necessary.
Yup.  All we need, though, is direct access to a texture or heavyweight 
component, preferably with an HWND, so embeddable players can be told 
to render their output directly to it.
> If you get a raster from VLC, then just create an image from the raster. That's what Jasper did for JavaOne last year with the video preview portion of his Microsoft Kinect demo, just create a new image for each frame (sounds wasteful, I know, but not really much different than how MediaView works internally). When you create an image, it gets uploaded to a D3D/GL texture if you're using hardware rendering so the actual rendering doesn't suffer any performance drawbacks.
This is also wasteful -- so much so that you can forget smooth playback 
of the format that matters at this moment, full HD.  Throwing around 
multi-megabyte textures at a rate of 25 frames per second is, in my 
experience, not feasible without a considerable performance penalty -- 
that's why that approach was abandoned ten years ago.
> I think you may need to look at WritableImage for this as I'm not sure if the method used for JavaOne is public or even exists any more… WritableImage allows you to change the image contents on the fly, so it should provide what you need without all the overhead of a Canvas node.
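For what it's worth, WritableImage does take raw frames fairly directly: PixelWriter.setPixels accepts a byte[] together with a PixelFormat, and PixelFormat.getByteBgraInstance() matches the 32-bit "RV32"-style chroma VLC can be asked to output, so no per-pixel conversion is needed on the Java side. Since the interesting part is the pixel layout, here is a plain-Java sketch of the BGRA-bytes-to-ARGB-ints packing that the PixelWriter performs internally; the helper name is mine, and the actual JavaFX call is noted in the comments:

```java
public class BgraPacking {
    /**
     * Pack BGRA bytes into ARGB ints -- the same interpretation that
     * writableImage.getPixelWriter().setPixels(0, 0, w, h,
     *     PixelFormat.getByteBgraInstance(), frame, 0, w * 4)
     * applies to the byte[] it is handed.
     */
    static int[] bgraToArgb(byte[] bgra) {
        int[] argb = new int[bgra.length / 4];
        for (int i = 0; i < argb.length; i++) {
            int b = bgra[4 * i]     & 0xFF;
            int g = bgra[4 * i + 1] & 0xFF;
            int r = bgra[4 * i + 2] & 0xFF;
            int a = bgra[4 * i + 3] & 0xFF;
            argb[i] = (a << 24) | (r << 16) | (g << 8) | b;
        }
        return argb;
    }

    public static void main(String[] args) {
        // One opaque red pixel in BGRA byte order: B=0, G=0, R=255, A=255.
        byte[] frame = { 0, 0, (byte) 0xFF, (byte) 0xFF };
        System.out.printf("%08X%n", bgraToArgb(frame)[0]); // FFFF0000
    }
}
```

With JavaFX itself you would skip this conversion entirely and hand the raw byte[] straight to setPixels with the matching PixelFormat; the sketch only makes the layout explicit.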
> What I'm more interested in is why you need to use VLC? What's missing in the media stack that you need? File feature requests in JIRA and we can investigate including them in future releases...
Well, for me, VLC is needed because it actually plays the videos I 
throw at it (avi, mkv and mpg being the three most common containers).  
It also plays streams, like those you can get from YouTube (through a 
simple Google API).  Conversion is not an option -- it needs to play 
the files users already have.  What if Java could only decode PNG image 
files?  I cannot expect users to transcode all of their images so my 
app can read them.

As far as I'm concerned, the current MediaView is a neat trick that 
lets JavaFX claim video playback support, but in reality it is only 
useful in a small niche where you provide or encode your own videos in 
the correct format (or if you enjoy watching Big Buck Bunny).

So we're left with hacks: trying to use Canvas or WritableImage, 
embedding Swing, overlaying other windows on top of ours, obtaining 
HWNDs in the hope that a player can render its data in the background 
of a Stage with JavaFX nodes on top... or, as I have done, opening two 
windows (an AWT Frame with a Stage on top of it), making the Stage 
transparent and having fully hardware-accelerated video render to the 
Frame.

I certainly wouldn't mind a better solution; there are lots of nice 
things I could do in my app if the video were a Node like the rest of 
the visible controls... so I'm hoping that eventually the GStreamer 
lite stack can simply be replaced by one that plays back all the common 
formats, but I'm afraid that's not going to happen in a reasonable time 
frame.


More information about the openjfx-dev mailing list