Accelerated Video Decoding and Rendering in QML with Necessitas Qt 4 for Android
Ok, I did this a couple of months ago, but I just realized it might help someone who is still using Necessitas Qt 4 for a project and cannot move to Qt 5 yet.
This is sample code showing how to create a custom QML component in the Necessitas Qt 4 port that uses hardware acceleration on any Android device running API level 11 or higher. The result is pretty good; you can check the demo I uploaded to YouTube a couple of months ago (the third application shown is the one implemented on Qt 4):
The description says it all: “The third sample code uses a custom QML component written in C++ using a Qt 4.8.2 port for Android (Necessitas). Regular QML animations are then applied to the custom component. The semi-transparent image is a regular QML Image element with alpha set to 0.5.”
The code is now available on GitHub: https://github.com/carlonluca/TextureStreaming. The TextureStreaming project can be opened with Qt Creator and run on a device (assuming the API level constraint is met).
Keep in mind that the Qt developers are working on a QtMultimedia backend for Android, which I think should be available in Qt 5.2. You might want to try that as well: http://qt.gitorious.org/qt/qtmultimedia/trees/dev/src/plugins/android.
How it Works
As you can see from the code, a custom QML component is placed in the QML scene. That component instantiates some Java classes through JNI glue code and uses the standard Android MediaPlayer to decode video and play audio. The sink is set to a SurfaceTexture instance, which provides the OpenGL texture that the custom QML component renders in the QML scene. The result is pretty good.
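To make the flow above concrete, here is a minimal sketch of what the Java side can look like. The class and method names are hypothetical, not the ones from the repository; the only assumptions are the standard android.media.MediaPlayer and android.graphics.SurfaceTexture APIs:

```java
import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.view.Surface;

// Hypothetical helper illustrating the approach: the C++ QML item
// creates a GL texture, passes its id here through JNI, and repaints
// whenever a new decoded frame lands in the SurfaceTexture.
public class TextureStreamer implements SurfaceTexture.OnFrameAvailableListener {
    private MediaPlayer mediaPlayer;
    private SurfaceTexture surfaceTexture;

    // texName is an OpenGL texture id (GL_TEXTURE_EXTERNAL_OES target)
    // created on the Qt rendering thread and handed over via JNI.
    public void start(String path, int texName) throws Exception {
        surfaceTexture = new SurfaceTexture(texName);
        surfaceTexture.setOnFrameAvailableListener(this);

        mediaPlayer = new MediaPlayer();
        mediaPlayer.setDataSource(path);
        // Route decoded frames into the SurfaceTexture. Note that
        // setSurface(Surface) and the Surface(SurfaceTexture) constructor
        // are public only from API level 14; on API 11-13 the hidden
        // MediaPlayer.setTexture() had to be reached (e.g. via reflection).
        mediaPlayer.setSurface(new Surface(surfaceTexture));
        mediaPlayer.prepare();
        mediaPlayer.start();
    }

    @Override
    public void onFrameAvailable(SurfaceTexture st) {
        // Notify the native side (through a JNI callback) that a frame is
        // ready; the QML item then schedules a repaint.
    }

    // Must be called with the GL context current, just before drawing:
    // latches the newest frame into the texture passed to start().
    public void updateFrame() {
        surfaceTexture.updateTexImage();
    }
}
```

On the C++ side, the custom QML item simply binds that external texture and draws a textured quad in its paint method, so regular QML animations and opacity compose on top of it for free.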