Decoding and Rendering to Texture H264 with OpenMAX on Raspberry Pi


After managing to decode and render compressed image formats directly into OpenGL textures using OpenMAX, I looked into how to use the same approach to render an H264 stream directly into a texture with the same OpenMAX components on my Raspberry Pi.
As a starting point I took the sample code I used in this post, together with the hello_video sample code provided by Broadcom in VideoCore.
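The core idea, as in the hello_video and hello_videocube samples, is to tunnel the video_decode component into egl_render, which writes decoded frames into an EGLImage bound to an OpenGL texture instead of the display. The sketch below shows only that skeleton: component and port names come from the Broadcom samples, while state transitions, the port-settings-changed handshake, and all error handling are omitted, so treat it as an outline rather than working code.

```c
// Sketch of the OpenMAX pipeline, adapted from Broadcom's
// hello_video/hello_videocube samples (not the complete code).
#include <string.h>
#include "bcm_host.h"
#include "ilclient.h"

static OMX_BUFFERHEADERTYPE *eglBuffer;

void setup_pipeline(ILCLIENT_T *client, void *eglImage)
{
   COMPONENT_T *video_decode = NULL, *egl_render = NULL;
   TUNNEL_T tunnel[2];
   memset(tunnel, 0, sizeof(tunnel));

   // video_decode receives the raw H264 stream on its input port (130).
   ilclient_create_component(client, &video_decode, "video_decode",
         ILCLIENT_DISABLE_ALL_PORTS | ILCLIENT_ENABLE_INPUT_BUFFERS);

   // egl_render renders decoded frames into an EGLImage.
   ilclient_create_component(client, &egl_render, "egl_render",
         ILCLIENT_DISABLE_ALL_PORTS | ILCLIENT_ENABLE_OUTPUT_BUFFERS);

   // Tunnel video_decode's output port (131) into egl_render's input (220).
   set_tunnel(tunnel, video_decode, 131, egl_render, 220);

   // ... feed H264 data with OMX_EmptyThisBuffer, wait for the
   // port settings changed event, then establish the tunnel ...

   // Bind egl_render's output port (221) to the texture's EGLImage.
   OMX_UseEGLImage(ILC_GET_HANDLE(egl_render), &eglBuffer, 221, NULL, eglImage);

   // Request the first frame: fill_buffer_done fires once a decoded
   // frame has landed in the EGLImage; call OMX_FillThisBuffer again
   // from that callback to keep frames coming.
   OMX_FillThisBuffer(ILC_GET_HANDLE(egl_render), eglBuffer);
}
```

Note that the fill buffer callback is only triggered after OMX_FillThisBuffer has been issued on egl_render's output port with the EGLImage buffer in place; until then only empty_buffer_done callbacks (from feeding the decoder input) will be seen.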

This is the code I wrote to make it work: PiOmxTextures_2.0.tar.bz2 (version 2.0). Notice that this is only a collection of notes, which compiles and seems to work fine on the wheezy Raspberry Pi image; it is not a fully reusable component yet. I'm still working on that.
The code is pretty messy and much of it is not fully implemented. Error management is almost nonexistent, but it can still be useful to work out how to make things work. The rest is up to you 😉
This is a video illustrating the sample code running:

To compile the code, refer to this article. You will need the Qt 5.0 libraries (version 4.0 might be sufficient) built with the eglfs platform plugin. Hope this helps!
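As a usage sketch, a Qt application built this way is typically launched full-screen on the framebuffer via the eglfs plugin. The binary name and argument form below are illustrative only; check the sources in the tarball for the actual command line.

```shell
# Hypothetical invocation; binary name and arguments are examples only.
./PiOmxTextures -platform eglfs /path/to/video.h264
```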

4 thoughts on “Decoding and Rendering to Texture H264 with OpenMAX on Raspberry Pi”

  1. I am trying to implement your code, but I never get a callback from the one set with ilclient_set_fill_buffer_done_callback.

     I do, however, get one for the empty buffer (ilclient_set_empty_buffer_done_callback).

     Any insight into what triggers the fill buffer callback?
