I am recording H.264 streams using tools like raspivid. Now I want to play them back, ideally in an environment where my analysis code can consume the decoded stream instead of the live camera, for repeatability, playback, and debugging.
I've picked apart the various "userland" raspicam utilities, and understand how MMAL interfaces with the hardware.
I see that there is a video decoder in the videocore:
#define MMAL_COMPONENT_DEFAULT_VIDEO_DECODER "vc.ril.video_decode"
However, I cannot find any sample showing how to use it. Presumably I wire the decoder's output either to a preview component, or to something that takes raw buffers and displays them. My guess is that I create a pool of buffers for the input data, and push those buffers at the decoder's input port?
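To make the question concrete, this is the kind of setup I am imagining, pieced together from reading the MMAL headers. It is completely untested; `stream` (the opened .h264 file) and `input_cb` (the input-port callback) are placeholder names of mine, not anything from the API:

```c
#include <stdio.h>
#include "interface/mmal/mmal.h"
#include "interface/mmal/util/mmal_util.h"
#include "interface/mmal/util/mmal_default_components.h"

/* Placeholder: callback that MMAL invokes when an input buffer is done */
static void input_cb(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer);

static void feed_decoder(FILE *stream)
{
    MMAL_COMPONENT_T *decoder;
    mmal_component_create(MMAL_COMPONENT_DEFAULT_VIDEO_DECODER, &decoder);

    /* Tell the input port it will receive an H.264 elementary stream */
    MMAL_PORT_T *input = decoder->input[0];
    input->format->encoding = MMAL_ENCODING_H264;
    mmal_port_format_commit(input);

    /* Size the pool from the port's recommended values */
    input->buffer_num  = input->buffer_num_recommended;
    input->buffer_size = input->buffer_size_recommended;
    MMAL_POOL_T *pool = mmal_port_pool_create(input, input->buffer_num,
                                              input->buffer_size);

    mmal_port_enable(input, input_cb);
    mmal_component_enable(decoder);

    /* Feed loop: wait for a free buffer, fill it, send it to the decoder */
    MMAL_BUFFER_HEADER_T *buf;
    while ((buf = mmal_queue_wait(pool->queue)) != NULL) {
        buf->length = fread(buf->data, 1, buf->alloc_size, stream);
        if (buf->length == 0)
            break;
        mmal_port_send_buffer(input, buf);
    }
}
```

Is this roughly the right shape, or am I missing a step (output port setup, connections, etc.)?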
Will the pool mechanism make sure that the buffers go back in the pool when the decoder is done with them?
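My working assumption here is that MMAL hands exhausted input buffers back through the input port's callback, and that calling mmal_buffer_header_release() on a pool-allocated buffer puts it back on the pool's queue, so recycling would just be:

```c
#include "interface/mmal/mmal.h"

/* Input port callback: MMAL returns the buffer here once the decoder
   has consumed it; releasing a pool-allocated buffer should put it
   back on pool->queue (if I've understood the headers correctly). */
static void input_cb(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer)
{
    (void)port;  /* unused */
    mmal_buffer_header_release(buffer);
}
```

Is that all that's required, or does the application have to requeue buffers explicitly?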
Do the buffers need to respect any particular boundaries (NAL units, say), or can I send arbitrary 128 kB chunks and the decoder will parse out the frames as appropriate?
How does the timing of playback frames work: does the decoder pace the output itself, or do I receive timestamps that I then have to schedule against myself?
I can see the hello_video sample in the userland directory, but it uses OpenMAX IL (OMX), which is a different API from MMAL.
Can I use OpenMAX IL and MMAL in the same program? Are the buffers interchangeable, as in, can I forward an opaque OMX-decoded buffer to the input port of an MMAL component?