
ffmpeg: A/V decode and render API supporting all Pis?

Sat Dec 21, 2019 1:32 pm

For my open source AirPlay mirroring receiver RPiPlay, I'm looking for a future-proof video and audio decoding and rendering API that
  • Uses the video decoding hardware
  • Supports different audio sinks such as HDMI/Analog/ALSA (DACs and USB audio interfaces)
  • Could easily be adapted to H.265 should a need arise with a new AirPlay version
  • (Possibly also works on generic desktop Linux systems using whatever hardware acceleration is available)
Currently, I'm using OpenMAX, which only ticks some of these boxes. The biggest problem is the lack of ALSA support. Some hacky ways of adding ALSA output to OpenMAX apparently exist (OMXPlayer demonstrates one), but I'd rather not invest time into developing a solution that only works on the Pi and uses an API that is known to stop working with 64-bit Raspbian distributions.

In numerous places, I found recommendations for FFmpeg, which generally seems promising: its abstraction means the same code works with various input formats and decoding hardware. However, I have completely failed to find enough information about its API, so I hope someone can answer the questions I have regarding FFmpeg/LibAV on the Raspberry Pi:

1) Can FFmpeg/LibAV be used not only for decoding, but also for rendering video and audio? Is there an API to set up a pipeline similar to what's possible with OpenMAX, where, after setup, I only need to feed H.264 frames into the pipeline and the decoded frames are automatically rendered to the screen?
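From what I've pieced together so far, the decode side would look roughly like the sketch below (untested, error handling omitted, helper names like decode_h264_packet are my own). My impression is that libavcodec only hands back decoded AVFrames and leaves rendering entirely to the caller, which is exactly why I'm asking question 1:

```c
// Untested sketch of libavcodec's send/receive decode loop (FFmpeg >= 3.1).
// Error handling omitted for brevity; rendering is left to the caller,
// since as far as I can tell libav* does not render anything itself.
#include <libavcodec/avcodec.h>

static AVCodecContext *ctx;

void init_decoder(void) {
    const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    ctx = avcodec_alloc_context3(codec);
    avcodec_open2(ctx, codec, NULL);
}

// Feed one H.264 access unit; collect any frames that become ready.
void decode_h264_packet(const uint8_t *data, int size, int64_t pts) {
    AVPacket *pkt = av_packet_alloc();
    pkt->data = (uint8_t *)data;
    pkt->size = size;
    pkt->pts  = pts;
    avcodec_send_packet(ctx, pkt);

    AVFrame *frame = av_frame_alloc();
    while (avcodec_receive_frame(ctx, frame) == 0) {
        // frame->data / frame->pts are valid here -- but they still need
        // to be handed to some renderer (KMS/DRM, SDL2, OpenGL, ...).
    }
    av_frame_free(&frame);
    av_packet_free(&pkt);
}
```

If this mental model is wrong, or if there is a higher-level pipeline API I've missed, corrections are very welcome.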

2) How does A/V sync work with FFmpeg/LibAV? It took me some time to figure out how to use OpenMAX's clock component for syncing the video and audio rendering (video clock as reference and all), and I'm hoping a similar facility exists in FFmpeg/LibAV.

3) Does anyone know of simple sample projects that demonstrate the concepts from 1 and 2 with FFmpeg/LibAV? Among the flood of command-line tutorials for FFmpeg/LibAV, the only material I found on the C API uses the library solely for decoding and renders with OpenMAX on the Pi.

I'd highly appreciate any hints!
