I'm trying to synchronise OpenMAX audio and video playback for a protocol that only has occasional audio data (the AirPlay mirroring protocol, to be specific). The protocol has a video stream that delivers video data packets periodically, but it only sends audio data when audio is actually playing on the device.
Thus, I'm trying to use the video clock as the reference clock for the audio renderer. I've tried the obvious solution of setting OMX_IndexConfigTimeActiveRefClock with an eClock of OMX_TIME_RefClockVideo, but that doesn't seem to work. The video still only starts to display once audio playback begins, even though my program has been feeding in video buffers beforehand. As soon as audio data is supplied, I see the renderer fast-forward through all the frames I fed in until it catches up with real time.
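For reference, this is roughly how I'm configuring the clock component (set_video_ref_clock is just a helper I wrote, and clock_handle is the OMX_HANDLETYPE I got for OMX.broadcom.clock; error handling is trimmed down):

[code]
#include <stdio.h>
#include <string.h>
#include <IL/OMX_Core.h>
#include <IL/OMX_Other.h>

/* Try to make the video clock the active reference on the clock component. */
static OMX_ERRORTYPE set_video_ref_clock(OMX_HANDLETYPE clock_handle)
{
    OMX_TIME_CONFIG_ACTIVEREFCLOCKTYPE ref_clock;

    memset(&ref_clock, 0, sizeof(ref_clock));
    ref_clock.nSize = sizeof(ref_clock);
    ref_clock.nVersion.nVersion = OMX_VERSION;
    ref_clock.eClock = OMX_TIME_RefClockVideo;  /* video, not audio, as the reference */

    OMX_ERRORTYPE err = OMX_SetConfig(clock_handle,
                                      OMX_IndexConfigTimeActiveRefClock,
                                      &ref_clock);
    if (err != OMX_ErrorNone)
        fprintf(stderr, "OMX_SetConfig(ActiveRefClock) failed: 0x%08x\n",
                (unsigned int)err);
    return err;
}
[/code]

The call returns OMX_ErrorNone, but the behaviour described above doesn't change.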
Also, I found a post here on the forum that seems to suggest the Raspberry Pi's OpenMAX audio renderer implementation only supports acting as the clock master: https://www.raspberrypi.org/forums/view ... 9#p1453426. If that is really the case, I'm wondering how a program is supposed to use audio as the reference clock when audio data is only available occasionally?
Any help is highly appreciated!