One day I decided to measure the video camera pipeline latency on my Raspberry Pi 3 with a Camera Module v1. Conditions: 1920x1080 at 30 FPS, H264. The research question was: how much time elapses between the moment the first line of the image starts exposing and the moment the encoded H264 packet emerges into userspace from the VideoCore?
To get decent precision I built a simple timer on an LED matrix, connected directly to the Raspberry Pi GPIOs. Given the 30 FPS target, it can display distinct 33 ms intervals modulo 200 ms. Here is a picture:
The timer thread updates the LEDs every 33 ms, right at the start of each interval. For example, enabled LEDs l_3 and l_1 mean it is 33–65 ms from the start of the current 200 ms period; LEDs l_4 and l_2 indicate 166–199 ms into the period. I'm aware l_3 and l_4 are redundant, but that small flaw doesn't really affect anything.
Now I point the camera at the LED matrix and start my measuring app, which collects each frame along with the time it was received. Later it decompresses the collected frames and writes them into JPEG files annotated with timing info. This is what I get in the end:
Here the LED matrix has the l_4 and l_0 diodes enabled, which means the frame started exposing at 100 ms and finished at 133 ms. The label at the bottom contains: the current local time in s.us format, the current time modulo 200000 us, and the time elapsed since the previous frame was received, in ms. This example shows the frame was received at 28.390588 s local time; 390588 % 200000 = 190588 us, i.e. the 190th ms (the second number is the receive time in microseconds). 190 − 100 = 90 ms of latency. Considering the jitter in inter-frame intervals, which can vary from 28 to 44 ms, the worst case would be well over 100 ms. This is rather disappointing. I was prepared for some lag, but 90 ms seems like way too much. Note this is not a matter of transmitting the data somewhere: it is latency within the Raspberry Pi itself. 90 ms is the cost of obtaining a compressed frame from the camera.
I use the MMAL API for the media pipeline. Most of the configuration code is similar to the example in raspivid.c: the camera buffers are in opaque format, and the camera output port is tunneled to the encoder input port. MMAL_PARAMETER_VIDEO_ENCODE_H264_LOW_LATENCY is configured, but I didn't notice any difference. I also tried MMAL_PARAMETER_MB_ROWS_PER_SLICE, but its only effect was that some VideoCore semaphores hung in a locked state forever (I am planning to create a separate topic for that issue). The OpenMAX API gave me similar results, though it doesn't support opaque buffers for some reason.
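For context, the relevant pieces of my setup look roughly like this (a fragment, not a complete program: error handling is omitted, and `camera`/`encoder` are assumed to be already-created components, raspivid-style):

```c
#include "interface/mmal/mmal.h"
#include "interface/mmal/util/mmal_connection.h"
#include "interface/mmal/util/mmal_util_params.h"

/* camera and encoder are assumed to be created and configured
 * beforehand, as in raspivid.c. */
void connect_and_tune(MMAL_COMPONENT_T *camera, MMAL_COMPONENT_T *encoder)
{
    MMAL_CONNECTION_T *conn = NULL;

    /* Tunnel the camera video port (output[1] in raspivid.c) straight
     * to the encoder input, so uncompressed buffers never surface in
     * userspace between the two components. */
    mmal_connection_create(&conn,
                           camera->output[1],
                           encoder->input[0],
                           MMAL_CONNECTION_FLAG_TUNNELLING |
                           MMAL_CONNECTION_FLAG_ALLOCATION_ON_INPUT);
    mmal_connection_enable(conn);

    /* Request low-latency encoding; in my tests this made no
     * observable difference. */
    mmal_port_parameter_set_boolean(encoder->output[0],
                                    MMAL_PARAMETER_VIDEO_ENCODE_H264_LOW_LATENCY,
                                    MMAL_TRUE);
}
```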
Now to the questions. Is it even possible to achieve reasonable latency on the Raspberry Pi 3, which I believe should be no more than 50–60 ms? Which API allows the lowest latency: OpenMAX or MMAL? What is the right way to split uncompressed camera frames into slices, along with the corresponding compressed H264 packets (MMAL API)? Could there be a difference with the Camera Module v2?