alnaseh
Posts: 60
Joined: Thu Jun 23, 2016 5:12 am

open discussion on the camera latency

Mon Jun 26, 2017 2:04 pm

Hi,
I'm trying to stream a live feed from the Raspberry Pi camera over wifi using gstreamer. I ended up with the best pipeline I could get, at around 200 ms, which is considered big for some applications.

So I tried to split the troubleshooting process into phases:
- calculate the latency without gstreamer, displaying locally.
- calculate the latency with gstreamer, but displaying locally.
- calculate the latency with gstreamer over the network.

How do I calculate the latency? It is very simple: I have a stopwatch with millisecond accuracy and my camera facing this stopwatch, with the feed displayed on the screen. I take a snapshot containing both the real stopwatch and the screen showing the stopwatch, giving me two readings in the same snapshot, and calculate the difference, in the same style as in this image:
http://www.dronetrest.com/uploads/db529 ... 5eba12.jpg


Now the result is surprising to me for the first scenario: the latency is 120 ms with the following command:

Code: Select all

raspivid -vf -hf -t 0 -w 600 -h 450 -b 1200000 -fps 30 -pf baseline


This is really huge considering I'm just displaying it on the screen using only raspivid, without any other processing. Am I doing something wrong here? The display is over HDMI, on a Raspberry Pi 3 with the latest OS.

Appreciate any hints.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7008
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: open discussion on the camera latency

Mon Jun 26, 2017 4:21 pm

What exposure time are you running? What refresh rate on your HDMI monitor? Where in the image is your stopwatch?

The Pi camera is a rolling shutter camera. That means that lines are read out sequentially.
- At time N, line 1 will start exposing on the sensor.
- At time N+(exposure time) line 1 will start reading out. Exposure time is typically in the range 1-33ms depending on lighting; assume 33ms unless you have very good lighting.
- At time N+33ms+(frame readout time) the last line of the image will be available. Frame readout time depends on the mode. I'd guess you're using sensor mode 4, so either 1/40 or 1/48 second (25 or 20.83ms) for the V2 and V1 sensors respectively.
- Generally the image processing can keep up with the sensor in real time, so <5ms after the last line is available it will have completed the frame, i.e. at time N+33+25+5ms.
- The renderer then gets given the frame. It queues it for display on the next v-sync from your monitor. Assuming you're running at 60Hz, that could be up to 16.7ms, but take an average of 8.4ms: N+33+25+5+8.4ms.
- The image is sent out serially over the HDMI port. Transferring the whole frame will again take up to 16.7ms at 60Hz. Now the question becomes how your monitor handles the update. Does it wait until the whole frame is received before updating the entire panel? Almost certainly, otherwise you'd get rolling-shutter effects on the display. Therefore it does have to wait for the entire frame.
N+33+25+5+8.4+16.7ms = N+88.1ms.
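
As a sanity check on that arithmetic, here is a minimal C sketch that just adds up the stages above (the figures are the worst-case/average assumptions from this post, not measured values):

Code: Select all

#include <stdio.h>

/* Glass-to-glass latency budget for a rolling-shutter camera feeding a
 * 60Hz HDMI display, using the figures assumed above. All values in ms. */
int main(void)
{
   double exposure      = 33.0;   /* worst case in poor light (~1/30s) */
   double frame_readout = 25.0;   /* V2 sensor, mode 4: 1/40s */
   double isp_finish    =  5.0;   /* ISP completes soon after the last line */
   double vsync_wait    =  8.4;   /* average wait for the next 60Hz v-sync */
   double hdmi_transfer = 16.7;   /* one full frame period at 60Hz */

   double total = exposure + frame_readout + isp_finish
                + vsync_wait + hdmi_transfer;

   printf("estimated glass-to-glass latency: N+%.1f ms\n", total);
   return 0;
}

That prints an estimate of N+88.1 ms for the no-network, local-display case.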

The back end of that is irrelevant as you want to stream the data, so any rendering delays don't matter. The raw image processing latency has already been measured and studied - viewtopic.php?f=43&t=153410
Using "raspivid -w 1280 -h 960 -o /dev/null -n" I get a latency of 46-48ms. Switching to 1080P I get 80-81ms.
That is measuring the latency from the frame start interrupt (i.e. end of exposure for the first line) to the ARM getting the encoded buffer. That's probably a more useful number than the one you measured.
(In terms of my description above, that ignores the 33ms exposure time but includes the 25ms frame readout, 5ms to finish processing the frame, and then 16-18ms to encode 1280x960.)


A read of http://picamera.readthedocs.io/en/lates ... -operation wouldn't go amiss to get these sensor concepts clear in your mind.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

alnaseh
Posts: 60
Joined: Thu Jun 23, 2016 5:12 am

Re: open discussion on the camera latency

Mon Jun 26, 2017 4:58 pm

Thank you for your reply. There are a lot of details in it that I need to *read* about, as I'm not an expert on this.

Just to reply to your queries:
- Camera V2
- HDMI details:

Code: Select all

pi@raspberrypi:~ $ /opt/vc/bin/tvservice -s
state 0x12000a [HDMI DMT (85) RGB full 16:9], 1280x720 @ 60.00Hz, progressive
- You can find the screenshot here:
https://drive.google.com/open?id=0B_RQO ... Nkd2t6cGEw

From your explanation, it seems the latency I'm getting is not too far from what should be expected. OK, it seems 20ms is out of reach here.

Mauronic
Posts: 3
Joined: Thu May 02, 2019 10:46 pm

Re: open discussion on the camera latency

Tue Jun 18, 2019 3:52 pm

I have been talking to several engineers to help me design a very low latency (120 - 140ms) wireless video pipeline for a robotics project.

As a first step I ran benchmarks on an RPi. Without trying too hard, you can get ~200ms glass-to-glass.

6by9, your post was very helpful and gave me some optimization ideas to think about.

It looks like the exposure and readout take a significant amount of time. I am not 100% sure why.

Is there a camera / interface option that will allow immediate readout so we get each line as scanned without waiting for the exposure to finish? Or is this how it works now?

It was suggested to me to consider a very fast (and inefficient) encoding like MJPEG to get the data over WiFi ASAP. In this case I would budget a max bandwidth of 4-10MBps.
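
To put that budget in per-frame terms, here is my rough arithmetic (assuming 30fps; the bandwidth figures are just the budget above, not measurements):

Code: Select all

#include <stdio.h>

/* Per-frame size budget for MJPEG over WiFi, assuming 30fps and the
 * 4-10 megabyte/s bandwidth budget above (assumed, not measured). */
int main(void)
{
   double fps = 30.0;
   double low_bytes_per_s  =  4.0e6;
   double high_bytes_per_s = 10.0e6;

   printf("per-frame budget: %.0f - %.0f KB\n",
          low_bytes_per_s / fps / 1000.0,
          high_bytes_per_s / fps / 1000.0);
   return 0;
}

So each MJPEG frame could be roughly 130-330KB.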

I did some research on this and found a company called Arducam that sells standalone camera modules and various camera boards. Could this hardware be leveraged in some way?

HermannSW
Posts: 1308
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: open discussion on the camera latency

Tue Jun 18, 2019 4:14 pm

If you try to build a gstreamer streaming solution, take the jitterbuffer into account:
viewtopic.php?f=43&t=204921&p=1273950#p1273950

This project allows for <0.1s latency:
https://github.com/131/h264-live-player

I use uv4l a lot; you can stream the camera as well as the HDMI output. These days that is my standard usage: I took videos through a car window with a laptop and a Pi3B (powered with 2.1A from a cigarette-lighter USB connector) connected wirelessly to an Android smartphone access point, and the uv4l display server let me see the HDMI video preview window on the laptop. I have not measured it, but uv4l streaming the camera feels like it has much less latency than the <0.1s h264-in-the-browser solution. If uv4l being closed source is no problem for you, you should measure its wireless latency.
⇨https://stamm-wilbrandt.de/en/Raspberry_camera.html

https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://gitlab.freedesktop.org/HermannSW/gst-template
https://github.com/Hermann-SW/fork-raspiraw
https://twitter.com/HermannSW

Mauronic
Posts: 3
Joined: Thu May 02, 2019 10:46 pm

Re: open discussion on the camera latency

Tue Jun 18, 2019 5:16 pm

HermannSW wrote:
Tue Jun 18, 2019 4:14 pm

This project allows for <0.1s latency:
https://github.com/131/h264-live-player
Hi HermannSW,

How is that being measured? I didn't see any claims or tests on the project site; it would be great to get more info.

It seems hard to believe because, according to the post above, up to 61ms of latency is incurred before the video is even encoded, transmitted, decoded and displayed.

According to this, h.264 encoding could take up to 40ms:

viewtopic.php?t=209706

pete

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7008
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: open discussion on the camera latency

Tue Jun 18, 2019 5:22 pm

Please read the docs on how the camera subsystem and rolling shutter sensors work - https://picamera.readthedocs.io/en/latest/fov.html
Mauronic wrote:
Tue Jun 18, 2019 3:52 pm
It looks like the exposure and readout take a significant amount of time. I am not 100% sure why.
Physics.
You have to take some time to sample the photons received to build an image, and then transfer that data to the SoC.

If you have LOTS of light, then exposure time can be down at a few milliseconds.
Sampling the sensor array is then dictated by the speed of the ADC in the sensor and the rate at which the data is sent over the CSI2 link, which depends on the sensor's configured link frequency and the number of pixels to send. Frame rate is controlled by adding extra dummy lines to the end of the frame.
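
To illustrate that last point, here is a minimal C sketch of the timing (the line time and line counts are assumptions picked to roughly match the V2 mode-4 figures earlier in the thread, not real sensor register values):

Code: Select all

#include <stdio.h>

/* Illustrative rolling-shutter frame timing. All numbers are assumed
 * for the example, not actual sensor register values. */
int main(void)
{
   double line_time_us = 18.9;    /* time to read out one line, microseconds */
   int    active_lines = 1232;    /* lines carrying image data */
   int    dummy_lines  = 90;      /* blank padding lines per frame */

   double readout_ms = active_lines * line_time_us / 1000.0;
   double frame_ms   = (active_lines + dummy_lines) * line_time_us / 1000.0;

   printf("active readout: %.1f ms\n", readout_ms);
   printf("frame time:     %.1f ms (%.1f fps)\n", frame_ms, 1000.0 / frame_ms);
   /* More dummy lines stretch frame_ms (lower fps) while the readout
    * latency of the active lines stays the same. */
   return 0;
}

Adding or removing dummy lines changes the frame rate without touching the readout latency of the image itself.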
Mauronic wrote:Is there a camera / interface option that will allow immediate readout so we get each line as scanned without waiting for the exposure to finish? Or is this how it works now?
Physically impossible. How can you read out a line that isn't exposed? It'll be black.
Mauronic wrote:It was suggested to me to consider a very fast (and inefficient) encoding like MJPEG to get the data over WiFi ASAP. In this case I would budget a max bandwidth of 4 - 10MBps.
The recommendation for 1080P would be at least 10Mbit/s for an efficient codec like H264. Even at ~VGA you really still want >1Mbit/s with H264. The latency difference between MJPEG and H264 really isn't huge.

Clone userland and apply this diff to RaspiVid.c:

Code: Select all

diff --git a/host_applications/linux/apps/raspicam/RaspiVid.c b/host_applications/linux/apps/raspicam/RaspiVid.c
index 2a9e586..7dac347 100644
--- a/host_applications/linux/apps/raspicam/RaspiVid.c
+++ b/host_applications/linux/apps/raspicam/RaspiVid.c
@@ -1207,6 +1207,7 @@ static void encoder_buffer_callback(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buf
    MMAL_BUFFER_HEADER_T *new_buffer;
    static int64_t base_time =  -1;
    static int64_t last_second = -1;
+   int64_t stc;
 
    // All our segment times based on the receipt of the first encoder callback
    if (base_time == -1)
@@ -1221,6 +1222,9 @@ static void encoder_buffer_callback(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buf
       int bytes_written = buffer->length;
       int64_t current_time = get_microseconds64()/1000;
 
+      mmal_port_parameter_get_int64(port, MMAL_PARAMETER_SYSTEM_TIME, &stc);
+      vcos_log_error("Latency is %lld usecs", stc-buffer->pts);
+
       vcos_assert(pData->file_handle);
       if(pData->pstate->inlineMotionVectors) vcos_assert(pData->imv_file_handle);
 
Rebuild userland, and raspivid will print out the time difference between the first pixel of the frame being received and the encoded output buffer being produced.
Running it at 640x480, 1Mbit/s, by default both H264 and MJPEG are giving 36-38ms latency.
Increase the frame rate to 41 so that the high-framerate sensor mode is selected, and H264 gives 15-16ms vs 14-15ms for MJPEG. Losing a millisecond for the gain in compression efficiency is almost certainly worth it.
Mauronic wrote:I did some research on this and found a company called Arducam that sells standalone camera modules and various camera boards. Could this hardware be leveraged in some way?
No, they're still rolling shutter sensors. From a system architecture perspective they are nearly identical to how the Pi camera works.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

HermannSW
Posts: 1308
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany
Contact: Website Twitter YouTube

Re: open discussion on the camera latency

Tue Jun 18, 2019 8:42 pm

6by9 wrote:
Tue Jun 18, 2019 5:22 pm
If you have LOTS of light, then exposure time can be down at a few milliseconds.
Off topic, but if you really have lots of light, then exposure time can be down to a few microseconds.
The frame below is lit with 2×5000lm LEDs, with multiple exposures of 8.33µs strobe pulse length at 9kHz PWM frequency.
Each exposure captures the airgun pellet, flying at 96.9m/s, at a different position, so it is a 9000eps (exposures per second) frame.
But in the end the v1 camera sensor needs to transfer the frame to the Pi, and that takes the normal, huge transfer time:
https://github.com/Hermann-SW/Raspberry ... nt-9000eps
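
Rough arithmetic on those numbers (my own check in C, using only the 9kHz/8.33µs/96.9m/s figures above):

Code: Select all

#include <stdio.h>

/* Spacing and motion blur of the multiple-exposure pellet captures:
 * 9kHz strobe, 8.33us pulses, pellet at 96.9 m/s. */
int main(void)
{
   double pwm_hz    = 9000.0;
   double pulse_s   = 8.33e-6;
   double speed_mps = 96.9;

   double period_s = 1.0 / pwm_hz;               /* time between exposures */
   double gap_mm   = speed_mps * period_s * 1e3; /* travel between exposures */
   double blur_mm  = speed_mps * pulse_s * 1e3;  /* travel during one pulse */

   printf("exposure period:           %.0f us\n", period_s * 1e6);
   printf("gap between pellet images: %.1f mm\n", gap_mm);
   printf("motion blur per exposure:  %.2f mm\n", blur_mm);
   return 0;
}

So the pellet images are about 10.8mm apart, with under 1mm of motion blur each.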
⇨https://stamm-wilbrandt.de/en/Raspberry_camera.html

https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://gitlab.freedesktop.org/HermannSW/gst-template
https://github.com/Hermann-SW/fork-raspiraw
https://twitter.com/HermannSW
