I have a problem regarding the synchronized capture of images from the PiCam Module.
Goal:
Capture still images (single shot) and trigger each frame separately.
Synchronize each frame capture with a GPIO output signal (LED).
Time offsets between frames should be below one millisecond.
Testing setup:
Raspberry Pi B+ 1.2, wiringPi lib,
V4L2 (http://www.jayrambhia.com/blog/capture-v4l2/),
RaspiCam C++ API (http://www.uco.es/investiga/grupos/ava/node/40),
raspberrypi/userland MMAL libs (example code from RaspiStill), OpenCV
4 LEDs are connected to GPIOs, set as Outputs, triggered one after another.
fixed parameters: resolution 640x480, RGB (CV_8UC3)
varied parameters: shutter time, LED delay (the delay between LED_ON and frame capture, and between LED_OFF and the next LED_ON),
retrieve delay (the delay between frame capture and reading the frame from the frame buffer; depends on the implementation)
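The core of my test loop looks roughly like this (pin numbers and delays are placeholders, and I am using the raspicam grab/retrieve interface with wiringPi here; treat it as a sketch of the setup, not verified line by line):

```cpp
#include <wiringPi.h>
#include <raspicam/raspicam_cv.h>
#include <opencv2/opencv.hpp>

int main() {
    const int ledPins[4]      = {0, 1, 2, 3};  // wiringPi numbering, adjust to your wiring
    const int ledDelayMs      = 50;            // delay between LED_ON and frame capture
    const int retrieveDelayMs = 10;            // delay between capture and frame readout

    wiringPiSetup();
    for (int pin : ledPins) pinMode(pin, OUTPUT);

    raspicam::RaspiCam_Cv cam;
    cam.set(CV_CAP_PROP_FRAME_WIDTH, 640);
    cam.set(CV_CAP_PROP_FRAME_HEIGHT, 480);
    cam.set(CV_CAP_PROP_FORMAT, CV_8UC3);      // RGB, as in my fixed parameters
    if (!cam.open()) return 1;

    for (int i = 0; i < 4; ++i) {
        digitalWrite(ledPins[i], HIGH);        // trigger signal: LED i on
        delay(ledDelayMs);
        cam.grab();                            // supposed to capture "now"...
        delay(retrieveDelayMs);
        cv::Mat frame;
        cam.retrieve(frame);                   // ...but actually reads from the stream
        digitalWrite(ledPins[i], LOW);
        cv::imwrite("led_" + std::to_string(i) + ".png", frame);
    }
    cam.release();
    return 0;
}
```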
What I expected:
The frames are triggered shortly after the GPIO output is set; each frame shows exactly one LED at maximum brightness.
What I noticed:
The frames are not synchronized. The LED delay matters: a higher LED delay gives a higher chance of getting a frame with only one LED lit.
This of course partly depends on my setup. However, some images show multiple LEDs lit at the same time, which calls the blocking behaviour of some function calls into question
(frame capture and readout overlap, so two frames are combined into one image => "magic light", i.e. a lit surface with all LEDs off).
Also, changing the retrieve delay while keeping all other parameters constant results in different frames (LEDs in different states, or completely off).
Summary:
So, basically, V4L2, RaspiCam_Still_Cv and raspistill all access the PiCam in video streaming mode.
So far I have found no implementation that really triggers just a single still frame.
Instead, a more or less arbitrary frame is retrieved from a video stream as the "still" image
(otherwise the retrieve delay, or reading the same frame multiple times from the camera frame buffer, would have no effect).
I looked into the MMAL implementation of the raspistill demo app.
(I also tried to extract and rewrite some of its code, but got an ENOSPC error when enabling the MMAL default camera component.)
There is a capture port defined for the default camera component; however, even this capture port (as opposed to the video output port) seems to take just one frame out of an internal video stream.
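For reference, this is roughly the MMAL fragment I ended up with, pieced together from the raspistill source (format setup, buffer pool and callback plumbing omitted; not verified end to end):

```cpp
#include "interface/mmal/mmal.h"
#include "interface/mmal/util/mmal_default_components.h"
#include "interface/mmal/util/mmal_util_params.h"

int main() {
    MMAL_COMPONENT_T *camera = NULL;
    if (mmal_component_create(MMAL_COMPONENT_DEFAULT_CAMERA, &camera) != MMAL_SUCCESS)
        return 1;

    // camera output ports: 0 = preview, 1 = video, 2 = stills/capture
    MMAL_PORT_T *still = camera->output[2];

    // ... set still->format, mmal_port_format_commit(still),
    // create a buffer pool and enable the port with a callback ...

    if (mmal_component_enable(camera) != MMAL_SUCCESS)
        return 1;  // this is where I get ENOSPC; reportedly this can mean the
                   // camera is busy/disabled or gpu_mem is too low (unverified)

    // the actual "trigger": starts capture of one still frame -- but the
    // sensor apparently keeps free-running underneath
    mmal_port_parameter_set_boolean(still, MMAL_PARAMETER_CAPTURE, 1);

    mmal_component_destroy(camera);
    return 0;
}
```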
By varying the parameters I eventually found values that almost synchronize the frames.
However, these values depend on the resolution and other factors, so they are likely to break as soon as other settings are used.
Any idea how to actually trigger only one distinct frame with the PiCam?
Is it even possible?
I imagine this could be interesting for various applications.
If someone has managed to do this, please add a small code example.
Thanks for your help.
