Realizator

Re: Stereoscopic camera capture (for COMPUTE MODULE) - now implemented (2014).

Wed Mar 06, 2019 11:20 am

6by9 wrote:
Wed Mar 06, 2019 11:01 am
...
It's a fudge rather than a fix. It just removes all the signalling so video_render has no information to go on for stereoscopic, and hence interprets it as a single plane (quite probably squashed by decimation).
...
I'll see if there is an easy way to check whether the display is actually stereoscopic from within video_render. Switching to I420 has a fairly significant performance hit on the GPU.
That's why I decided to ask you. ;)

pculverhouse

Re: Stereoscopic camera capture (for COMPUTE MODULE) - now implemented (2014).

Thu Jun 27, 2019 9:59 am

Hi,

I am trying to do something non-standard with stereo images from the dual camera Compute Module board using a Pi3.

I want to capture a stereo pair (at reduced resolution - probably 640x480), and pass the memory buffer to my built-in IP socket, so that I can send stereo pairs on demand to a host connected to the other end of the IP socket.

So I don't want to use GL rendering, and I don't want to pass a file handle to MMAL to save the pair to a file.

I am struggling with raspistill.c to hack out the bits I need to do the above, but I come completely unstuck when it comes to the MMAL interface.

Any pointers most gratefully received.

Regards Phil

6by9
Raspberry Pi Engineer & Forum Moderator

Re: Stereoscopic camera capture (for COMPUTE MODULE) - now implemented (2014).

Thu Jun 27, 2019 11:12 am

You'd have been better off starting a new thread, but never mind.
pculverhouse wrote:
Thu Jun 27, 2019 9:59 am
I am trying to do something non-standard with stereo images from the dual camera Compute Module board using a Pi3.

I want to capture a stereo pair (at reduced resolution - probably 640x480), and pass the memory buffer to my built-in IP socket, so that I can send stereo pairs on demand to a host connected to the other end of the IP socket.
Built in to what?
pculverhouse wrote:So I don't want to use GL rendering, and I don't want to pass a file handle to MMAL to save the pair to a file.

I am struggling with raspistill.c to hack out the bits I need to do the above, but I come completely unstuck when it comes to the MMAL interface.
raspistill takes still images and encodes them to JPEG (or similar), but it sounds like you want a stream.
raspivid would encode to H264 or MJPEG.
raspividyuv will deliver you the raw YUV or RGB frames.

Two options:
- Modify the source code of raspividyuv to write to your socket. camera_buffer_callback is where you get the buffer, with buffer->data being the pointer to buffer->length bytes of image data (see the sketch below).
- Use "raspividyuv -o -" and it writes to stdout. Pipe stdout into whatever program you're using to control your socket.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

pculverhouse

Re: Stereoscopic camera capture (for COMPUTE MODULE) - now implemented (2014).

Fri Jun 28, 2019 7:42 am

Thanks.

The machine is the Plymouth robot OWL (see https://web-dr.tis.plymouth.ac.uk/resea ... ymouth-owl). Currently we use the Linuxprojects HTTP IP server to stream MJPEG-encoded video to an OpenCV application sitting on the host. But MJPEG streaming gives us no buffer flow control, so when the host is taking its time to process a stereo pair (a disparity calculation, for example) the MJPEG buffer continues to fill. One workaround is to empty the buffer in the processing loop, but that introduces a delay between sampling the video stream and when the OWL originally took the images - the OWL is an agile camera system, so the loop is: grab a stereo pair, process it, move the cameras.

So a solution is to take images on demand over our own IP socket connection to the host: the host issues a grab command, and the Pi3 sends back one stereo pair. I think I will leave the camera stream open but only issue grabs when I need them.
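For the Pi side of that I'm picturing something like the sketch below (the command byte and all the names are just made up for illustration; a mutex or atomic would be more correct than a volatile flag, but it shows the idea):

Code: Select all

   /* Sketch of the command side: block on the control socket and raise a
      flag whenever the host asks for a pair. The camera callback would
      check (and clear) grab_requested. */
   #include <unistd.h>

   volatile int grab_requested = 0;

   static void command_loop(int ctrl_fd)
   {
      char cmd;

      while (read(ctrl_fd, &cmd, 1) == 1)
      {
         if (cmd == 'G')             /* host asks for one stereo pair */
            grab_requested = 1;
      }
   }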

I will try the camera_buffer_callback method, but you suggest using raspistillyuv.c, so I'll have to understand how to set that up for stereo. I did a quick search of the source and could not find any options.

thanks!

Phil

6by9
Raspberry Pi Engineer & Forum Moderator

Re: Stereoscopic camera capture (for COMPUTE MODULE) - now implemented (2014).

Fri Jun 28, 2019 9:22 am

pculverhouse wrote:
Fri Jun 28, 2019 7:42 am
The machine is the Plymouth robot OWL (see https://web-dr.tis.plymouth.ac.uk/resea ... ymouth-owl). Currently we use the Linuxprojects HTTP IP server to stream MJPEG-encoded video to an OpenCV application sitting on the host. But MJPEG streaming gives us no buffer flow control, so when the host is taking its time to process a stereo pair (a disparity calculation, for example) the MJPEG buffer continues to fill. One workaround is to empty the buffer in the processing loop, but that introduces a delay between sampling the video stream and when the OWL originally took the images - the OWL is an agile camera system, so the loop is: grab a stereo pair, process it, move the cameras.
Do you need the MJPEG encoding to reduce the network bandwidth?
MJPEG has no inter-frame dependencies, so grab the frames and drop the ones you don't want.
pculverhouse wrote:I will try the camera_buffer_callback method, but you suggest using raspistillyuv.c, so I'll have to understand how to set that up for stereo. I did a quick search of the source and could not find any options.
raspiVIDyuv. raspistill and raspistillyuv will take still images at the specified time, but that involves mode switches and other things that take time.

If you use raspivid/raspividyuv, the camera will be running continuously delivering frames. Make the choice in your application as to whether you wish to forward that frame on to your network socket or not.
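i.e. something like the sketch below, called from camera_buffer_callback before you hand the buffer back. It's untested; grab_requested and send_frame stand in for whatever flag your command handler sets and whatever routine writes to your socket.

Code: Select all

   /* Sketch: forward a frame only when the host has asked for one,
      otherwise just drop it. */
   #include "interface/mmal/mmal.h"

   extern volatile int grab_requested;                    /* set by your command handler */
   extern void send_frame(MMAL_BUFFER_HEADER_T *buffer);  /* writes buffer->data/length to the socket */

   static void forward_if_requested(MMAL_BUFFER_HEADER_T *buffer)
   {
      if (buffer->length && grab_requested)
      {
         send_frame(buffer);
         grab_requested = 0;          /* one frame per grab request */
      }
   }
With -3d sbs the pair arrives as a single side-by-side frame, so one forwarded buffer is one stereo pair.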

Stereoscopic support for all 4 raspicam apps should be identical - it's in RaspiCamControl.c rather than the individual apps.
Then again, I think the setup lines below may be missing from create_camera_component:

Code: Select all

   status = raspicamcontrol_set_stereo_mode(camera->output[0], &state->camera_parameters.stereo_mode);
   status += raspicamcontrol_set_stereo_mode(camera->output[1], &state->camera_parameters.stereo_mode);
   status += raspicamcontrol_set_stereo_mode(camera->output[2], &state->camera_parameters.stereo_mode);

   if (status != MMAL_SUCCESS)
   {
      vcos_log_error("Could not set stereo mode : error %d", status);
      goto error;
   }
They are present in RaspiVid.c.

pculverhouse

Re: Stereoscopic camera capture (for COMPUTE MODULE) - now implemented (2014).

Fri Jun 28, 2019 1:31 pm

Thanks for that insight into the processes.

All I know is that we experience a lag, which we think is due to the MJPEG encoding and the streaming by the Apache web server. Our thinking is that we can remove some of those latencies by doing it ourselves. The advantage of synchronous capture and use on the host is that we know absolutely that the stereo pair was captured shortly before we start processing the images. The problem is that we are using OpenCV, and its camera interface does not give us an MJPEG buffer flush, so we have to loop to remove all the frames in the buffer. I suspect the reason is that MJPEG streaming is designed to provide a resilient video interface despite variable network delays. That is not what we need.

Some thinking is required. I'll add those missing setup lines.

Thanks for your support.

Phil

Realizator

Re: Stereoscopic camera capture (for COMPUTE MODULE) - now implemented (2014).

Tue Sep 10, 2019 6:27 am

@6by9, it looks like stereoscopic support has been broken by the latest kernel update.
If I take the Buster image (Raspbian Buster with desktop, 2019-07-10, kernel 4.19.57), both raspistill and raspivid work with the -3d option. I mean that after running these commands I have one stereoscopic image and one stereoscopic video with the appropriate names:

Code: Select all

raspistill -3d sbs -o 1.jpg
raspivid -3d sbs -w 1280 -h 720 -o 1.h264
Before all these tests I just put our dt-blob.bin in /boot and enabled the camera in raspi-config.

If I do a system upgrade with piwiz (and the kernel is updated to 4.19.66), both raspistill and raspivid stop working. Both have the same behaviour in this case:
- the preview starts
- the preview image freezes
- adding the '-n' (no preview) option has no effect
- no image or video is saved
- the system cannot be shut down correctly (sudo poweroff), and the frozen preview image stays on the screen

I've also tested stock Buster with rpi-update (the updated kernel is 4.19.71), but no luck there either.
Could you please check this issue?
