6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7274
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Wed Jul 25, 2018 10:34 am

@luiscgalo: branching the conversation off to a new thread, as we've now got three interleaved conversations going on. I'll respond there.
viewtopic.php?f=43&t=218928
(I'll be doing the same thing with grimepoch's thread in a moment).
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7274
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Wed Jul 25, 2018 10:56 am

@grimepoch: As with luiscgalo, branching the conversation off to a new thread.
viewtopic.php?f=43&t=218933
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7274
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Wed Jul 25, 2018 11:57 am

General comment: if you're looking at interfacing a new chip/sensor/FPGA/thingamabob, please start a new thread and post a link to it on this thread.
This thread is already 20 pages long, and is likely to grow in a fairly uncontrolled manner if people try adding all sorts of random support requests on it.
Thank you.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

Edeard95
Posts: 3
Joined: Tue Jul 17, 2018 4:07 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Jul 30, 2018 4:37 pm

Question relating to the process of adding sensor compatibility:

viewtopic.php?f=43&t=219263

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7274
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Aug 20, 2018 5:07 pm

There appears to be a further issue with rawcam in the latest firmwares (since the July 30th release) where it is even more prone to getting buffers out of step. I am investigating.

If people want to play, then I have done a significant hack around with raspiraw to optionally run a very simple grey world AWB algorithm. See https://github.com/6by9/raspiraw/tree/temp_awb. It could do with some rough edges being cleaned up (almost no error handling!), but I'm tempted to merge it back into master anyway.
That branch also includes a fix for IMX219 having H & V flips swapped. I'm slightly surprised that hadn't been picked up before (or apologies if it had been reported and I totally missed it).
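For anyone wondering what "grey world" means: assume the scene averages out to grey, and pick red/blue gains that pull those channel means into line with the green mean. A minimal sketch of the idea on plain interleaved 8-bit RGB data (an illustration only - not the code from the branch, which works on the Bayer data):

Code: Select all

/* Grey world AWB sketch on interleaved 8-bit RGB data (illustration only). */
#include <stddef.h>
#include <stdint.h>

void grey_world_awb(uint8_t *rgb, size_t num_pixels)
{
   uint64_t sum_r = 0, sum_g = 0, sum_b = 0;
   size_t i;

   /* Average each channel over the whole frame */
   for (i = 0; i < num_pixels; i++) {
      sum_r += rgb[3*i + 0];
      sum_g += rgb[3*i + 1];
      sum_b += rgb[3*i + 2];
   }
   if (!sum_r || !sum_b)
      return;

   /* Grey world assumption: scale R and B so their means match the G mean */
   double gain_r = (double)sum_g / sum_r;
   double gain_b = (double)sum_g / sum_b;

   for (i = 0; i < num_pixels; i++) {
      double r = rgb[3*i + 0] * gain_r;
      double b = rgb[3*i + 2] * gain_b;
      rgb[3*i + 0] = r > 255.0 ? 255 : (uint8_t)r;
      rgb[3*i + 2] = b > 255.0 ? 255 : (uint8_t)b;
   }
}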
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

VinoE
Posts: 2
Joined: Tue Aug 21, 2018 12:08 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Thu Aug 23, 2018 10:04 am

Hello,

I'm just starting out in the Raspberry Pi world and I would like some help with a project related to this thread's work. I have a camera which sends its data (panchromatic or Bayer filtered) via 1 LVDS channel at 120MHz; it also has a byte strobe differential signal and the clock.

I have a Raspberry Pi Zero and I don't know if I can use its camera connector and CSI-2, or if I should use the GPIOs. In either case, besides the LVDS to sub-LVDS or 3V3 CMOS conversion, what other aspects should I consider to achieve the speed and communication I want?

Thank you for your help,

Eddy

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7274
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Thu Aug 23, 2018 10:16 am

VinoE wrote:
Thu Aug 23, 2018 10:04 am
I'm just starting out in the Raspberry Pi world and I would like some help with a project related to this thread's work. I have a camera which sends its data (panchromatic or Bayer filtered) via 1 LVDS channel at 120MHz; it also has a byte strobe differential signal and the clock.

I have a Raspberry Pi Zero and I don't know if I can use its camera connector and CSI-2, or if I should use the GPIOs. In either case, besides the LVDS to sub-LVDS or 3V3 CMOS conversion, what other aspects should I consider to achieve the speed and communication I want?
Sorry, the Pi doesn't support LVDS, and GPIOs will be too slow.

There are bridge chips around that will convert from various interfaces to MIPI CSI2. A quick Google would provide you with hits such as
http://www.latticesemi.com/Products/Des ... ridge.aspx (FPGA solution, and Sony sub-LVDS to CSI2)
or https://e2e.ti.com/support/interface/hi ... 8/t/614881 suggesting the TI DS90UH947-Q1 feeding a DS90UH940-Q1,
or http://www.lontiumsemi.com/uploadfiles/ ... _Brief.pdf

You'll need to do your own research to find a conversion solution that matches your use case.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7274
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Thu Aug 23, 2018 10:46 am

For those seeing issues with rawcam, I've put a new firmware at https://drive.google.com/file/d/1pfj5OY ... sp=sharing that I believe fixes the problem. Please test and report back. It should get merged into the main firmware releases in the next few days anyway.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

VinoE
Posts: 2
Joined: Tue Aug 21, 2018 12:08 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Thu Aug 23, 2018 12:10 pm

6by9 wrote:
Thu Aug 23, 2018 10:16 am
VinoE wrote:
Thu Aug 23, 2018 10:04 am
I'm just starting out in the Raspberry Pi world and I would like some help with a project related to this thread's work. I have a camera which sends its data (panchromatic or Bayer filtered) via 1 LVDS channel at 120MHz; it also has a byte strobe differential signal and the clock.

I have a Raspberry Pi Zero and I don't know if I can use its camera connector and CSI-2, or if I should use the GPIOs. In either case, besides the LVDS to sub-LVDS or 3V3 CMOS conversion, what other aspects should I consider to achieve the speed and communication I want?
Sorry, the Pi doesn't support LVDS, and GPIOs will be too slow.

There are bridge chips around that will convert from various interfaces to MIPI CSI2. A quick Google would provide you with hits such as
http://www.latticesemi.com/Products/Des ... ridge.aspx (FPGA solution, and Sony sub-LVDS to CSI2)
or https://e2e.ti.com/support/interface/hi ... 8/t/614881 suggesting the TI DS90UH947-Q1 feeding a DS90UH940-Q1,
or http://www.lontiumsemi.com/uploadfiles/ ... _Brief.pdf

You'll need to do your own research to find a conversion solution that matches your use case.
Thanks for your reply 6by9. One last question: after the conversion to CSI-2, what do I need to do in order to access this port on the Raspberry Pi? As far as I've read, this port only supports the Raspberry Pi Camera Module.

Thank you,

Eddy

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7274
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Thu Aug 23, 2018 12:21 pm

VinoE wrote:
Thu Aug 23, 2018 12:10 pm
Thanks for your reply 6by9. One last question: after the conversion to CSI-2, what do I need to do in order to access this port on the Raspberry Pi? As far as I've read, this port only supports the Raspberry Pi Camera Module.
You have two choices, both of them requiring some effort from you.

First is that there is a V4L2 driver for the CSI2 peripheral. You need to connect it to a sensor driver, and it then delivers frames back via V4L2.
This is the preferred route.

Second, there is the firmware-based driver being discussed on this thread - an MMAL component called rawcam.
raspiraw is an example app that uses that component. All setup of the sensor is down to the application. raspiraw has example register commands to configure either version of the Pi camera (and an analogue video chip). You need to add the relevant config to set up your sensor.
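For orientation, the bare bones of a rawcam client look roughly like this - an untested sketch following the pattern raspiraw uses, with the CSI-2 receiver config parameters, the buffer pool and all of the sensor I2C work omitted:

Code: Select all

/* Untested sketch of bringing up the rawcam component, after the pattern
 * raspiraw uses. Build against the userland headers, e.g.:
 *   gcc rawcam_sketch.c -I/opt/vc/include -L/opt/vc/lib \
 *       -lmmal_core -lmmal_util -lmmal_vc_client -lbcm_host -lvcos
 */
#include <stdio.h>
#include "bcm_host.h"
#include "interface/mmal/mmal.h"
#include "interface/mmal/util/mmal_util.h"

/* Called by MMAL with each buffer the CSI-2 receiver fills */
static void output_cb(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buf)
{
   fprintf(stderr, "frame: %u bytes\n", buf->length);
   buf->length = 0;
   mmal_port_send_buffer(port, buf);  /* recycle the buffer */
}

int main(void)
{
   MMAL_COMPONENT_T *rawcam;
   MMAL_PORT_T *out;

   bcm_host_init();
   if (mmal_component_create("vc.ril.rawcam", &rawcam) != MMAL_SUCCESS)
      return 1;
   out = rawcam->output[0];

   /* Describe what the sensor will send, e.g. 640x480 10-bit packed Bayer */
   out->format->encoding = MMAL_ENCODING_BAYER_SBGGR10P;
   out->format->es->video.width = 640;
   out->format->es->video.height = 480;
   out->format->es->video.crop.width = 640;
   out->format->es->video.crop.height = 480;
   if (mmal_port_format_commit(out) != MMAL_SUCCESS)
      return 1;

   out->buffer_num = out->buffer_num_recommended;
   out->buffer_size = out->buffer_size_recommended;
   if (mmal_port_enable(out, output_cb) != MMAL_SUCCESS)
      return 1;

   /* From here: create a pool, send its buffers to the port, then program
      the sensor over I2C to start streaming - see raspiraw for the rest. */
   return 0;
}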
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

HermannSW
Posts: 1489
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Oct 05, 2018 9:55 pm

I have not done much high framerate capturing lately. Only simple 180fps raspivid videos of a payload dropped from a drone.

I got asked whether the Pi camera can do high framerates at full sensor width; the 640xH high framerate modes did not have sufficient horizontal resolution for a scanner camera application.

I did tests with the v2 camera today: I started with the 3280x2464 tool, and then repeatedly halved the vertical resolution and doubled the framerate.
I ended up at 998fps for 3280x32, but with a frame skip rate that I did not like.
I reduced to a framerate with <1% frame skips, and that was 720fps.
I did the same for 3280x64 and ended up with 500fps.

Summary:
v2 camera can do 3280x32@720fps and 3280x64@500fps with a frame skip rate of less than 1% (tools attached).
Similar is to be expected for the v1 camera (a little more than half the corresponding v2 framerate).
Attachments
3280fast.zip
(1.19 KiB) Downloaded 90 times
⇨https://stamm-wilbrandt.de/en/Raspberry_camera.html

https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://gitlab.freedesktop.org/HermannSW/gst-template
https://github.com/Hermann-SW/fork-raspiraw
https://twitter.com/HermannSW

Mike_green
Posts: 8
Joined: Wed Oct 17, 2018 6:36 am

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Oct 22, 2018 9:00 am

Hi there! Here is our post about AR0144 sensor. Could you give us some tips?
viewtopic.php?f=43&t=225272

HermannSW
Posts: 1489
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: Raw sensor access / CSI-2 receiver peripheral

Sat Jan 05, 2019 6:22 pm

I saw that you got help on the sensor from 6by9 in the other thread.

Just wanted to report that for line scanner type applications, v2 camera recording based on mode1 (1920x1080) can capture 1920x39@998fps with a 1% frame skip rate. I looked into this only because mode1 is one of the no-binning modes. Similar to the 640xH modes being capped at 1007fps, the 1920xH modes are capped at 998fps. In the middle of the image you can see 2 of the 6 Arduino Due ISP pins, left and right of the female headers:
Attachments
1920x39.zip
(616 Bytes) Downloaded 51 times
⇨https://stamm-wilbrandt.de/en/Raspberry_camera.html

https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://gitlab.freedesktop.org/HermannSW/gst-template
https://github.com/Hermann-SW/fork-raspiraw
https://twitter.com/HermannSW

rcasiodu
Posts: 8
Joined: Tue May 28, 2019 2:28 am

Re: Raw sensor access / CSI-2 receiver peripheral

Thu May 30, 2019 9:22 am

6by9 wrote:
Thu Aug 23, 2018 12:21 pm
VinoE wrote:
Thu Aug 23, 2018 12:10 pm
Thanks for your reply 6by9. One last question: after the conversion to CSI-2, what do I need to do in order to access this port on the Raspberry Pi? As far as I've read, this port only supports the Raspberry Pi Camera Module.
You have two choices, both of them requiring some effort from you.

First is that there is a V4L2 driver for the CSI2 peripheral. You need to connect it to a sensor driver, and it then delivers frames back via V4L2.
This is the preferred route.

Second, there is the firmware-based driver being discussed on this thread - an MMAL component called rawcam.
raspiraw is an example app that uses that component. All setup of the sensor is down to the application. raspiraw has example register commands to configure either version of the Pi camera (and an analogue video chip). You need to add the relevant config to set up your sensor.
Hi 6by9, what's the main difference between the two methods you mentioned?
For the first method, can you give me an example of a driver file? Could I just modify ov5647.c (/drivers/media/i2c/ov5647.c) for another sensor? How do I use the V4L2 driver to get raw data from the sensor? Is it possible to convert raw10/raw12 to YUV data (using the ARM core?)

Does "vc.ril.isp" use the same ISP pipeline in bcm283x as the officially supported cameras (ov5647 & imx219)? Or does it just use the GPU core of bcm283x through MMAL?

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7274
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Thu May 30, 2019 9:46 am

rcasiodu wrote:
Thu May 30, 2019 9:22 am
Hi 6by9, what's the main difference between the two methods you mentioned?
The main ones:
- Licence. Any kernel driver almost has to be GPLv2 and therefore open source. Check with your sensor supplier that they are happy for the register set to be released. Copying raspiraw means that the register set can be hidden in your userspace app which can be under any licence you fancy (including closed). It's all a little silly as it is trivial to put an I2C analyser on the lines to the camera module, run your app, and capture the relevant commands.
- Framework. V4L2 is the standard Linux API and is therefore portable to other platforms. MMAL is specific to the Pi.
- Support. I'm less inclined to fix any issues discovered in rawcam than in V4L2. V4L2 is seen as the way forward.
rcasiodu wrote:For the first method, can you give me an example of a driver file? Could I just modify ov5647.c (/drivers/media/i2c/ov5647.c) for another sensor?
/drivers/media/i2c/ov5647.c is a relatively basic driver, but would work. imx258.c is a little more comprehensive, but lacks the dt/fwnode configuration and regulator control (not essential if you set the power control GPIO some other way). I believe ov5640.c is a reasonable example of that.
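To give an idea of the shape, a heavily trimmed skeleton of such a kernel driver might look like the following. This is hypothetical - "mysensor" and the "vendor,mysensor" compatible string are made up, and a real driver also needs the pad format ops, controls and the register tables (see ov5647.c for those):

Code: Select all

// SPDX-License-Identifier: GPL-2.0
/* Hypothetical minimal V4L2 sensor subdevice skeleton, for illustration. */
#include <linux/i2c.h>
#include <linux/module.h>
#include <media/v4l2-subdev.h>

struct mysensor {
	struct v4l2_subdev sd;
	struct media_pad pad;
};

static int mysensor_s_stream(struct v4l2_subdev *sd, int enable)
{
	/* Write the sensor's start/stop streaming registers over I2C here */
	return 0;
}

static const struct v4l2_subdev_video_ops mysensor_video_ops = {
	.s_stream = mysensor_s_stream,
};

static const struct v4l2_subdev_ops mysensor_ops = {
	.video = &mysensor_video_ops,
	/* .pad ops (get_fmt/set_fmt/enum_mbus_code) omitted for brevity */
};

static int mysensor_probe(struct i2c_client *client,
			  const struct i2c_device_id *id)
{
	struct mysensor *s;
	int ret;

	s = devm_kzalloc(&client->dev, sizeof(*s), GFP_KERNEL);
	if (!s)
		return -ENOMEM;

	v4l2_i2c_subdev_init(&s->sd, client, &mysensor_ops);
	s->sd.entity.function = MEDIA_ENT_F_CAM_SENSOR;
	s->pad.flags = MEDIA_PAD_FL_SOURCE;
	ret = media_entity_pads_init(&s->sd.entity, 1, &s->pad);
	if (ret)
		return ret;

	/* Makes the sensor visible for the CSI-2 receiver driver to bind to */
	return v4l2_async_register_subdev(&s->sd);
}

static const struct of_device_id mysensor_of_match[] = {
	{ .compatible = "vendor,mysensor" },
	{ }
};
MODULE_DEVICE_TABLE(of, mysensor_of_match);

static struct i2c_driver mysensor_driver = {
	.driver = {
		.name = "mysensor",
		.of_match_table = mysensor_of_match,
	},
	.probe = mysensor_probe,
};
module_i2c_driver(mysensor_driver);

MODULE_LICENSE("GPL v2");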
rcasiodu wrote:How do I use the V4L2 driver to get raw data from the sensor? Is it possible to convert raw10/raw12 to YUV data (using the ARM core?)
You use the V4L2 API to get raw frames out.
"v4l2-ctl --stream--map=3 --stream-to=foo.raw --stream-count=100" would be a simple existing tool.
https://linuxtv.org/downloads/v4l-dvb-a ... ure.c.html is the standard example for grabbing frames. Do what you want in process_image(). You want to be using the MMAP method, not read (userptr isn't supported).
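If you'd rather have it in your own code than via v4l2-ctl, the MMAP dance is short enough to sketch here (error handling stripped, and the format is assumed to have been set already, e.g. via v4l2-ctl):

Code: Select all

/* Bare-bones V4L2 MMAP capture sketch - error handling stripped. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/ioctl.h>
#include <sys/mman.h>
#include <unistd.h>
#include <linux/videodev2.h>

int main(void)
{
    int fd = open("/dev/video0", O_RDWR);

    struct v4l2_requestbuffers req = {0};
    req.count = 3;
    req.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    req.memory = V4L2_MEMORY_MMAP;     /* MMAP, not read(); userptr isn't supported */
    ioctl(fd, VIDIOC_REQBUFS, &req);

    void *mem[3];
    for (unsigned i = 0; i < req.count; i++) {
        struct v4l2_buffer buf = {0};
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = i;
        ioctl(fd, VIDIOC_QUERYBUF, &buf);
        mem[i] = mmap(NULL, buf.length, PROT_READ | PROT_WRITE,
                      MAP_SHARED, fd, buf.m.offset);
        ioctl(fd, VIDIOC_QBUF, &buf);  /* hand the buffer to the driver */
    }

    enum v4l2_buf_type type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
    ioctl(fd, VIDIOC_STREAMON, &type);

    for (int n = 0; n < 100; n++) {
        struct v4l2_buffer buf = {0};
        buf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
        buf.memory = V4L2_MEMORY_MMAP;
        ioctl(fd, VIDIOC_DQBUF, &buf); /* blocks until a frame is ready */
        /* buf.bytesused bytes of raw data at mem[buf.index] - process here */
        printf("frame %d: %u bytes\n", n, buf.bytesused);
        ioctl(fd, VIDIOC_QBUF, &buf);  /* recycle the buffer */
    }

    ioctl(fd, VIDIOC_STREAMOFF, &type);
    close(fd);
    return 0;
}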

The driver delivers the raw data, so that is whatever the sensor produces. You could try converting Bayer to YUV on the ARM, but you get a LOT of data, and the image processing required is not exactly lightweight.
Raw10/12 are a pain to unpack. There is an option to expand the raw10/12 to 16bpp (lowest bits used), but it's not exposed at the moment. unicam_set_packing_config does the relevant config, but currently mbus_depth will always equal v4l2_depth.
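For reference, CSI-2 packed RAW10 carries 4 pixels in 5 bytes: 4 bytes of the top 8 bits, then one byte holding the four pairs of low bits. An unpack to 16bpp therefore looks something like this (a sketch; real buffers also have per-line stride padding to skip):

Code: Select all

/* Sketch: unpack CSI-2 packed RAW10 into 16bpp, lowest bits used.
 * Every 5 bytes hold 4 pixels: 4 bytes of the top 8 bits, then one byte
 * carrying the 4 pairs of low bits. Line stride padding not handled. */
#include <stddef.h>
#include <stdint.h>

void unpack_raw10(const uint8_t *in, uint16_t *out, size_t num_pixels)
{
    for (size_t i = 0; i < num_pixels; i += 4, in += 5) {
        uint8_t low = in[4];
        out[i + 0] = (in[0] << 2) | ( low       & 0x03);
        out[i + 1] = (in[1] << 2) | ((low >> 2) & 0x03);
        out[i + 2] = (in[2] << 2) | ((low >> 4) & 0x03);
        out[i + 3] = (in[3] << 2) | ((low >> 6) & 0x03);
    }
}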

You can link V4L2 with MMAL. My fork of yavta (Yet Another V4L2 Test Application - https://github.com/6by9/yavta) does that, and uses dmabufs so there is no copying required of the data between the two subsystems.
Similarly in the 4.19 kernel there is now a V4L2 wrapper around the ISP which recent versions of GStreamer should talk to as v4l2videoconvert.

There is some work going on with http://libcamera.org/ to produce a standardised Linux framework for complex camera subsystems too.
rcasiodu wrote:Does "vc.ril.isp" use the same ISP pipeline in bcm283x as the officially supported cameras (ov5647 & imx219)? Or does it just use the GPU core of bcm283x through MMAL?
Yes, it is the same hardware block, but with none of the control loops (AE/AGC/AWB/lens shading) running on the data.
On my list of tasks to support the libcamera stuff is to expose the statistics from the ISP so that those algorithms can be run more easily from userspace.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

rcasiodu
Posts: 8
Joined: Tue May 28, 2019 2:28 am

Re: Raw sensor access / CSI-2 receiver peripheral

Thu May 30, 2019 12:18 pm

Thanks 6by9, your answers are very helpful. Now I understand the differences and have more confidence about what I'm trying to do.

bhjel
Posts: 17
Joined: Tue Jan 01, 2019 2:00 am

Re: Raw sensor access / CSI-2 receiver peripheral

Sat Jun 01, 2019 10:39 am

Does the GPU firmware limit the V1 cam to 90fps at 320x240 despite the ov5647 sensor supporting 120fps at 320x240? I have modified the kernel module to bump the cap to 120fps, but I can only achieve 120fps with the v2 camera.

HermannSW
Posts: 1489
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: Raw sensor access / CSI-2 receiver peripheral

Sat Jun 01, 2019 9:12 pm

bhjel wrote:
Sat Jun 01, 2019 10:39 am
Does the GPU firmware limit the V1 cam to 90fps at 320x240 despite the ov5647 sensor supporting 120fps at 320x240?
Yes.
I have modified the kernel module to bump the cap to 120fps, but I can only achieve 120fps with the v2 camera.
What kernel module?
It is not the open source raspivid userspace program that caps the framerate at 90fps, but the closed source GPU code.

In my userland fork I did I2C command injection for framerates >90fps for both the v1 and v2 cameras.
I proved that the v2 camera can do videos at up to 200fps, and that the cap of 120fps at that time was too conservative.
After that, 6by9 bumped the maximal framerate to 200fps (for v2 camera mode 7 only) in the closed source GPU code.
Please see the remarks below the v2 camera mode table in the documentation on framerates above 120fps:
https://www.raspberrypi.org/documentati ... /camera.md

Here you can see my code that allows an (arbitrary) overwrite of the framerate, bypassing the GPU restriction:
https://github.com/Hermann-SW/userland/ ... id.c#L1682

The v1 camera already runs near the bandwidth limit for mode 7 at 90fps :-(
You get valid video for "-fps 95", wrong colour video for "-fps 97", and no video for framerates >97fps.


That is for mode 7. While there is no 320x240 mode, I made one for raspiraw, and that showed that the v1 camera can do 320x240@120fps (and even more); raspiraw can even do 320x240@202fps with the v2 camera. That mode is not that interesting though, as it does not work above 202fps because of a known GPU bug with a won't-fix decision:
https://github.com/6by9/userland/issues/14


A 320x240 mode 8 would make sense for the v1 camera only, since the v2 camera can do 640x480@200fps. Because it would make sense for the v1 camera only, I doubt that 6by9 would consider adding such a mode 8 for raspivid(yuv) and raspistill.

Here is the v1 camera raspiraw tool "640x240" that captures at 180fps:
https://github.com/6by9/raspiraw/blob/m ... ls/640x240
⇨https://stamm-wilbrandt.de/en/Raspberry_camera.html

https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://gitlab.freedesktop.org/HermannSW/gst-template
https://github.com/Hermann-SW/fork-raspiraw
https://twitter.com/HermannSW

bhjel
Posts: 17
Joined: Tue Jan 01, 2019 2:00 am

Re: Raw sensor access / CSI-2 receiver peripheral

Sat Jun 01, 2019 9:35 pm

Thank you for the detailed post. I am currently studying your changes and 6by9's work. I am using the V4L2 driver for my application, which has an additional 90fps cap that limited even the v2 sensor before my change:

"#define FPS_MAX 90"
https://github.com/raspberrypi/linux/bl ... 5-camera.c

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 7274
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Sun Jun 02, 2019 6:46 am

bhjel wrote:
Sat Jun 01, 2019 9:35 pm
Thank you for the detailed post. I am currently studying your changes and 6by9's work. I am using the V4L2 driver for my application, which has an additional 90fps cap that limited even the v2 sensor before my change:

"#define FPS_MAX 90"
https://github.com/raspberrypi/linux/bl ... 5-camera.c
This is really the wrong thread to be discussing anything to do with the bcm2835-v4l2 driver as that is sitting on top of the full camera stack, not rawcam and manually programming register sets.

I have had a reminder open for a while to try and make the max frame rate from V4L2 vary based on sensor, but I haven't got around to fixing it. https://github.com/raspberrypi/linux/issues/1774

In theory a QVGA 180fps mode could be added for the V1 sensor (assuming a register set from Omnivision actually worked), but the use cases are so limited that it isn't a priority. We haven't sold the V1 sensor since 2015 and I don't recall any requests for it until now.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

HermannSW
Posts: 1489
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: Raw sensor access / CSI-2 receiver peripheral

Wed Aug 07, 2019 3:08 pm

Robert Elder did a blog post some days ago:
"A Guide to Recording 660FPS Video On A $6 Raspberry Pi Camera"
http://blog.robertelder.org/recording-6 ... pi-camera/

There was an intense discussion about this on Hacker News:
https://news.ycombinator.com/item?id=20627574

He explains (in the blog post as well as in YouTube videos) each step required, starting from a fresh Raspbian Buster SD card image and ending at a working high framerate raspiraw environment, making use of raspiraw's high framerate options.

He creates videos from the RAM-captured frames in post-processing, with a self-written Python script that keeps the frames at the timestamps at which they were captured.

He neither uses nor mentions the capturing tools under the "tools/" directory, nor raw2ogg2anim for creating an .ogg video and an animated .gif from a subset of the captured frames at a specified target framerate.

He explains the lighting problems one typically sees when trying this for the first time, and options for dealing with them.

Film clips from Robert's two youtube videos, nice!
⇨https://stamm-wilbrandt.de/en/Raspberry_camera.html

https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://gitlab.freedesktop.org/HermannSW/gst-template
https://github.com/Hermann-SW/fork-raspiraw
https://twitter.com/HermannSW

rayzrocket
Posts: 3
Joined: Tue Jul 30, 2019 4:57 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Aug 12, 2019 11:54 pm

Great work! Is it possible to acquire or read the frames into a numpy array using the aforementioned methods?
I tried using BytesIO streams in a multithreaded approach learned from Tobias Kuhn's Ping Pong Pi - he is brilliant! But it takes 70ms just to struct.unpack a 640x480x3 colour frame from memory. I also tried reading only pixel locations above a threshold to pick out objects of interest, as Kuhn does, but this also took significant time at 640x480. But writing frames individually to .jpg files worked up to 78fps, verified by recording a rotating marked fan blade.
Do I have any chance of extending your work to less than 10ms frame capture into a useful processable array? Has it already been done?

HermannSW
Posts: 1489
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: Raw sensor access / CSI-2 receiver peripheral

Tue Aug 13, 2019 9:02 am

rayzrocket wrote:
Mon Aug 12, 2019 11:54 pm
Great work! Is it possible to acquire or read the frames into a numpy array using the aforementioned methods?
I doubt that numpy is fast enough to do any meaningful work in the low time available (640x75@1007fps or 640x128@665fps leaves you only 1ms/1.5ms of processing per frame).
But writing frames individually to .jpg files worked up to 78fps, verified by recording a rotating marked fan blade.
Do I have any chance of extending your work to less than 10ms frame capture into a useful processable array? Has it already been done?
You get the frames in a processable buffer, just 10 bits of raw Bayer data for each pixel. I did automatic camera tilt calibration code, see this posting for the details. As a first step I created a 320x240 grey8 image from the 640x480 frame captured. Then I did the image processing on the grey8 image and controlled the stepper motor:
https://www.raspberrypi.org/forums/view ... 1#p1231151

But I did that on 640x480@90fps raspiraw capture from the v1 camera -- today I would do it on 640x480@90fps raspividyuv output, which produces YUV images, where the 640x480 Y plane alone already is the grey8 image:
https://www.raspberrypi.org/forums/view ... v#p1479262
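The plumbing for that is small; a sketch (untested) of reading 640x480 I420 frames from raspividyuv on stdin and treating the Y plane as the grey8 image:

Code: Select all

/* Untested sketch: read 640x480 I420 frames from raspividyuv on stdin,
 * treat the first W*H bytes (the Y plane) as a grey8 image, e.g.:
 *   raspividyuv -t 0 -w 640 -h 480 -fps 90 -n -o - | ./grey
 */
#include <stdint.h>
#include <stdio.h>

#define W 640
#define H 480

int main(void)
{
    static uint8_t frame[W * H * 3 / 2];      /* Y plane + U + V */

    while (fread(frame, 1, sizeof frame, stdin) == sizeof frame) {
        const uint8_t *grey = frame;          /* Y plane = grey8 image */
        unsigned long sum = 0;
        for (int i = 0; i < W * H; i++)       /* e.g. mean brightness */
            sum += grey[i];
        fprintf(stderr, "mean Y: %lu\n", sum / (W * H));
    }
    return 0;
}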

Processing for 90fps leaves you more time, 11.1ms per frame.

High framerate video capturing does not leave you that time; just storing into a ramdisk and doing everything else in post-processing is the way to go there.

P.S:
You can do frame processing in a gstreamer plugin as well; you can take my redfilter example as a basis and start from there:
https://www.raspberrypi.org/forums/view ... 4#p1513568
⇨https://stamm-wilbrandt.de/en/Raspberry_camera.html

https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://gitlab.freedesktop.org/HermannSW/gst-template
https://github.com/Hermann-SW/fork-raspiraw
https://twitter.com/HermannSW

rayzrocket
Posts: 3
Joined: Tue Jul 30, 2019 4:57 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Wed Aug 14, 2019 5:20 am

Thank you Hermann! I am a novice, please forgive me.
Yes, 640x480 at 90fps is great for me, and I would like to apply custom image processing within the frame time (realtime).

There are two fundamental approaches (maybe more):
1. Sequential: Capture Frame > Process > Capture Frame > ...
2. Parallel (I believe what you mention), requires multiple threads? :
Capture Frame > Process > Capture Frame....
...... Process > Capture Frame > Process ....

Much depends on actual 'capture' time vs. 'fps'. For example, a camera can provide 60fps but consume 95% of the time between frame capture starts. The same camera set to 30fps consumes 48% of the time between frame captures, allowing up to 15ms for processing.

My novice approach gets 30fps including basic processing (6ms), using a Logitech C920 USB webcam with cv2 in Python 3.

Code: Select all

import cv2

cam = cv2.VideoCapture(0)  # use the USB webcam
ret_val, frame = cam.read()
I use a numpy array of the RGB 640x480 frame to perform a basic custom minima search algo to track a dark spot. I can convert the algo to work on a flattened list of pixel values, which is what I expect to be available in the GStreamer C code approach you mention; I need to investigate.
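In C over such a flattened grey buffer, I imagine the minima search itself would only be a few lines, something like this (untested sketch):

Code: Select all

/* Untested sketch: dark-spot (minimum) search over a flattened grey
 * buffer of w*h bytes, e.g. the Y plane inside a gstreamer plugin. */
#include <stddef.h>
#include <stdint.h>

void find_darkest(const uint8_t *buf, int w, int h, int *px, int *py)
{
    size_t best = 0;
    for (size_t i = 1; i < (size_t)w * h; i++)
        if (buf[i] < buf[best])
            best = i;
    *px = (int)(best % w);  /* column of the darkest pixel */
    *py = (int)(best / w);  /* row of the darkest pixel */
}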

My goal is to do this at 90fps; I'll be happy with 60fps. And using a low cost cam such as the V2 instead of a USB camera.

I will study your gstreamer piping work. Can I access individual pixels using your gstredfilter.c approach? Is it just a list of bytes?

HermannSW
Posts: 1489
Joined: Fri Jul 22, 2016 9:09 pm
Location: Eberbach, Germany

Re: Raw sensor access / CSI-2 receiver peripheral

Wed Aug 14, 2019 8:12 pm

rayzrocket wrote:
Wed Aug 14, 2019 5:20 am
There are two fundamental approaches (maybe more):
1. Sequential: Capture Frame > Process > Capture Frame > ...
2. Parallel (I believe what you mention), requires multiple threads? :
Capture Frame > Process > Capture Frame....
...... Process > Capture Frame > Process ....
Just raspividyuv capturing and piping into a gstreamer pipeline, so 2 processes.
Much depends on actual 'capture' time vs. 'fps'. For example, a camera can provide 60fps but consume 95% of the time between frame capture starts. The same camera set to 30fps consumes 48% of the time between frame captures, allowing up to 15ms for processing.

True.
I use a numpy array of the RGB 640x480 frame to perform a basic custom minima search algo to track a dark spot.
I found a two part article on writing a gstreamer plugin in python:
https://mathieuduponchelle.github.io/20 ... disclaimer
Writing GStreamer elements in python is usually a terrible idea:
  • Python is slow, actual data processing should be avoided at all cost, and instead delegated to C libraries such as numpy, which is exactly what we'll do in this part.
  • The infamous GIL enforces serialization, which means python elements will not be able to take advantage of the multithreading capabilities of modern platforms.
  • ...
The only valid reasons for ignoring these restrictions are, to the best of my knowledge:
  • Python is the only language you know how to use.
  • You want to use a python package that has no equivalent elsewhere, for example for scientific computing.
  • ...
This backs up what I said: you should write real-time image processing code in C, not in Python.


My goal is to do this at 90fps; I'll be happy with 60fps. And using a low cost cam such as the V2 instead of a USB camera.

I will study your gstreamer piping work. Can I access individual pixels using your gstredfilter.c approach? Is it just a list of bytes?
Yes, just a byte buffer:
https://gitlab.freedesktop.org/HermannS ... ter.c#L303
The gst_buffer "buf" is mapped to an info structure, and info.data points to the first byte of the YUV data.
Byte pointers y, u and v step through the frame.
Y is 640x480 bytes; U and V are 320x240 bytes each (one U and one V value per 2x2 block of Y values).
y steps through Y, every even column/row; u and v step bytewise through the corresponding U and V locations:

Code: Select all

      gst_buffer_map (buf, &info, GST_MAP_READWRITE);
      assert(w*h*3 == info.size*2); /* I420: Y plane plus quarter-size U and V planes */

      U=info.data+w*h; /* start of the U plane, i.e. end of the Y plane */
      for(i=0, y=info.data, u=U, v=U+w*h/4; y<U; i+=2, y+=2, ++u, ++v) {
        if (i==w) { i=0; y+=w; if (y==U) break; } /* end of row pair reached: skip the odd Y row */

These lines compute the red value for the left-top, right-top, left-bottom and right-bottom pixels of the 2x2 Y area after the YUV2RGB transform:

Code: Select all

        rlt = YUV2R(y[0+0],*v);
        rrt = YUV2R(y[0+1],*v);
        rlb = YUV2R(y[w+0],*v);
        rrb = YUV2R(y[w+1],*v);

This is the back computation of new Y values for the 2x2 area for the RGB2YUV transform with R=r, B=0 and G=0:

Code: Select all

        y[0+0] = R2Y(rlt);
        y[0+1] = R2Y(rrt);
        y[w+0] = R2Y(rlb);
        y[w+1] = R2Y(rrb);

This is the back computation of new U and V values for the RGB2YUV transform; after the loop ends, the buffer gets unmapped:

Code: Select all

        r = (rlt+rrt+rlb+rrb)/4.0;

        *u = R2U(r);
        *v = R2V(r);
      }
      assert(y==U);
      assert(u==U+w*h/4);
      assert(v==U+w*h/2);

      gst_buffer_unmap (buf, &info);

In this posting I computed the time for redfilter in milliseconds (14.3ms on Pi3A, at 31fps framerate):
https://www.raspberrypi.org/forums/view ... 6#p1519031

In case you only want to compute something and don't need video output, I replaced the part after redfilter with just "! fakesink":

Code: Select all

$ raspividyuv -t 0 -md 7 -w 640 -h 480 -fps 200 -n -o - | GST_PLUGIN_PATH=/usr/local/lib/gstreamer-1.0/ gst-launch-1.0 -v fdsrc ! queue ! "video/x-raw, width=640, height=480, format=I420, framerate=200/1" ! rawvideoparse use-sink-caps=true ! redfilter ! fakesink
With that you can do 200fps (the maximal v2 camera raspivid[yuv] framerate), with at least 14.3ms of computation per 640x480 YUV frame:

Code: Select all

...
640x480(I420) 3800000000 14,26ms
640x480(I420) 3805000000 14,31ms
640x480(I420) 3810000000 14,24ms
640x480(I420) 3815000000 14,26ms
640x480(I420) 3820000000 14,25ms
640x480(I420) 3825000000 14,31ms
^Cmmal: Aborting program
...

All this is done with raspividyuv, so it does not belong in this (raspiraw) thread. For further discussion you should open a new thread.

P.S:
Average computation time per pixel is 14.3ms/(640*480)=46.6ns (or 65 clock cycles at 1.4GHz).

P.P.S:
I don't fully understand yet what we see: 200*14.3ms=2.86s of processing per second, but the gstreamer pipeline has a CPU utilization of only 115% ...

Code: Select all

...
  PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND   
 1171 pi        20   0   42328  10124   5760 S 114,9  2,7   2:41.08 gst-laun+ 
 1170 pi        20   0   55564    864    740 S   9,9  0,2   0:14.29 raspivid+ 
...

P.P.P.S:
200fps was indeed wrong; adding a new real timestamp allowed me to determine that the Pi3A+ with fakesink output allows for 54fps with 14.3ms of (red)filter processing time:
https://www.raspberrypi.org/forums/view ... 3#p1519593

In total 54*14.3ms=772ms of gstreamer buffer processing per 1 second, or 77.2%.
⇨https://stamm-wilbrandt.de/en/Raspberry_camera.html

https://github.com/Hermann-SW/Raspberry_v1_camera_global_external_shutter
https://gitlab.freedesktop.org/HermannSW/gst-template
https://github.com/Hermann-SW/fork-raspiraw
https://twitter.com/HermannSW
