6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 8117
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jan 24, 2020 12:47 pm

No, we don't program the NVM on the IMX219.
Note too that that is a programmed DPC approach. The ISP has an automatic DPC block that looks for significant discrepancies between a pixel and the surrounding pixels of the same colour, and compensates should the difference be above a threshold. That sort of approach can't be done on the sensor as it requires multiple lines of context.
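As a rough illustration, a neighbourhood-based DPC of this kind can be sketched in C as follows. The function name, the fixed 2-pixel same-colour window and the threshold rule are illustrative assumptions, not the ISP's actual implementation:

```c
#include <stdint.h>

/* Illustrative defective-pixel check on a Bayer plane: compare a pixel
 * against its four same-colour neighbours (2 px away in x/y) and clamp
 * it to their range if it deviates by more than `thresh`.  This mirrors
 * the idea described above; the real ISP block is more sophisticated. */
static uint16_t dpc_pixel(const uint16_t *img, int w, int h,
                          int x, int y, uint16_t thresh)
{
    uint16_t p = img[y * w + x];
    if (x < 2 || y < 2 || x >= w - 2 || y >= h - 2)
        return p;                       /* no full context at the border */

    uint16_t n[4] = {
        img[(y - 2) * w + x], img[(y + 2) * w + x],
        img[y * w + (x - 2)], img[y * w + (x + 2)]
    };
    uint16_t lo = n[0], hi = n[0];
    for (int i = 1; i < 4; i++) {
        if (n[i] < lo) lo = n[i];
        if (n[i] > hi) hi = n[i];
    }
    /* Only correct pixels well outside the neighbour range. */
    if (p > hi && (uint16_t)(p - hi) > thresh) return hi;
    if (p < lo && (uint16_t)(lo - p) > thresh) return lo;
    return p;
}
```

The need for pixels two rows above and below is exactly why this requires the multiple lines of context mentioned above.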
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

devmonkey
Posts: 16
Joined: Tue Jul 05, 2016 7:38 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Tue Jan 28, 2020 10:06 am

I've just moved my project onto a Pi Zero and run into a lack of CPU. I only use the red pixels and currently have to copy them out of the Bayer array, which accounts for about 25% of my workload. On the OV5647, do you think I could use the TIMING_X/Y_INC registers to get it to read out just the red pixels? The sensor CFA is arranged:

Code: Select all

  	0	1	2	3
0:	B	G	B	G
1:	G	R	G	R
2:	B	G	B	G
3:	G	R	G	R
Red pixels are on odd rows in odd columns. TIMING_X/Y_INC use the upper nibble for the odd increment and the lower nibble for the even increment. So if I ask for a sensor window that starts on an odd row/col, e.g. [1,1], the first pixel will be a red pixel. To sample odd columns and rows from that point on, what do you think of setting TIMING_Y_INC=TIMING_X_INC=0x21?

The lower nibble (0x1) would be irrelevant since we never land on an even row/column. If I dropped the sensor down to RAW8 and made the window width a multiple of 32 (to avoid padding), I would have a ready-made red pixel array.
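The guessed semantics can be modelled in C to see which pixel positions a given INC value would visit. The nibble interpretation here (upper nibble applied from an odd coordinate, lower from an even one) is the poster's assumption, not confirmed sensor behaviour:

```c
#include <stddef.h>

/* Hypothetical model of the guessed TIMING_X/Y_INC semantics: the upper
 * nibble is the increment applied from an odd coordinate, the lower
 * nibble from an even coordinate.  Fills `out` with the coordinates
 * that would be read out, starting at `start`, and returns the count. */
static size_t skip_positions(unsigned char inc, int start, int limit,
                             int *out, size_t max)
{
    int odd_inc  = (inc >> 4) & 0xf;
    int even_inc = inc & 0xf;
    size_t n = 0;
    for (int pos = start; pos < limit && n < max;
         pos += (pos & 1) ? odd_inc : even_inc)
        out[n++] = pos;
    return n;
}
```

Under this model, 0x21 starting at column 1 would land on 1, 3, 5, 7, ... exactly the red columns, and the lower nibble would indeed never be used.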

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 8117
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Tue Jan 28, 2020 10:42 am

devmonkey wrote:
Tue Jan 28, 2020 10:06 am
I've just moved my project onto a Pi Zero and run into a lack of CPU. I only use the red pixels and currently have to copy them out of the Bayer array, which accounts for about 25% of my workload. On the OV5647, do you think I could use the TIMING_X/Y_INC registers to get it to read out just the red pixels? The sensor CFA is arranged:

Code: Select all

  	0	1	2	3
0:	B	G	B	G
1:	G	R	G	R
2:	B	G	B	G
3:	G	R	G	R
Red pixels are on odd rows in odd columns. TIMING_X/Y_INC use the upper nibble for the odd increment and the lower nibble for the even increment. So if I ask for a sensor window that starts on an odd row/col, e.g. [1,1], the first pixel will be a red pixel. To sample odd columns and rows from that point on, what do you think of setting TIMING_Y_INC=TIMING_X_INC=0x21?

The lower nibble (0x1) would be irrelevant since we never land on an even row/column. If I dropped the sensor down to RAW8 and made the window width a multiple of 32 (to avoid padding), I would have a ready-made red pixel array.
Sorry. I have to restate it again. https://github.com/6by9/raspiraw/blob/m ... odes.h#L30
// These register settings were as logged off the line
// by jbeale. There is a datasheet for OV5647 floating
// about on the internet, but the Pi Foundation/Trading have
// information from Omnivision under NDA, therefore
// we can not offer support on this.
// There is some information/discussion on the Freescale
// i.MX6 forums about supporting OV5647 on that board.
// There may be information available there that is of use.
//
// REQUESTS FOR SUPPORT ABOUT THESE REGISTER VALUES WILL
// BE IGNORED.
One thing you could do is use the unpacking/repacking options of the receiver to convert to 16-bit packing (data in the bottom 10 bits) rather than the weird SMIA 4-pixels-in-5-bytes arrangement. That should save you a load of masking and bitshifting.
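For comparison, here is a sketch of what that unpacking costs in software, assuming the standard CSI-2/SMIA RAW10 layout (four bytes of high bits followed by one byte carrying the 2 LSBs of each pixel); the hardware repacker avoids this work entirely:

```c
#include <stdint.h>
#include <stddef.h>

/* Unpack CSI-2/SMIA RAW10 (4 pixels in 5 bytes) into 16-bit values with
 * the sample in the bottom 10 bits, i.e. the layout the receiver's
 * repacking option would give you directly.  `npix` must be a multiple
 * of 4. */
static void unpack_raw10(const uint8_t *in, uint16_t *out, size_t npix)
{
    for (size_t i = 0; i < npix; i += 4, in += 5) {
        uint8_t low = in[4];            /* 2 LSBs of each of the 4 pixels */
        for (int j = 0; j < 4; j++)
            out[i + j] = (uint16_t)((in[j] << 2) | ((low >> (2 * j)) & 0x3));
    }
}
```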
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

devmonkey
Posts: 16
Joined: Tue Jul 05, 2016 7:38 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Tue Jan 28, 2020 3:20 pm

Ok, no worries. I did a few tests and couldn't get the sensor to send just a single pixel per Bayer quad; I don't think my (guessed) understanding of TIMING_X/Y_INC is correct.

Anyway, good news: I've solved my perf problem. My code rips the red pixels out of a 2592x480 rawcam Bayer buffer into a 1296x240 array, does a bunch of processing, then draws the result and a load of stats onto the 320x240 framebuffer and fwrites that to /dev/fb1. It requests frames from rawcam at 50 fps.
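The red-pixel copy step can be sketched roughly like this (function name and stride handling are illustrative, not the actual project code):

```c
#include <stdint.h>

/* Copy the red samples (odd rows, odd columns in the BGGR layout shown
 * earlier) out of a RAW8 Bayer buffer into a half-width, half-height
 * plane.  `stride` is the (possibly padded) line length of the rawcam
 * buffer in bytes. */
static void extract_red(const uint8_t *bayer, int width, int height,
                        int stride, uint8_t *red)
{
    for (int y = 1; y < height; y += 2) {
        const uint8_t *row = bayer + y * stride;
        for (int x = 1; x < width; x += 2)
            *red++ = row[x];
    }
}
```

A tight loop like this over ~1.2 MB per frame is exactly the sort of code where the compiler optimisation level discussed below makes a large difference.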

I had a noticeable drop in the FPS I could process at after moving the code to a Pi Zero W. To write this code I did as others have: cloned raspiraw and changed it to my needs. I didn't change any of the options in the build script, and having just looked, GCC optimisations are off. Bumping up the GCC optimisation fixed my performance problem, so anyone doing a lot of looping through pixel arrays in raspiraw-based code and having an issue with speed should check the -OX option in ./buildme.

I went from 16 FPS with the default setting -O0 up to 48 FPS with -O2 or -Ofast; quite a massive improvement! This is on a PZW with wifi on and an ILI9341 SPI TFT 320x240 using the default Raspbian fbtft device on fb1.

Here are the results of my particular app for different optimisation levels:

Code: Select all

-O0
FPS: 16.66
FPS: 16.67
FPS: 16.68
FPS: 16.68

-O1
FPS: 29.29
FPS: 33.76
FPS: 34.14
FPS: 29.97

-O2
FPS: 48.43
FPS: 47.85
FPS: 47.63

-O3
FPS: 45.84
FPS: 42.65
FPS: 45.45

-Ofast
FPS: 47.22
FPS: 48.44

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 8117
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Tue Jan 28, 2020 3:42 pm

devmonkey wrote:
Tue Jan 28, 2020 3:20 pm
Ok no worries. I did a few tests and couldn't get the sensor to just send a single pixel per bayer quad, I don't think my (guessed) understanding TIMING_X/Y_INC is correct.
Whilst I can't directly help, I will say that you need to alter other registers to define the readout size, not just set the _INC registers.
Modes 6 & 7 do use _INC to implement pixel skipping, although with odd values rather than your even ones. As a test you could alter those and see if the sensor will actually stream normally before looking at amending the full-res mode.
devmonkey wrote:I had a noticeable drop in the FPS I could process at after moving the code to a Pi Zero W. To write this code I did as others have: cloned raspiraw and changed it to my needs. I didn't change any of the options in the build script, and having just looked, GCC optimisations are off. Bumping up the GCC optimisation fixed my performance problem, so anyone doing a lot of looping through pixel arrays in raspiraw-based code and having an issue with speed should check the -OX option in ./buildme.

I went from 16 FPS with the default setting -O0 up to 48 FPS with -O2 or -Ofast; quite a massive improvement! This is on a PZW with wifi on and an ILI9341 SPI TFT 320x240 using the default Raspbian fbtft device on fb1.
I use raspiraw for lower-level debugging, so disabling optimisations means gdb can actually do something useful instead of reporting "optimised out" for all the variables you're interested in! Yes, enabling them will gain some performance (a fair amount, from your report).
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

rego21
Posts: 30
Joined: Fri Feb 16, 2018 4:09 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Sun Feb 09, 2020 11:34 am

lak4cyut wrote:
Wed Aug 21, 2019 9:30 am
6by9 wrote:
Thu May 30, 2019 9:46 am
The main ones:
- Licence. Any kernel driver almost has to be GPLv2 and therefore open source. Check with your sensor supplier that they are happy for the register set to be released. Copying raspiraw means that the register set can be hidden in your userspace app which can be under any licence you fancy (including closed). It's all a little silly as it is trivial to put an I2C analyser on the lines to the camera module, run your app, and capture the relevant commands.
- Framework. V4L2 is the Linux standard APIs and is therefore portable to other platforms. MMAL is specific to the Pi.
- Support. I'm less inclined to fix any issues discovered in rawcam than in V4L2. V4L2 is seen as the way forward.
rcasiodu wrote:In the first method, can you give me an example of drive file? Could I just modify the ov5647.c(/drivers/media/i2c/ov5647.c) to other sensor?
/drivers/media/i2c/ov5647.c is a relatively basic driver, but would work. imx258.c is a little more comprehensive, but lacks the dt/fwnode configuration and regulator control (not essential if you set the power control GPIO some other way). I believe ov5640.c is a reasonable example of that.
rcasiodu wrote:How to use the v4l2 driver to get raw data from sensor? Is it possible to transfer raw10/raw12 to yuv data?(use arm core?)
You use the V4L2 API to get raw frames out.
"v4l2-ctl --stream-mmap=3 --stream-to=foo.raw --stream-count=100" would be a simple existing tool.
https://linuxtv.org/downloads/v4l-dvb-a ... ure.c.html is the standard example for grabbing frames. Do what you want in process_image(). You want to be using the MMAP method, not read (userptr isn't supported).
Hi 6by9,
Thanks for your information.
I built a custom kernel to enable the bcm2835-unicam and ov5647 kernel modules, added the ov5647 DT overlay, and tried to capture a camera image via the official V4L2 interface. (Of course I bought an official Raspberry Pi camera module, the OV5647 version.)
I then used v4l2-ctl to capture an image:

Code: Select all

v4l2-ctl --device /dev/video0 --stream-mmap --stream-to=frame.raw --stream-count=1
But the captured image is abnormal (overlapping and distorted; I've put the images below).
Could you give me some hints on how to debug this problem, or do you know what caused it?
It's driving me crazy...

Thanks for the help.

frame.jpg
frame2.jpg

Detail info:
Kernel base version: 4.19.64
Raspberry Pi: 3 model B v1.2
Camera Module: Raspberry Pi (OV5647) Rev 1.3
Hi,

I'm also trying to enable bcm2835-unicam. Can you explain the required steps? I read that only the DT needs to be changed.

Thank you!
