fastmapper wrote:I thought that the majority of Bayer image sensors supported both automatic gain control (AGC) and automatic white balance (AWB). Looking at Omnivision's current 5 megapixel offerings I find that the OV5640, OV5645, OV5647, OV5648, OV5650, and the OV5653 all support both AGC and AWB. The OV5656, OV5658, OV5680, and the OV5690 apparently do not. So of 10 parts (all Bayer image sensors), 6 (the majority) support those functions.
By YUV sensor I mean one that has the capability to read out YUV (or RGB). To do so you need to have a whole load of processing blocks.
Therefore of your list, OV5640 and OV5645 to me are both YUV sensors. Of your remaining sensors you have 4 with and 4 without - no overall majority.
Original request was for the Aptina MT9J003 - no AGC or AWB (although you do get auto black level).
I have access to datasheets for a lot of sensors at the office, but most are under NDA so I can't disclose them or their details. Certainly most of the Sony and Toshiba Bayer sensors that I'm aware of don't support AGC or AWB.
Further Googling:
Aptina MT9P031
http://www.aptina.com/assets/downloadDocument.do?id=76 - nope.
Aptina MT9P011
http://www.aptina.com/assets/downloadDocument.do?id=177 - nope.
Aptina MT9P012
http://www.aptina.com/assets/downloadDocument.do?id=178 - AGC (along with a scaler and other processing steps), but no AWB.
AGC and AWB both require the image to be processed by something to collect stats and then complete the control loop. It's up to the manufacturer as to whether they want to add those silicon blocks into their sensor - sometimes they do, sometimes they don't.
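To make that control loop concrete, here's a toy sketch of an AGC iteration (all names, targets, and constants are mine for illustration, not from any datasheet): collect a statistic from the raw frame, then nudge the gain toward the exposure target.

```python
import numpy as np

def agc_step(raw_frame, gain, target_mean=512.0, k=0.1,
             gain_min=1.0, gain_max=16.0):
    """One iteration of a simple AGC control loop.

    Stats collection: mean raw level of the Bayer frame.
    Control: damped multiplicative correction toward the target mean.
    """
    mean = raw_frame.mean()
    if mean > 0:
        error_ratio = target_mean / mean   # > 1 means underexposed
        gain *= error_ratio ** k           # damped so it converges over frames
    return float(np.clip(gain, gain_min, gain_max))

# A dim 10-bit frame (mean 256, target 512) pushes the gain up slightly
frame = np.full((8, 8), 256, dtype=np.uint16)
new_gain = agc_step(frame, gain=2.0)
```

On a real sensor the stats block and the gain register live in silicon; the point is just that *something* has to close this loop, whether that's on-sensor logic or host software.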
fastmapper wrote:Wait a minute! The OV5647 is used in the Raspberry Pi camera. That means it has a Bayer image sensor including built-in support for both AGC and AWB. Using those functions on the image sensor would offload the processing required by both the GPU and the ARM processor on the Raspberry Pi.
Yes you could simplify the control algos if you choose an appropriate sensor and trust the module vendor's algo.
There's also the little issue of making the image usable, which most people require - they don't want the Bayer pattern, so you've got demosaicing and all the other image processing steps (lens shading, distortion correction, denoising, resizing, etc.) to do somewhere. Where are you doing those steps? I doubt the hardware ISP is going to be opened up (it's too bl**dy complicated, apart from anything else!).
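For a feel of what even the simplest demosaic involves, here's a deliberately crude sketch (my own toy, nothing like a real ISP): collapse each 2x2 RGGB cell into one RGB pixel, giving a half-resolution image. Real pipelines interpolate to full resolution and then still have all the other steps to do.

```python
import numpy as np

def demosaic_binned(raw, pattern="RGGB"):
    """Half-resolution demosaic: one RGB pixel per 2x2 Bayer cell.

    raw: 2-D array with an RGGB mosaic. Returns an (H/2, W/2, 3) float array.
    """
    assert pattern == "RGGB", "sketch only handles the RGGB layout"
    r  = raw[0::2, 0::2].astype(np.float32)   # top-left of each cell
    g1 = raw[0::2, 1::2].astype(np.float32)   # top-right green
    g2 = raw[1::2, 0::2].astype(np.float32)   # bottom-left green
    b  = raw[1::2, 1::2].astype(np.float32)   # bottom-right blue
    return np.dstack([r, (g1 + g2) / 2.0, b])

raw = np.arange(16, dtype=np.uint16).reshape(4, 4)
rgb = demosaic_binned(raw)   # shape (2, 2, 3)
```

Note this is the cheapest possible approach; it's why the question of *where* the processing runs (on-sensor ISP, GPU, or CPU) matters so much for anything beyond toy resolutions.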
fastmapper wrote:While there are numerous sensors for imaging and many provide color conversion to YUV, I'm unaware of any sensor that directly measures YUV. There are certainly billions of devices (e.g. cell phones, digital cameras, laptops, webcams, security cameras) that use CMOS image sensors with Bayer arrangements similar to the OV5647.
As above, by YUV sensor I mean one that can read out YUV. They all tend to sense as Bayer, but some have an inbuilt ISP so they can actually produce YUV, or sometimes even JPEG. In doing so they will have AGC and AWB, as it is much easier to collect the stats in the Bayer domain. They tend to max out around 5MPix, as customers start getting fussy about image quality around there.
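To illustrate why stats are easier in the Bayer domain: the colour planes can be averaged straight off the mosaic with strided slicing, no demosaic needed. Here's a grey-world AWB sketch (my own illustration, not any vendor's algorithm) that derives R and B gains from those raw-plane means.

```python
import numpy as np

def grey_world_gains(raw, pattern="RGGB"):
    """Grey-world AWB gains from Bayer-domain stats.

    Averages each colour plane directly off the raw mosaic, then returns
    the (r_gain, b_gain) that would equalise the channel means with green.
    """
    assert pattern == "RGGB", "sketch only handles the RGGB layout"
    r_mean = raw[0::2, 0::2].mean()
    g_mean = (raw[0::2, 1::2].mean() + raw[1::2, 0::2].mean()) / 2.0
    b_mean = raw[1::2, 1::2].mean()
    return g_mean / r_mean, g_mean / b_mean
```

An on-sensor ISP does essentially this kind of accumulation in hardware as the frame streams out, which is why sensors with a built-in ISP get AGC/AWB almost for free.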
https://www.toshiba.com/taec/adinfo/cmos/ TCM3211PB, TCM3212GBA
Aptina MT9M131
http://www.1stvision.com/cameras/IDS/da ... 131_PB.pdf YCbCr, RGB565, RGB555, RGB444, raw Bayer, or processed Bayer.
Or your OV5640 or OV5645 that you quoted above.
I'm dropping off this thread now - Gordon has stated the Foundation's intent, and I guess people will encounter the intricacies of dealing with sensors as and when that happens. For me, I just turn up at the office and hit them every other day.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.