(I'll correct my colleague)
The
Linux CSI2 driver allows you to run any sensor you like and write the raw frames into memory.
The sensor driver and all control of the sensor are up to you. The imx258, imx274, imx319, and imx355 drivers all appear to be merged in
https://git.linuxtv.org/media_tree.git/ ... /media/i2c and may make reasonable examples for you to then adapt. There are a couple of others being pinged around the linux-media mailing list at the moment, but not imx334.
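Once one of those sensor drivers has probed, a quick way to sanity-check raw capture is v4l2-ctl from v4l-utils. The device node, resolution, and pixel format below are assumptions you would adjust for your particular sensor ('pBAA' is the fourcc for 10-bit packed SBGGR Bayer):

```shell
# Assumes the sensor driver has bound and exposed /dev/video0;
# width/height/pixelformat must match what the sensor actually outputs.
v4l2-ctl -d /dev/video0 \
    --set-fmt-video=width=1920,height=1080,pixelformat='pBAA' \
    --stream-mmap --stream-count=10 --stream-to=frames.raw
```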
James is right that there is no access to things like auto exposure, or auto white balance. Again you are responsible for that.
You have got access to the ISP hardware via MMAL and the vc.ril.isp component. That will accept the majority of the formats that are expected (Bayer 8, 10, 12, 14, or 16 bit, YUYV family, RGB 565, 888, and 8888 (with alpha ignored)), and will process as required to produce YUV or RGB output.
Collecting statistics for the image is currently up to you.
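As a rough sketch of what driving vc.ril.isp through MMAL looks like (error handling trimmed, and the Bayer order and resolution are placeholders for whatever your sensor produces):

```c
/* Sketch: push packed Bayer frames through the VideoCore ISP via MMAL.
 * Error handling is minimal; the formats and sizes are illustrative. */
#include <stdio.h>
#include "interface/mmal/mmal.h"

int main(void)
{
    MMAL_COMPONENT_T *isp = NULL;
    MMAL_PORT_T *input, *output;

    /* Create the ISP component by name */
    if (mmal_component_create("vc.ril.isp", &isp) != MMAL_SUCCESS)
        return 1;

    input = isp->input[0];
    output = isp->output[0];

    /* Input: 10-bit packed Bayer, as a raw sensor might deliver it */
    input->format->encoding = MMAL_ENCODING_BAYER_SBGGR10P;
    input->format->es->video.width = 1920;
    input->format->es->video.height = 1080;
    input->format->es->video.crop.width = 1920;
    input->format->es->video.crop.height = 1080;
    if (mmal_port_format_commit(input) != MMAL_SUCCESS)
        return 1;

    /* Output: processed YUV 4:2:0 */
    mmal_format_copy(output->format, input->format);
    output->format->encoding = MMAL_ENCODING_I420;
    if (mmal_port_format_commit(output) != MMAL_SUCCESS)
        return 1;

    mmal_component_enable(isp);

    /* From here you'd enable the ports with callbacks, create buffer
     * pools, and shuttle buffers in and out - see yavta/raspiraw for
     * the full plumbing. */

    mmal_component_destroy(isp);
    return 0;
}
```

The real work is in the buffer exchange, which is why pointing at yavta/raspiraw is more useful than pasting it all here.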
There is an example app for how to pass the data around efficiently in
yavta. It started as one of the standard V4L2 test apps, but has gained the relevant integration into MMAL. I've mainly been testing it with sources producing YUYV or RGB formats, so it doesn't have example white balance handling. I have done that in
raspiraw, so I might see if I can transplant the code easily.
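For reference, a stock yavta invocation looks like this (I'm only showing the standard upstream options here, not guessing at the MMAL additions; the first '#' in the file name is replaced with the frame sequence number):

```shell
# Capture 10 YUYV frames from the first video node and save each to a file.
yavta --capture=10 --format YUYV --size 640x480 --file='frame-#.bin' /dev/video0
```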
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.