So various people have asked about supporting this or that random camera, or HDMI input. Those at Pi Towers have been investigating various options, but none of them have come to fruition yet: some raised several IP issues (something I really don't want to get involved in!), and others are impractical due to the effort involved in tuning the ISP for a new sensor.
I had a realisation that we could add a new MMAL (or IL, if you really have to) component that just reads the data off the CSI-2 bus and dumps it into the provided buffers. After a moderate amount of playing, I've got this working.
Firstly, this should currently be considered alpha code - it's working, but there are quite a few things that are only partially implemented and/or not tested. If people have a chance to play with it and don't find too many major holes in it, then I'll get Dom to release it officially, but still a beta.
Secondly, this is ONLY providing access to the raw data. Opening up the Image Sensor Pipeline (ISP) is NOT an option. There are no real processing options for Bayer data within the GPU, which may limit which sensors are actually useful with this.
Thirdly, all the data that the Foundation has from Omnivision for the OV5647 is under NDA, so I cannot discuss the details there.
So what have we got?
There's a test firmware on my github account (https://github.com/6by9/RPiTest/blob/ma ... tart_x.elf) that adds a new MMAL component ("vc.ril.rawcam"). It has one output port which will spit out the data received from the CSI-2 peripheral (known as Unicam). Please do a "sudo rpi-update" first, as it is built from the same top of tree with my changes. DO NOT RISK A CRITICAL PI SYSTEM WITH THIS FIRMWARE. (Update: the required firmware changes are now in the official release - no need for special firmware.)

There's a modified userland (https://github.com/6by9/userland/tree/rawcam) that includes the new header changes, and a new, very simple, app called raspiraw. (Update: there's now a standalone app at https://github.com/6by9/raspiraw, rather than a complete userland clone.) It saves every 15th frame as rawXXXX.raw and runs for 30 seconds. The saved data is the same format as the raw on the end of the JPEG that you get from "raspistill -raw", though you need to hack dcraw to get it to recognise the data.

The code demonstrates the basic use of the component and includes code to start/stop the OV5647 streaming in the full 5MPix mode. It does not include in the source the GPIO manipulations required to be able to address the sensor, but there is a script "camera_i2c" that uses wiringPi to do that (I started doing it within the app, but that then required running it as root, and I didn't like that). You do need to jump through the hoops to enable /dev/i2c-0 first (see the "Interfacing" forum, but it should just be adding "dtparam=i2c_vc=on" to /boot/config.txt, and "sudo modprobe i2c-dev").
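For reference, enabling /dev/i2c-0 as described above amounts to something like the following (file paths as on current Raspbian; treat this as a sketch, and check the "Interfacing" forum thread for your setup):

```shell
# Enable the GPU-side I2C bus for ARM use by adding the dtparam
# to /boot/config.txt (takes effect after a reboot):
echo "dtparam=i2c_vc=on" | sudo tee -a /boot/config.txt

# Load the i2c-dev module so /dev/i2c-0 appears:
sudo modprobe i2c-dev
```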
The OV5647 register settings in that app are those captured and posted on viewtopic.php?f=43&t=47798&start=25#p748855
- I've made use of zero copy within MMAL, so the buffers are allocated from GPU memory but mapped into ARM virtual address space. That should save a fair chunk of copying, which could be quite a burden when doing 5MPix15 or similar. This requires a quick tweak to /lib/udev/rules.d/10-local-rpi.rules, adding the line: SUBSYSTEM=="vc-sm", GROUP="video", MODE="0660".
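The udev tweak above can be applied with something like this (rule file path as given above; the rule takes effect after a udev reload or reboot):

```shell
# Give the video group access to the VideoCore shared-memory device
# (/dev/vc-sm) that the zero-copy buffers are mapped through:
echo 'SUBSYSTEM=="vc-sm", GROUP="video", MODE="0660"' | \
    sudo tee -a /lib/udev/rules.d/10-local-rpi.rules
sudo udevadm control --reload-rules
```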
What is this not doing?
- This is just reading the raw data out of the sensor. There is no AGC loop running, therefore you've got one fixed exposure time and analogue gain. Not going to be fixed as that is down to the individual sensor/image source.
- The handling of the sensor non-image data path is untested. You will always get a pair of buffers back with the same timestamp; the one with the MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO flag set should be the non-image data. I have not tested this at all as yet, and the length will always come through as the full buffer size at the moment.
- The hardware peripheral has quite a few nifty tricks up its sleeve, such as decompressing DPCM data, or repacking data to an alternate bit depth. This has not been tested, but the relevant enums are there.
- There are a bundle of timing registers and other setup values that the hardware takes. I haven't checked exactly what can and can't be divulged of the Broadcom hardware, so currently they are listed as the parameters timing1-timing5, term1/term2, and cpi_timing1/cpi_timing2. I need to discuss with others whether these can be renamed to something more useful.
- This hasn't been heavily tested. There is a bundle of extra logging on the VC side, so "sudo vcdbg log msg" should give a load of information if people hit problems.
Longer term I do hope to find time to integrate this into the V4L2 soc-camera framework so that people can use a wider variety of sensors. The code for talking to MMAL from the kernel is already there in the bcm2835-v4l2 driver, and the demo code for the new component is linked here, so it doesn't have to be me who does that.
I think that just about covers it all for now. Please do report back if you play with this - hopefully it'll be useful to a fair few people, so I do want to improve it where needed.
Thanks to jbeale for following my daft requests and hooking an I2C analyser to the camera I2C, as that means I'm not breaking NDAs.
- The official CSI-2 spec is only available to MIPI Alliance members, but there is a copy of a spec on http://electronix.ru/forum/index.php?ac ... t&id=67362 which should give the gist of how it works. If you really start playing, then you'll have to understand how the image ID scheme works, image data packing, and the like.
- OV5647 docs - please don't ask us for them. There is a copy floating around on the net which Google will find you, and there are also discussions on the Freescale i.MX6 forums about writing a V4L2 driver for that platform, so information may be gleaned from there (https://community.freescale.com/thread/310786 and similar).
NB 1: As noted further down the thread, my scripts set up the GPIOs correctly for the B+ and Pi 2 B (probably the A+ too). If you are using an old A or B, please read further down for the alternate GPIOs and I2C bus usage.
NB 2: This will NOT work with the new Pi display. The display also uses I2C-0 driven from the GPU, so adding in an ARM client of I2C-0 will cause issues. It may be possible to get the display to be recognised but not enable the touchscreen driver, but I haven't investigated the options there.