tudor wrote:Thank you a lot for your work. It's awesome to have access to the raw stream!
My plans are for a 10bit/pixel x264 encoding (i hope a 320x240 stream would work).
I'm not sure how you expect that to work. I'd expect x264 is going to want YUV data, not Bayer. Most of the spatial compression tricks it will want to pull just won't work on Bayer data as there isn't necessarily a correlation between the colour channels. The RAW10 packing is also very bizarre, so you might be better off repacking to RAW16.
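To illustrate the repacking: CSI-2 RAW10 stores four pixels in five bytes (four bytes of MSBs, then one byte carrying the two LSBs of each pixel). A minimal sketch of unpacking that into 16-bit samples, with `raw10_to_raw16` being a name I've made up for illustration:

```c
#include <stdint.h>
#include <stddef.h>

/* Unpack MIPI CSI-2 RAW10 into 16-bit samples.
   Bytes 0-3 of each 5-byte group hold the 8 MSBs of pixels 0-3;
   byte 4 carries the two LSBs of each pixel, pixel 0 in bits 1:0
   up to pixel 3 in bits 7:6. */
void raw10_to_raw16(const uint8_t *src, uint16_t *dst, size_t n_pixels)
{
    for (size_t i = 0; i < n_pixels; i += 4) {
        const uint8_t *p = src + (i / 4) * 5;
        for (int j = 0; j < 4 && i + j < n_pixels; j++)
            dst[i + j] = (uint16_t)((p[j] << 2) | ((p[4] >> (2 * j)) & 0x3));
    }
}
```

Note that real buffers also have line stride/padding to deal with, which this sketch ignores.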
tudor wrote:The instructions are ok, but it doesn't work for me with the official DSI screen.
It would be a good idea to warn potential users that this can be an issue.
This is why I2C-0 is normally considered to be controlled by the GPU and shouldn't be touched by the ARM! The GPU also uses it for display detection and for the touchscreen, so you've suddenly got two processors trying to access the same peripheral!
There is probably a way around this by specifying in config.txt that the display exists, but not loading the touchscreen driver. Needs a little bit of thought. There is no way to get the touchscreen working with this without doing hardware mods.
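As a starting point (untested in this combination), the firmware does document a `disable_touchscreen` option, so something along these lines in config.txt may let the display come up without the touchscreen being passed through to the ARM:

```ini
# Stop the firmware exposing the touchscreen to the ARM
# (display output should still work; touch will not)
disable_touchscreen=1
```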
tudor wrote:It would be nice to also put the "old" methods so that when it doesn't work there aren't 10 different instructions to try.
(i'm thinking about all the ways of saying that the arm cpu gets to drive the camera i2c bus)
device-tree is nearly the only supported method - several people have expended a lot of effort upstreaming almost all the Pi drivers so that a stock kernel works, and I think those are going DT only.
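For the specific case of the ARM driving the camera I2C bus, the device-tree route is a one-liner in config.txt. The `i2c_vc` dtparam is documented as enabling the I2C interface normally reserved for the VideoCore; treat this as a sketch and check the overlays README for your firmware version:

```ini
# Hand the GPU-side I2C (i2c-0 on most boards) to the ARM
dtparam=i2c_vc=on
```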
As to the variation in which GPIOs and I2C bus to use, that could be coded into a table based on the Revision field from /proc/cpuinfo - there's a small exercise for the reader! The list of GPIOs/buses is all in https://github.com/raspberrypi/document ... t-blob.dts and http://elinux.org/RPi_HardwareHistory#B ... on_History lists the board revisions. I can dig out the mapping if it isn't obvious.
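The shape of that exercise might look like the sketch below. The parsing is real C; the table entries are placeholders I've invented, not the actual wiring - the true mapping has to come from dt-blob.dts and the revision history linked above:

```c
#include <stdio.h>
#include <string.h>

/* Illustrative only: these wiring values are made up.
   Fill the table from dt-blob.dts per board revision. */
struct cam_wiring {
    unsigned rev;      /* value of the Revision line in /proc/cpuinfo */
    int i2c_bus;       /* I2C bus the camera connector is wired to */
    int led_gpio;      /* placeholder GPIO numbers */
    int shutdown_gpio;
};

static const struct cam_wiring wiring_table[] = {
    { 0x0002, 1,  5, 21 },   /* placeholder entry */
    { 0x000e, 0, 32, 41 },   /* placeholder entry */
};

/* Pull the hex Revision field out of a /proc/cpuinfo-style buffer.
   Returns 1 on success, 0 if the field is missing or malformed. */
int parse_revision(const char *cpuinfo, unsigned *rev)
{
    const char *p = strstr(cpuinfo, "Revision");
    if (!p)
        return 0;
    p = strchr(p, ':');
    if (!p)
        return 0;
    return sscanf(p + 1, " %x", rev) == 1;
}
```

In use you'd read /proc/cpuinfo into a buffer, call `parse_revision`, then scan `wiring_table` for a matching `rev` (masking off the overvoltage/warranty bits of newer revision codes first).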
tudor wrote:A primitive way of evaluating auto-exposure would also help; the sensor datasheet is pretty long
But also those register settings don't set the sensor up to run AE - that is normally done on the GPU to provide more flexibility.
I can't provide any significant assistance on the register set due to NDAs, but there is a Linux OV5647 driver floating around the internet which may give you an alternate register set that may include AE running on the sensor.
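If you do want a primitive software AE loop on the ARM, one crude approach (not what the GPU does, and `ae_step` is a name I've made up) is: measure the mean brightness of the unpacked frame, scale the exposure value toward a target, and damp the step to avoid oscillation. Register names and valid exposure ranges are sensor-specific and not reproduced here:

```c
#include <stdint.h>
#include <stddef.h>

/* One step of a crude software AE loop over an unpacked 16-bit
   Bayer frame. Returns the next exposure value to program.
   target_mean and max_exposure are in the same arbitrary units
   as your sensor's exposure register. */
uint32_t ae_step(const uint16_t *frame, size_t n, uint32_t exposure,
                 uint32_t target_mean, uint32_t max_exposure)
{
    uint64_t sum = 0;
    for (size_t i = 0; i < n; i++)
        sum += frame[i];
    uint32_t mean = (uint32_t)(sum / n);
    if (mean == 0)
        mean = 1;                        /* avoid divide-by-zero */

    /* Scale exposure so the next frame's mean approaches the target */
    uint64_t next = (uint64_t)exposure * target_mean / mean;

    /* Damp the step (3/4 old, 1/4 new) to avoid oscillation */
    next = ((uint64_t)exposure * 3 + next) / 4;

    if (next > max_exposure)
        next = max_exposure;
    if (next < 1)
        next = 1;
    return (uint32_t)next;
}
```

A real implementation would also skip the frames still in flight after each register write, since the new exposure takes a frame or two to take effect.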
Do ensure you set the frame height correctly, otherwise you may find you get no frames back as the line counter never hits your frame height before the CSI-2 stream signals end of frame (I can't remember if we trigger just on line count, or on frame end too).
This was a spare time project for me - I'm not employed to work on Pi. It isn't polished, and those out in the community are more than welcome to assist on any of the userspace bits. I will clean up my userland tree and do a Pull Request to get the rawcam stuff into the main repo - most people won't use it, but it just formalises things a bit more. I might even put together some basic docs. This is not a project for novices to play with, so I have made lots of assumptions that those using it have significant Linux experience, and I don't think there is any real way around that.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.