How to use imx334 not imx219 work with pi?

Posted: Wed Oct 10, 2018 5:13 pm
by jumma
I want to use a different camera, the Sony IMX334, with the Pi, but I don't know whether that is possible. If yes, how? Please help!

Re: How to use imx334 not imx219 work with pi?

Posted: Wed Oct 10, 2018 7:36 pm
by jamesh
No, it's not possible without access to the source code for the GPU firmware, and that is closed source.

Re: How to use imx334 not imx219 work with pi?

Posted: Thu Oct 11, 2018 4:41 am
by jumma
Why is it closed? Can I get the source? Our project really needs to change the camera.

Re: How to use imx334 not imx219 work with pi?

Posted: Thu Oct 11, 2018 5:21 am
by fruitoftheloom
jumma wrote:
Thu Oct 11, 2018 4:41 am
Why is it closed? Can I get the source? Our project really needs to change the camera.

The Raspberry Pi SBC and Compute Module are not Open Source Hardware; that is a decision made by Broadcom, and RPT has to abide by the licensing terms.

Re: How to use imx334 not imx219 work with pi?

Posted: Thu Oct 11, 2018 8:27 am
by jamesh
jumma wrote:
Thu Oct 11, 2018 4:41 am
Why is it closed? Can I get the source? Our project really needs to change the camera.
Sorry, the only supported cameras for the CSI port are the OV5647 and the IMX219.

There is an option to drive the CSI port from Linux, and you would be able to get raw Bayer frames from the camera, but at this point in time (IIRC) you would not be able to then pass those through the ISP, so you would have to process the raw data yourself on the ARM. This won't be as fast as using the HW, and writing a software ISP is not a trivial task.
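To give a feel for what "process the raw data yourself" involves, here is a minimal sketch of just one stage of a software ISP: a nearest-neighbour demosaic of a Bayer mosaic. This is purely illustrative, not anything from the firmware; the RGGB layout and 8-bit samples are assumptions (a sensor like the IMX334 actually outputs 10/12-bit raw, and the CFA layout depends on the sensor), and a real demosaic would interpolate rather than collapse each 2x2 cell.

```python
# Nearest-neighbour demosaic for an 8-bit RGGB Bayer mosaic.
# `raw` is a list of rows of samples; each 2x2 cell is laid out as
#   R G
#   G B
# and is collapsed into a single (R, G, B) pixel, halving the resolution.
def demosaic_rggb(raw):
    h, w = len(raw), len(raw[0])
    rgb = [[None] * (w // 2) for _ in range(h // 2)]
    for y in range(0, h, 2):
        for x in range(0, w, 2):
            r = raw[y][x]
            g = (raw[y][x + 1] + raw[y + 1][x]) // 2  # average the two greens
            b = raw[y + 1][x + 1]
            rgb[y // 2][x // 2] = (r, g, b)
    return rgb

# A single 2x2 mosaic cell becomes one RGB pixel:
print(demosaic_rggb([[200, 100], [120, 50]]))  # → [[(200, 110, 50)]]
```

A real software ISP would still need black-level subtraction, lens shading, white balance, proper interpolation, noise reduction and gamma on top of this, which is why it is not a trivial task.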

Re: How to use imx334 not imx219 work with pi?

Posted: Thu Oct 11, 2018 10:54 am
by 6by9
(I'll correct my colleague)
The Linux CSI2 driver allows you to run any sensor you like and write the raw frames into memory.
The sensor driver and all control of the sensor is up to you. imx 258, 274, 319, and 355 drivers all appear to be merged in https://git.linuxtv.org/media_tree.git/ ... /media/i2c and may make reasonable examples for you to then adapt. There are a couple of others being pinged around the linux-media mailing list at the moment, but not imx334.

James is right that there is no access to things like auto exposure, or auto white balance. Again you are responsible for that.

You have got access to the ISP hardware via MMAL and the vc.ril.isp component. That will accept the majority of the formats that are expected (Bayer 8, 10, 12, 14, or 16 bit, YUYV family, RGB 565, 888, and 8888 (with alpha ignored)), and will process as required to produce YUV or RGB output.
Collecting statistics for the image is currently up to you.
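As an illustration of the kind of statistics-driven control you would be writing (this is not the vc.ril.isp API, just a sketch of the classic gray-world algorithm): average each channel over the frame, then derive red and blue gains that pull those averages towards green.

```python
# Gray-world white balance sketch: compute per-channel means over the frame,
# then derive gains that make the red and blue averages match the green one.
def gray_world_gains(pixels):
    n = len(pixels)
    r_avg = sum(p[0] for p in pixels) / n
    g_avg = sum(p[1] for p in pixels) / n
    b_avg = sum(p[2] for p in pixels) / n
    return g_avg / r_avg, 1.0, g_avg / b_avg  # (red, green, blue) gains

def apply_gains(pixels, gains):
    return [tuple(min(255, round(c * g)) for c, g in zip(p, gains))
            for p in pixels]

# A frame with a red cast yields a red gain below 1 and a blue gain above 1:
gains = gray_world_gains([(200, 100, 50), (100, 100, 100)])
```

In practice you would feed the resulting gains back into the ISP (or the sensor's per-channel digital gain) each frame rather than multiplying pixels on the CPU.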

There is an example app in yavta showing how to pass the data around efficiently. It started as one of the standard V4L2 test apps, but has gained the relevant integration into MMAL. I've mainly been testing it with sources producing YUYV or RGB formats, so it hasn't got example white balance. I have done that in raspiraw, so I might see if I can transplant the code easily.

Re: How to use imx334 not imx219 work with pi?

Posted: Thu Oct 11, 2018 11:03 am
by jamesh
6by9 wrote:
Thu Oct 11, 2018 10:54 am
(I'll correct my colleague)
The Linux CSI2 driver allows you to run any sensor you like and write the raw frames into memory.
The sensor driver and all control of the sensor is up to you. imx 258, 274, 319, and 355 drivers all appear to be merged in https://git.linuxtv.org/media_tree.git/ ... /media/i2c and may make reasonable examples for you to then adapt. There are a couple of others being pinged around the linux-media mailing list at the moment, but not imx334.

James is right that there is no access to things like auto exposure, or auto white balance. Again you are responsible for that.

You have got access to the ISP hardware via MMAL and the vc.ril.isp component. That will accept the majority of the formats that are expected (Bayer 8, 10, 12, 14, or 16 bit, YUYV family, RGB 565, 888, and 8888 (with alpha ignored)), and will process as required to produce YUV or RGB output.
Collecting statistics for the image is currently up to you.

There is an example app in yavta showing how to pass the data around efficiently. It started as one of the standard V4L2 test apps, but has gained the relevant integration into MMAL. I've mainly been testing it with sources producing YUYV or RGB formats, so it hasn't got example white balance. I have done that in raspiraw, so I might see if I can transplant the code easily.
Thanks, I suspected my data was a bit out of date! Hadn't realised the ISP component had been merged.

Re: How to use imx334 not imx219 work with pi?

Posted: Thu Oct 11, 2018 11:09 am
by 6by9
jamesh wrote:
Thu Oct 11, 2018 11:03 am
Thanks, I suspected my data was a bit out of date! Hadn't realised the ISP component had been merged.
Almost 18 months ago :shock: Do keep up! Admittedly it's mainly been used as a general purpose resize/format converter for Chromium when doing video_decode.
White balance and digital gain were only added a year ago.