chametl wrote:Yes, with the same lenses, let me see if I can upload an example. Basically what I did was to remove the lens and reverse it,
You reversed it?! That is optically very different from the normal configuration, so any assumptions about pixel vignetting based on the lens are pretty much null and void. If you wear glasses, do they work the same backwards as forwards?
chametl wrote:then light them with a green or yellow LED, and a circle-like pattern in one of those colors appears at the center of the image. As far as I know, the V2 has a shading correction algorithm, but I don't know if it is related to this.
Both sensors have pixel vignetting / lens shading compensation enabled. V1 has a static table; V2 has a dynamic algorithm that tries to compensate for some variation, but it still operates within a configured window, as there is no calibration phase (that would require an external setup).
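To illustrate what a static shading table does (the V1-style approach), here is a minimal sketch. All names here are hypothetical, and a real ISP would interpolate the coarse gain grid bilinearly and work per Bayer channel; this toy version just upsamples nearest-neighbour and multiplies:

```python
import numpy as np

def apply_shading_table(raw, gain_table):
    """Apply a static lens-shading table: upsample the coarse
    per-cell gain grid to full resolution and multiply.
    (Hypothetical sketch; a real ISP interpolates bilinearly
    and applies separate tables per Bayer colour channel.)"""
    h, w = raw.shape
    gh, gw = gain_table.shape
    # Nearest-neighbour upsample of the coarse gain grid
    ys = np.arange(h) * gh // h
    xs = np.arange(w) * gw // w
    gains = gain_table[np.ix_(ys, xs)]
    # Clip to the 10-bit range typical of raw sensor data
    return np.clip(raw * gains, 0, 1023)

# Toy table: gains rise toward the corners to counter vignetting
table = np.array([[1.4, 1.2, 1.4],
                  [1.2, 1.0, 1.2],
                  [1.4, 1.2, 1.4]])
frame = np.full((6, 6), 512)
corrected = apply_shading_table(frame, table)
```

The point is that the table is fixed at tuning time: if the optics change (e.g. a reversed lens), the baked-in gains no longer match the real falloff, which is one way a colored circular pattern can appear.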
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
I'm not interested in doing contracts for bespoke functionality - please don't ask.