I am experimenting with a 360° camera head based on Raspberry Pi cameras. I have an issue where the color from the camera sensors changes depending on the lighting in the scene, even though everything is set to fixed values. We apply the MMAL settings from a custom application, so there is no exact command line to show, but the relevant settings are equivalent to:
raspivid -v -t 0 -w 1440 -h 1080 -ex off -awb off -awbg 1.5,1.4 -fps 40 -ss 24830 -b 10000000
(verified by A/B testing; the output looks exactly the same)
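For readers more familiar with the Python API than with raspivid or raw MMAL, here is a hedged sketch of the same fixed-settings configuration using the picamera library. This is an equivalent I wrote for illustration, not our actual MMAL application code; the ISO value is an assumption, since raspivid does not set one explicitly here.

```python
from time import sleep
from picamera import PiCamera

# Match the raspivid invocation: 1440x1080 @ 40 fps
camera = PiCamera(resolution=(1440, 1080), framerate=40)

# Fixed shutter speed (-ss 24830, microseconds)
camera.shutter_speed = 24830

# ISO is an assumption; raspivid above leaves it unset
camera.iso = 100

# Let the sensor settle, then freeze the analog/digital gains
# (exposure_mode='off' is the picamera equivalent of -ex off)
sleep(2)
camera.exposure_mode = 'off'

# Disable AWB and pin the gains (-awb off -awbg 1.5,1.4)
camera.awb_mode = 'off'
camera.awb_gains = (1.5, 1.4)
```

Note that in picamera the usual recipe is to set ISO first, wait for the gains to settle, and only then switch `exposure_mode` to `'off'`, because that is the point at which the analog and digital gains are frozen.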
Take a look at these three images, illustrating our situation:
What you see in these images is a composite (stitch) of the images from three cameras; the blending between them is visible as a "Y" shape. There is no rotation or other change in the relative position of each camera within the composited image between the three shots. What changes is the rotation of the camera head (the three cameras are fixed to the head). The black/white line is a wall corner, which is vertical in the real world, and there are two white A3 papers taped to the wall.
So the problem we see is this: as we move a camera and expose it to a differently colored environment, the color output of the camera changes, even though we have set everything to fixed values precisely so that it disregards such input.
Sometimes issues like this happen because our own vision system is not reliable, so we measured the white paper directly. For one of the cameras, the paper has the following colors in the three images:
So there really does seem to be a change in the color coming from the camera.
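For reference, the paper measurements were taken by averaging the pixels over a rectangular patch of the paper in each stitched frame. A minimal sketch of that measurement (the region coordinates and the synthetic frame are hypothetical, for illustration only):

```python
import numpy as np

def mean_patch_color(image, y0, y1, x0, x1):
    """Return the mean (R, G, B) over a rectangular patch of an HxWx3 image."""
    patch = image[y0:y1, x0:x1].reshape(-1, 3)
    return tuple(patch.mean(axis=0))

# Synthetic example frame: a uniform gray stand-in for the white paper
frame = np.full((1080, 1440, 3), 200, dtype=np.uint8)
print(mean_patch_color(frame, 100, 200, 100, 200))  # (200.0, 200.0, 200.0)
```

Comparing this mean across the three frames for the same physical patch of paper is what shows the color shift numerically rather than by eye.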
Is there something I'm not setting as fixed that can be set as fixed?