Hi, I'm not sure if this is the right place to post this question, but I'm working on a project that uses the GPU for time-sensitive IR communication over GPIO on headless Pi Zeros. I chose the GPU specifically to keep the Pi's Linux scheduler from interfering with the IR data flow, since the protocol needs reliable, accurate timing at the 100-200 μs level. However, the project may eventually also be used to take stills with a Pi camera.

My question: does the camera module require GPU access just to capture and save stills? If so, would saving a still consume enough QPUs to disturb the timing of the IR emitter and receiver?
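For context on why I'm avoiding the scheduler: a quick user-space experiment (my own sketch, not part of the project code; assumes Python 3 with `time.monotonic_ns`) shows how far a requested ~200 μs sleep can drift under a normal non-realtime kernel, which is the kind of jitter the GPU approach is meant to sidestep:

```python
import time

# Target interval: 200 us, the upper end of my timing budget.
TARGET_NS = 200_000

# Request a 200 us sleep many times and record how far each one
# overshoots the target (sleep never returns early, so errors >= 0).
errors = []
for _ in range(1000):
    start = time.monotonic_ns()
    time.sleep(TARGET_NS / 1e9)
    elapsed = time.monotonic_ns() - start
    errors.append(elapsed - TARGET_NS)

# The worst-case overshoot is what matters for bit timing, not the mean.
worst_us = max(errors) / 1000
print(f"worst overshoot: {worst_us:.1f} us over {len(errors)} sleeps")
```

On my understanding, even without any GPU load this typically shows overshoots well beyond the budget on a stock kernel, which is why I'd like to know whether camera still-capture touches the QPUs at all.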