I made my first try today at a hackathon. I had a few parts to try out and lots of brilliant minds with me.
My goal was to demo the PiP with the Arducam and a desktop captured one way or another to simulate the target setup (desktop shared via HDMI2Whatever, with the speaker's video inserted in the top-right corner of the desktop).
I used v4l2rtspserver to stream the camera with great success; the Pi was not doing much at all. As for the sound, we used several USB sound cards with a wireless (HF) mic. The sound was synchronised with the image, which was quite a nice surprise. Unfortunately, the sound cards gave a terrible echo, as if I were talking from the grave. We had better results with the Jabra 510, but it is so pricey and heavy that I would like to drop it from the "kit". We also tried a USB mic, which was perfect, except that its cable was only 1 m long...
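Back on the video side: for the record, starting the camera stream was essentially a one-liner with v4l2rtspserver. The resolution and framerate below are illustrative, not necessarily the exact values we used:

    # Serve the Arducam (assumed to be /dev/video0) over RTSP on the default port 8554
    v4l2rtspserver -W 1280 -H 720 -F 25 /dev/video0
    # The stream is then available at rtsp://<pi-address>:8554/unicast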
Then the HDMI2Ethernet device did not keep its promises, so we decided to use ffmpeg to capture the desktop (the PPT slides) and send it to the Pi over the network. Not so plug-and-play after all...
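On a Linux presenter PC the capture-and-send side would look roughly like this (using x11grab; on Windows ffmpeg's gdigrab input plays the same role). The address and port are placeholders:

    # Grab the desktop (X11 display :0.0) and push it to the Pi as MPEG-TS over UDP
    ffmpeg -f x11grab -framerate 25 -video_size 1920x1080 -i :0.0 \
           -c:v libx264 -preset ultrafast -tune zerolatency \
           -f mpegts udp://<pi-address>:5000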
The Pi managed to do the PiP with the desktop stream and the camera stream, though I had some trouble using /dev/video0 directly together with the desktop stream (I can't remember the exact error at the moment).
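The PiP itself is just an ffmpeg overlay. On the Pi it was something along these lines (port, sizes and encoder are assumptions, and h264_omx needs an ffmpeg build with the OMX encoder):

    # Receive the desktop over UDP, take the camera from /dev/video0 and overlay the
    # camera, scaled down, in the top-right corner of the desktop
    ffmpeg -i udp://0.0.0.0:5000 \
           -f v4l2 -i /dev/video0 \
           -filter_complex "[1:v]scale=320:-2[cam];[0:v][cam]overlay=W-w-10:10" \
           -c:v h264_omx -b:v 2M \
           -f mpegts udp://<viewer-address>:5002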
I made a try with v4l2compress_omx to get a compressed stream, in the hope of making the PiP easier. The tool didn't report any error, but as a sanity check (before doing the PiP with ffmpeg) I launched v4l2rtspserver on the new video device and never got any picture.
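From memory the attempt looked like the sketch below, writing the hardware-encoded stream to a v4l2loopback device; the exact options may differ, so check the v4l2tools README:

    # Create a loopback device to receive the compressed stream (device number is arbitrary)
    sudo modprobe v4l2loopback video_nr=10
    # Read raw frames from the camera, encode them with the OMX hardware encoder and
    # write the H.264 stream to the loopback device
    v4l2compress_omx /dev/video0 /dev/video10
    # The failing check: serving the compressed device never produced a picture for us
    v4l2rtspserver /dev/video10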
So we went back to the first solution: v4l2rtspserver streaming the camera from the Pi, with the PiP done on the presenter's PC with ffmpeg. The resulting video was then seen as a camera by the browser, and we were able to stream the video to Circuit (which can then broadcast it to everybody and save it for later too).
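Assuming a Linux presenter PC, the trick that makes the browser see the composited video as a camera is a v4l2loopback device. A minimal sketch (URL, display and sizes are placeholders):

    # Load a loopback device that the browser will list as a webcam
    sudo modprobe v4l2loopback video_nr=20 card_label="PiP cam"
    # Pull the camera from the Pi over RTSP, grab the local desktop, overlay the camera
    # in the top-right corner and write the result to the loopback device
    ffmpeg -rtsp_transport tcp -i rtsp://<pi-address>:8554/unicast \
           -f x11grab -framerate 25 -video_size 1920x1080 -i :0.0 \
           -filter_complex "[0:v]scale=320:-2[cam];[1:v][cam]overlay=W-w-10:10" \
           -pix_fmt yuv420p -f v4l2 /dev/video20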
We wanted to use the circuit live cam bot on another Pi, but we did not manage to get it working.
For the demo at the end, we went fully wireless with a battery and WiFi, and filmed the audience with the Pi while demoing. It was not that bad.
The experiment is not fully satisfactory, but we now have a proof of concept, new ideas and leads to continue with.
If I can get a little bit of budget for this, I'd love to get an audio HAT and the HDMI2CSI converter. That would mean two Raspberry Pis, but why not... Or use that kind of adapter card to have a camera and an HDMI2CSI converter connected at the same time?
Would the performance degrade?
By the way, what would be the advantage of using a Compute Module instead of a Pi 3B+?
Thanks.