I'm interested in using a Pi Zero's GPU to encode video from the Raspberry Pi camera, then shuffle the encoded data off over a 4G modem (USB or UART). It would make a useful device for streaming video from a vehicle to a remote PC, for experimenting with machine learning etc.
I'm sure this is achievable using Raspbian (although I can't currently get my camera module to respond..), but it's appealing to do it with potentially a few hundred lines of code and a few C libraries rather than a full Linux setup, and the result could well be more robust.
There are a few bare-metal environments I've come across for the Pi (Circle, Ultibo, RTEMS etc.). I'd appreciate any input on the most promising approach to this project. I have experience with embedded development on STM32s (C with an RTOS etc.), but no bare-metal experience with the RPi.
I understand there are C libraries to interface with the encoder hardware (OMX?). The TCP/network stack could probably be offloaded to the modem itself (a socket connection to a remote server set up with AT commands), although I'm not sure how heavy-duty such built-in implementations are.