I've been hacking at the "hello_jpeg" example included in the Raspberry Pi firmware and am struggling to turn it into an "image_encode" example. I'm not an OpenMAX IL expert, so it's been a miserable struggle.
Can somebody help me with a simple C example that uses the IL library to load an image file, modify it, and re-encode it as JPEG?
The motivation for this example is that I've been using a cheapo USB webcam with my Raspberry Pi, but the framerate is frankly miserable. I want to make a portable, WiFi-enabled webcam, like http://jeremyblythe.blogspot.com/2012/0 ... ebcam.html
I poked around the source code for http://www.lavrsen.dk/foswiki/bin/view/Motion/WebHome. Motion converts YUYV-formatted frames to MJPEG, but does so on the general-purpose CPU. mjpg-streamer can pull MJPEG straight from the camera, but lacks Motion's ability to draw a timestamp on the frames.
I'm not sure which webcam software I'll ultimately use, but I'd love to have timestamps on the frames, just not at the expense of a poor framerate. Should I even expect much better frame rates by using OpenMAX IL's image_encode component to do the YUYV-to-JPEG conversion off the general-purpose CPU?
Is it even possible to use the OpenMAX IL JPEG encoder "component" (to use the framework's vernacular) this way? https://github.com/xbmc/xbmc/blob/maste ... XImage.cpp
seems to imply the "image_encode" component is usable on the Raspberry Pi, but I also had a terrible time adapting that code to my use case.
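For reference, here's a bare-bones sketch of the handle setup I've been attempting, using the raw IL API rather than the ilclient helper from hello_pi. The component name "OMX.broadcom.image_encode" and the port numbers 340/341 are assumptions I gleaned from the /opt/vc headers and other forum posts; buffer allocation, the event callbacks, and the state machine are all omitted, so this is nowhere near complete.

```c
// Sketch of acquiring the Broadcom JPEG encoder via OpenMAX IL on the Pi.
// ASSUMPTIONS: component name "OMX.broadcom.image_encode" and output port
// 341 come from forum posts and /opt/vc headers, not from documentation.
// Error handling, callbacks, buffers, and state transitions are omitted.
#include <stdio.h>
#include <string.h>
#include <bcm_host.h>
#include <IL/OMX_Core.h>
#include <IL/OMX_Component.h>

int main(void) {
    bcm_host_init();                 // required on the Pi before any IL call
    OMX_Init();

    OMX_HANDLETYPE encoder;
    OMX_CALLBACKTYPE callbacks;
    memset(&callbacks, 0, sizeof(callbacks));  // real code needs EventHandler,
                                               // EmptyBufferDone, FillBufferDone

    if (OMX_GetHandle(&encoder, "OMX.broadcom.image_encode",
                      NULL, &callbacks) != OMX_ErrorNone) {
        fprintf(stderr, "could not get image_encode handle\n");
        return 1;
    }

    // Ask the output port (341, by my reading) to produce JPEG.
    OMX_IMAGE_PARAM_PORTFORMATTYPE fmt;
    memset(&fmt, 0, sizeof(fmt));
    fmt.nSize = sizeof(fmt);
    fmt.nVersion.s.nVersionMajor = OMX_VERSION_MAJOR;
    fmt.nVersion.s.nVersionMinor = OMX_VERSION_MINOR;
    fmt.nPortIndex = 341;
    fmt.eCompressionFormat = OMX_IMAGE_CodingJPEG;
    OMX_SetParameter(encoder, OMX_IndexParamImagePortFormat, &fmt);

    // ... configure the input port (340) for the raw frame format, allocate
    // buffers, move the component to Executing, then feed frames with
    // OMX_EmptyThisBuffer and collect JPEGs with OMX_FillThisBuffer ...

    OMX_FreeHandle(encoder);
    OMX_Deinit();
    return 0;
}
```

I've been building it roughly like this: gcc -I/opt/vc/include -I/opt/vc/include/IL -L/opt/vc/lib -lopenmaxil -lbcm_host. If the port numbers or component name above are wrong, I'd love a correction.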
I'd appreciate any thoughts, concerns, or heck, even words of encouragement.