peepo
Posts: 305
Joined: Sun Oct 21, 2012 9:36 am

New features for raspiyuv

Mon Jun 10, 2013 3:14 pm

please could raspiyuv match the raspistill features?

in particular timelapse support.
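for example, being able to do the raw YUV equivalent of something like

raspistill -t 30000 -tl 2000 -o img%04d.jpg

(that's just the existing raspistill timelapse usage as an illustration; I'm assuming raspiyuv would grow the same -t/-tl options and simply write raw frames instead of JPEGs.)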

jamesh, is it wrong to imagine that YUV data straight off the sensor should be able to be dumped or piped at more than 2fps, and perhaps even 90fps for some formats?

jamesh
Raspberry Pi Engineer & Forum Moderator
Posts: 24617
Joined: Sat Jul 30, 2011 7:41 pm

Re: New features for raspiyuv

Mon Jun 10, 2013 3:19 pm

I do not think you will get high frame rates out of YUV; in fact they will be lower than with JPEG, since there is more data to move per frame.
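As a rough illustration of the amount of data involved: an uncompressed full-resolution I420 frame is 2592 x 1944 x 1.5 bytes, which is a bit over 7MB, against a few hundred KB for a typical JPEG, so there is roughly an order of magnitude more data to shift per frame.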

At the moment we do not have any modes above 30fps working, but I do not expect any stills modes to get above 2-4fps at full resolution. With lower resolutions you will get a higher frame rate, but I would not have thought above 15fps even at low resolution - but I've never tried it, so that is a guess. I'm sure people will test it at some point if they haven't already.

High frame rates are what video mode is for. 1080p is pretty decent resolution for most high frame rate purposes.
Principal Software Engineer at Raspberry Pi (Trading) Ltd.
Contrary to popular belief, humorous signatures are allowed. Here's an example...
“I own the world’s worst thesaurus. Not only is it awful, it’s awful.”

dozencrows
Posts: 172
Joined: Sat Aug 04, 2012 6:02 pm

Re: New features for raspiyuv

Sat Jun 15, 2013 6:06 pm

I have run into a problem with repeated YUV still capture that I suspect might be related to this post...

I've been trying to add an option to capture YUV directly from the still port in my fork of motion (https://github.com/dozencrows/motion/bl ... /mmalcam.c), but have found that it only ever captures a single frame. So my question is: could there be some limitation on raw YUV still capture that prevents repeated captures without resetting the camera? I hope not, and that it's just a mistake on my part... ;)

The code sets up the camera component, then configures and enables the still port for YUV format (I420); no preview or encoder components are created. I have a callback set on the still port that pushes the buffer it receives onto an MMAL queue; the main application thread waits on this queue for each full image buffer, then copies the result into an image that the rest of motion uses for its processing. It then returns the original buffer to its pool, sends a buffer from the pool back to the still port, and retriggers capture (after ensuring at least 100ms has passed since the last capture trigger).
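To make the flow concrete, here is a heavily condensed sketch of the pattern I'm using (not the exact code from my fork - error checking, camera parameter setup and the motion-specific copying are stripped out, the 1024x768 resolution is just an example, and names like still_cb are placeholders):

#include <stdio.h>
#include <unistd.h>

#include "bcm_host.h"
#include "interface/mmal/mmal.h"
#include "interface/mmal/util/mmal_default_components.h"
#include "interface/mmal/util/mmal_util.h"
#include "interface/mmal/util/mmal_util_params.h"

#define STILL_PORT 2   /* camera output 2 is the still capture port */

/* Still port callback: just hand the filled buffer over to the main thread */
static void still_cb(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer)
{
   MMAL_QUEUE_T *queue = (MMAL_QUEUE_T *)port->userdata;
   mmal_queue_put(queue, buffer);
}

int main(void)
{
   MMAL_COMPONENT_T *camera;
   MMAL_PORT_T *still_port;
   MMAL_POOL_T *pool;
   MMAL_QUEUE_T *queue;
   unsigned int i;
   int frame;

   bcm_host_init();

   mmal_component_create(MMAL_COMPONENT_DEFAULT_CAMERA, &camera);
   still_port = camera->output[STILL_PORT];

   /* Ask the still port for raw I420; 1024x768 keeps width/height aligned to 32/16 */
   MMAL_ES_FORMAT_T *format = still_port->format;
   format->encoding = MMAL_ENCODING_I420;
   format->es->video.width = 1024;
   format->es->video.height = 768;
   format->es->video.crop.x = 0;
   format->es->video.crop.y = 0;
   format->es->video.crop.width = 1024;
   format->es->video.crop.height = 768;
   mmal_port_format_commit(still_port);

   still_port->buffer_size = still_port->buffer_size_recommended;
   still_port->buffer_num = still_port->buffer_num_recommended;

   mmal_component_enable(camera);

   /* Pool owned by the still port; released headers go back onto pool->queue */
   pool = mmal_port_pool_create(still_port, still_port->buffer_num, still_port->buffer_size);

   queue = mmal_queue_create();
   still_port->userdata = (struct MMAL_PORT_USERDATA_T *)queue;
   mmal_port_enable(still_port, still_cb);

   /* Give the port all its buffers before the first capture */
   for (i = 0; i < still_port->buffer_num; i++)
      mmal_port_send_buffer(still_port, mmal_queue_get(pool->queue));

   for (frame = 0; frame < 10; frame++)
   {
      /* Trigger a single still capture */
      mmal_port_parameter_set_boolean(still_port, MMAL_PARAMETER_CAPTURE, MMAL_TRUE);

      /* Block until the callback delivers a filled YUV buffer */
      MMAL_BUFFER_HEADER_T *buffer = mmal_queue_wait(queue);
      mmal_buffer_header_mem_lock(buffer);
      printf("frame %d: %u bytes of I420\n", frame, (unsigned)buffer->length);
      /* ...this is where the data would be copied into motion's image... */
      mmal_buffer_header_mem_unlock(buffer);

      /* Recycle: release the header to the pool, then hand a buffer back to the port */
      mmal_buffer_header_release(buffer);
      MMAL_BUFFER_HEADER_T *next = mmal_queue_get(pool->queue);
      if (next)
         mmal_port_send_buffer(still_port, next);

      usleep(200 * 1000);   /* gap between capture triggers */
   }

   return 0;
}

The pool is created against the still port itself, so releasing a header should put it back on pool->queue ready to be handed to the port again for the next capture.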

All that happens is that the first buffer comes through the callback into the queue and on to the main application thread; after that, there are no more callbacks or buffers. I've tried increasing the delay between captures to 5 seconds, which made no difference. I've studied the camera setup and timelapse code in raspistill and can't see anything I've missed. And the basic model of camera callback -> queue -> main thread works fine with the video port.

Any thoughts or advice would be gratefully received!
