HermannSW
Posts: 829
Joined: Fri Jul 22, 2016 9:09 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Apr 20, 2018 8:10 am

I will soon start working on providing high framerate options to raspiraw for the v2 camera (besides --fps).

As a first quick test I wanted to follow up on the 424fps recordings (640x480, but only the top 215 lines got updated) posted earlier in this thread:
viewtopic.php?t=109137&start=425#p1274431

I increased the fps up to 840fps, at which point I noticed that the recording speed dropped to 400-something fps.

So I went back to 800fps recording, and frame skips were in the 1% range. I did not have the enhanced analysis program from the referenced posting with me; I will use it on the Pi 2B when back at home. This is a frame processed with dcraw and converted to .png:
Image

As I said, only the top 100 lines get updated, and I need brighter light on the scene (800fps means a frame time of only 1.25ms). Post processing can easily extract the top 640x100 area of each frame for final video creation.

But this experiment gives a very exciting number:
100 lines (of length 640) at 800fps!

My favorite raspiraw capturing tool up to now is "640x128_s" for the v1 camera. That captures 64 lines out of 128, doubles the lines to 640x128 frames in post processing, and does so at 665fps.

By the numbers above, the same format at 800*(100/64)=1250fps seems to be possible for the v2 camera ...

P.S:
I just noticed that the black lines below, starting at line 101, are from previous experiments with less than 800fps:
  • 111 lines at 740 fps
  • 134 lines at 640 fps
The frame shows my Pi 3B+; you can (barely) see the camera cable and the cable from the Raspberry power supply at the top of the frame.

P.P.S:
That framerate range is consistent with the v2 camera being able to capture 640x480 at 180fps (although that number is with raspivid and the GPU):

Code: Select all

$ echo "180*(480/64)" | bc -ql
1350.00000000000000000000
$ 

The v2 camera can do 640x480 at 240fps with raspiraw, which would point to an even higher framerate:

Code: Select all

$ echo "240*(480/64)" | bc -ql
1800.00000000000000000000
$ 
bookmark list: https://stamm-wilbrandt.de/en/Raspberry_camera.html

https://github.com/Hermann-SW/fork-raspiraw      https://github.com/Hermann-SW/userland
https://github.com/Hermann-SW/wireless-control-Eachine-E52-drone      https://twitter.com/HermannSW

HermannSW
Posts: 829
Joined: Fri Jul 22, 2016 9:09 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Wed Apr 25, 2018 7:05 pm

OK, I have really started on the v2 camera raspiraw high framerate work.
I merged 6by9's raspiraw master branch and will commit/push once all the changes needed for a pull request are available:
https://github.com/Hermann-SW/raspiraw/ ... 7d73babb7e

Yesterday evening I made the "--height" option of raspiraw, which so far worked for the v1 camera only, work for both v1 and v2. A first test showed that I was able to record at 968fps!
https://twitter.com/HermannSW/status/988873779978915840

Today I found and fixed some issues. The biggest surprise was that there is an upper limit of 1007fps for the v2 camera chip (like the 200fps cap of raspivid). The "--fps" option of raspiraw is always the requested framerate, and frame delta analysis reveals the effective framerate. No matter what resolution and framerate were requested, the effective framerate was 1007fps at maximum. I will accept that for now ;-)

Next I took videos starting with 640x128 at 1007fps, going down with "--height". A black line always showed up after 75 rows. I placed a screwdriver in the scene and took another recording. Only the first 75 rows showed the screwdriver, confirming that only those 75 lines get updated.

Next I took a 1.2s video at 1007fps and quickly moved the screwdriver left and right above some Lego pieces. This is a single 640x75 frame processed with dcraw and converted to .png:
Image
You can see the boundary on the screwdriver between the metallic part and the black front part:
Image

Before going into more details, here is the video, extracted as described further below:
(played at 30fps, 33.5 times slower than real time)
https://www.youtube.com/watch?v=z84FEsL ... e=youtu.be
Image


This is an execution of the new 640x75 tool:

Code: Select all

pi@raspberrypi:~/raspiraw/t $ ./640x75 1200
removing /dev/shm/out.*.raw
capturing frames for 1200ms with 1000fps requested
1190 frames were captured at 1007fps
frame delta time[us] distribution
      1 
      2 992
    961 993
    212 994
     11 1986
      2 1987
      1 2979
after skip frame indices (middle column)
1986,103,3357773787
1986,204,3357875091
1986,406,3358076706
1987,643,3358313083
1986,656,3358326987
1986,665,3358336919
1986,674,3358346851
1986,683,3358356782
1986,692,3358366714
1986,704,3358379625
1986,710,3358386578
2979,735,3358413393
1987,802,3358480929
1986,1106,3358783847
1% frame skips
pi@raspberrypi:~/raspiraw/t $

Only 14 frame skips in 1190 captured frames is about 1%. The average frame time of the video, determined from the first and last timestamps, is 1005.7μs (about 994fps), while the most frequently observed delta of 993μs between frames (961 of 1190) corresponds to 1007.05fps:

Code: Select all

pi@raspberrypi:~/raspiraw/t $ head -1 tstamps.csv 
,1,3357671490
pi@raspberrypi:~/raspiraw/t $ tail -1 tstamps.csv 
993,1190,3358867274
pi@raspberrypi:~/raspiraw/t $ echo "(3358867274-3357671490)/(1190-1)" | bc -ql
1005.70563498738435660218
pi@raspberrypi:~/raspiraw/t $
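
For reference, a minimal C sketch (my own, not the analysis tool used in this thread) that computes the same numbers from tstamps.csv, assuming the format shown above (delta_us,frame_index,timestamp_us per line, with an empty delta field on the first line):

Code: Select all

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    FILE *f = fopen("tstamps.csv", "r");
    if (!f) { perror("tstamps.csv"); return 1; }

    char line[128];
    long first_ts = -1, last_ts = 0, first_idx = 0, last_idx = 0;
    long delta[1024], count[1024];            /* crude unsorted histogram */
    int ndeltas = 0;

    while (fgets(line, sizeof(line), f)) {
        long d = -1, idx = 0, ts = 0;
        if (line[0] == ',')                   /* first line has no delta */
            sscanf(line, ",%ld,%ld", &idx, &ts);
        else
            sscanf(line, "%ld,%ld,%ld", &d, &idx, &ts);
        if (first_ts < 0) { first_ts = ts; first_idx = idx; }
        last_ts = ts; last_idx = idx;
        if (d >= 0 && ndeltas < 1024) {
            int i;
            for (i = 0; i < ndeltas && delta[i] != d; ++i) ;
            if (i == ndeltas) { delta[i] = d; count[i] = 0; ++ndeltas; }
            ++count[i];
        }
    }
    fclose(f);

    printf("frame delta time[us] distribution\n");
    for (int i = 0; i < ndeltas; ++i)
        printf("%7ld %ld\n", count[i], delta[i]);
    printf("average frame time: %.2f us\n",
           (last_ts - first_ts) / (double)(last_idx - first_idx));
    return 0;
}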

I cut frames 300-699 from the recording and created .ogg and .anim.gif videos (playing at 30fps, 33.5 times slower than real time) from the recorded frames with the raw2ogg2anim tool:

Code: Select all

pi@raspberrypi:~/raspiraw/t $ raw2ogg2anim t 300 699 30
removing old auxiliary files
copying /dev/shm/out.????.raw files
699     
dcraw each .raw file (to .ppm)
out.0699.raw     
.ppm -> .ppm.d
out.0699.ppm     
.ppm.d -> .ppm.d.png
out.0699.ppm.d     
now creating t.ogg
Setting pipeline to PAUSED ...
Pipeline is PREROLLING ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
Got EOS from element "pipeline0".
Execution ended after 0:00:15.442299713
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
now creating t.anim.gif
[theora @ 0x1e967f0] 7 bits left in packet 82
[ogg @ 0x1e955c0] Broken file, keyframe not correctly marked.
[theora @ 0x1f3a1b0] 7 bits left in packet 82
[ogg @ 0x1e955c0] Broken file, keyframe not correctly marked.
    Last message repeated 3 times
[theora @ 0x1975880] 7 bits left in packet 82
[ogg @ 0x1974700] Broken file, keyframe not correctly marked.
[theora @ 0x1a28510] 7 bits left in packet 82
[ogg @ 0x1974700] Broken file, keyframe not correctly marked.
    Last message repeated 3 times
done
pi@raspberrypi:~/raspiraw/t $

Find the animated .gif and the youtube link above.
7 of the 14 frame skips are in the 300-699 range.
The average frame time for the cut-out video, computed the same way, is 1010.6μs:

Code: Select all

pi@raspberrypi:~/raspiraw/t $ grep ,300, tstamps.csv 
993,300,3357970436
pi@raspberrypi:~/raspiraw/t $ grep ,699, tstamps.csv 
994,699,3358373667
pi@raspberrypi:~/raspiraw/t $ echo "(3358373667-3357970436)/(699-300)" | bc -ql
1010.60401002506265664160
pi@raspberrypi:~/raspiraw/t $

The next step is to make the vertical increment raspiraw option work for the v2 camera as well. That will allow for a 640x150_s tool, recording 75 lines (every other line) at 1007fps, resulting in 640x150 video after line doubling in post processing!
bookmark list: https://stamm-wilbrandt.de/en/Raspberry_camera.html

https://github.com/Hermann-SW/fork-raspiraw      https://github.com/Hermann-SW/userland
https://github.com/Hermann-SW/wireless-control-Eachine-E52-drone      https://twitter.com/HermannSW

HermannSW
Posts: 829
Joined: Fri Jul 22, 2016 9:09 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Sun Apr 29, 2018 7:55 pm

OK, I added the "--voinc" option for the v2 camera (imx219) only.

I took a 1000fps video at 640x75; this is one frame:
Image


Now the same scene as a 640x150_s frame:
Image


OK, the frame height doubled, but the images are really pale (a different lens than in the last posting).
I lit this scene with a 1000lm light as always.


Today I was able to test a new 5000lm light with an LED driver that has no powerline issues:
viewtopic.php?f=66&t=209041&p=1309474#p1309474

I used the 5000lm light on the same scene; a much better 640x75 frame:
Image


And here is the corresponding 640x150_s frame:
Image


This frame has double the height, but does not look as smooth as the previous one.
For both frames only 75 lines were recorded, one with "--voinc 01" and the other with "--voinc 03".
In post processing (after dcraw) the lines were doubled for 640x150_s.
Obviously quality is worse for the 640x150_s frame, but hey, for a 1000fps frame with 150 lines that is not bad at all!


I am not completely done, so I will not commit/push right now.
If you want to play with 1000fps, you need to sync the raspiraw branch and apply the diff as a patch.
The two new tools, 640x75 and 640x150_s, are contained in the attached .zip as well.


Now there are three kinds of high framerate options.

Working for v1 camera only:

Code: Select all

        if (cfg.vinc >= 0)
        {
                if (!strcmp(sensor->name, "ov5647"))
                        /* OV5647 vertical subsample increment register */
                        modReg(sensor_mode, 0x3815, 0, 7, cfg.vinc, EQUAL);
        }

Working for v2 camera only:

Code: Select all

        if (cfg.voinc >= 0)
        {
                if (!strcmp(sensor->name, "imx219"))
                        /* IMX219 Y_ODD_INC register (vertical odd-line increment) */
                        modReg(sensor_mode, 0x0171, 0, 2, cfg.voinc, EQUAL);
        }

Working for v1 and v2 camera:

Code: Select all

        if (cfg.height > 0)
        {
                sensor_mode->height = cfg.height;
                modReg(sensor_mode, sensor->yos_reg, 0, 3, cfg.height >> 8, EQUAL);
                modReg(sensor_mode, sensor->yos_reg, 0, 7, cfg.height & 0xFF, EQUAL);
        }
Attachments
1000fps.1.zip
(2.82 KiB) Downloaded 43 times
bookmark list: https://stamm-wilbrandt.de/en/Raspberry_camera.html

https://github.com/Hermann-SW/fork-raspiraw      https://github.com/Hermann-SW/userland
https://github.com/Hermann-SW/wireless-control-Eachine-E52-drone      https://twitter.com/HermannSW

HermannSW
Posts: 829
Joined: Fri Jul 22, 2016 9:09 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Apr 30, 2018 9:15 pm

I took a 640x150_s video at 1000fps again.

This is a 75-line frame of every other line, after applying dcraw to the raw Bayer file:
Image


This is the same frame after doubling each of the 75 lines:
Image


Just as an experiment, I wrote a program that modifies the raw Bayer file and doubles each line there, then applied dcraw to the doubled raw file:
Image


Unfortunately that is even worse than the previous 640x150 frame with lines doubled after dcraw, presumably because duplicating single rows in the Bayer domain breaks the alternating two-row color filter pattern, so dcraw demosaics the duplicated rows with the wrong color phase.
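
For reference, such a Bayer-domain line doubler only needs a few lines of C. This is my own minimal sketch (not the program used above), assuming headerless raw frames and a known row size; 640 pixels of 10-bit packed Bayer would be 640*10/8 = 800 bytes per row:

Code: Select all

#include <stdio.h>
#include <stdlib.h>

/* Duplicate every row of a raw Bayer file. Note that duplicating
 * single rows breaks the alternating two-row Bayer pattern, which
 * is the likely reason the demosaiced result looks worse. */
int main(int argc, char *argv[])
{
    if (argc != 4) {
        fprintf(stderr, "usage: %s in.raw out.raw rowbytes\n", argv[0]);
        return 1;
    }
    FILE *in = fopen(argv[1], "rb");
    FILE *out = fopen(argv[2], "wb");
    size_t rowbytes = strtoul(argv[3], NULL, 0);
    unsigned char *row = malloc(rowbytes);
    if (!in || !out || !row) { perror("setup"); return 1; }

    while (fread(row, 1, rowbytes, in) == rowbytes) {
        fwrite(row, 1, rowbytes, out);        /* original row */
        fwrite(row, 1, rowbytes, out);        /* duplicated row */
    }
    free(row);
    fclose(in);
    fclose(out);
    return 0;
}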
bookmark list: https://stamm-wilbrandt.de/en/Raspberry_camera.html

https://github.com/Hermann-SW/fork-raspiraw      https://github.com/Hermann-SW/userland
https://github.com/Hermann-SW/wireless-control-Eachine-E52-drone      https://twitter.com/HermannSW

HermannSW
Posts: 829
Joined: Fri Jul 22, 2016 9:09 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Tue May 01, 2018 10:15 pm

New separate thread "v2 camera can capture 1000fps videos ..." on this:
viewtopic.php?f=43&t=212518
Image
bookmark list: https://stamm-wilbrandt.de/en/Raspberry_camera.html

https://github.com/Hermann-SW/fork-raspiraw      https://github.com/Hermann-SW/userland
https://github.com/Hermann-SW/wireless-control-Eachine-E52-drone      https://twitter.com/HermannSW

fooforever
Posts: 23
Joined: Wed Aug 15, 2012 11:21 pm

Re: Raw sensor access / CSI-2 receiver peripheral

Wed May 02, 2018 3:09 pm

Great work, everyone who's contributing to raspiraw in this thread! :)
I've got a program based on raspiraw that I'm having some trouble with. It basically gets raw sensor data from the camera and sends it over the network to a different PC. The problem I'm having is that I don't seem to be able to get streaming frames from the sensor; I might get 2 or 3, then they are repeated in order.
The program is a modified version of raspiraw (a slightly old version now), where I've modified the callback to put a frame into a queue when requested, so that the server program loop can receive it and send it out on the network. There must be something going wrong with how I'm handling the buffers, but I don't know MMAL well enough to track it down. Can anyone help?

Here's the callback with a few relevant globals:

Code: Select all

int running = 0;
bool send_frame = false;
MMAL_QUEUE_T *queue = NULL;
MMAL_PORT_T *returnPort = NULL;

static void callback(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer)
{
	if(running)
	{

		if(!(buffer->flags&MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO) && send_frame)
		{
			vcos_log_error("Buffer %p returned, filled %d, timestamp %llu, flags %04X", buffer, buffer->length, buffer->pts, buffer->flags);
			send_frame = false;
			returnPort = port;
			mmal_queue_put(queue, buffer);
		} else {
			buffer->length = 0;
			mmal_port_send_buffer(port, buffer);
		}
	}
	else
		mmal_buffer_header_release(buffer);
}
And here's the section of the server() loop that requests a frame and receives it from the queue:

Code: Select all

if(strncmp(buffer, &commands[0][0], 4) == 0) {
	//CAP1
	send_frame = true;
	workingBuffer = mmal_queue_wait(queue);
	bzero(message, 100);
	sprintf(&message[0], "%s%.8lu%s", headerchar, (long int) (sensor_mode->height * sensor_mode->width * (cfg.bit_depth / 8)) + sizeof(endchar), returnchar);
	printf("Message string: %s", message);
	write(cli_sockfd, message, strlen(message));
	write(cli_sockfd, workingBuffer->data + 74, (sensor_mode->height * sensor_mode->width * (cfg.bit_depth / 8)));
	workingBuffer->length = 0;
	mmal_port_send_buffer(returnPort, workingBuffer);
	write(cli_sockfd, endchar, sizeof(endchar));
}
The queue is created and streaming started in the main function:

Code: Select all

	queue = mmal_queue_create();
	start_camera_streaming(sensor, sensor_mode);
	server();
	stop_camera_streaming(sensor);
	mmal_queue_destroy(queue);
	printf("exit nice");
All the network code and the other functions of raspiraw work, so there must be a problem with how these sections deal with frames.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 5944
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Wed May 02, 2018 3:26 pm

fooforever wrote:
Wed May 02, 2018 3:09 pm
Great work everyone whos contributing to raspiraw in this thread! :)
I've got a program based off raspiraw that I'm having some trouble with, it basically is able to get raw sensor data from the camera and send it over the network to a different PC. The problem I'm having with it is that I don't seem to be able to get streaming frames from the sensor, I might get 2 or 3 then they are repeated in order.
<snip>
Can I suggest you start a new thread? This may get involved, and details will just get lost on this huge thread.

Nothing springs to mind on the problem itself, although I haven't picked through the code in detail.
Enabling logging might show something up. First step would be ARM-side MMAL with

Code: Select all

export VC_LOGLEVEL="mmal:trace"
Second step would be VideoCore side

Code: Select all

vcgencmd logging set_level=0xc0
<run app>
sudo vcdbg log msg
It's also worth enabling asserts on the VideoCore by adding "start_debug=1" to /boot/config.txt, and then running "sudo vcdbg log assert" after running the app.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

luiscgalo
Posts: 29
Joined: Fri Jun 22, 2018 9:09 am

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jun 22, 2018 9:30 am

Hi all,
This is my first post so please forgive me if I'm writing on the wrong thread.

I'm trying to build an application on the Raspberry Pi 3B+ to receive CSI-2 raw data coming from a TC358743 chip (B101 module).

Currently, I'm playing around with a simple C application that creates a "vc.ril.rawcam" MMAL component with a callback function to retrieve raw data from the CSI-2 bus.
This simple application is working successfully and I'm able to receive raw video data from the TC358743 chip.

My problem is related to interlaced video fields - in my case I'm trying to capture 1080i50 (25fps interlaced).
According to TC358743's functional specification, it is possible to set a different data type ID for the top and bottom fields of the video in order to distinguish them in the CSI-2 data.

The text below is the small fraction of the chip's functional specification that talks about this:
For interlace video, users should program their desired interlace stream DataID in
register field PacketID1.VPID0 and PacketID1.VPID1 for top and bottom field,
respectively.
VPID0 For Interlaced frame Top field
VPID1 For Interlaced frame Bottom field


Based on this information, I've set VPID1 to 0x34 and VPID0 to 0x35 (the default values on the Toshiba chip).
However, I don't know how I can read that value in my "vc.ril.rawcam" callback.
I've dumped all available info from the MMAL_PORT_T *port and MMAL_BUFFER_HEADER_T *buffer callback parameters without success (I'm not able to find anything containing 0x34 or 0x35).

Since there is not much information about MMAL on the Internet, I would appreciate your expertise in helping me understand how I can distinguish the top and bottom field data coming from the TC358743 chip.

Thanks in advance for your help.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 5944
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jun 22, 2018 12:27 pm

luiscgalo wrote:
Fri Jun 22, 2018 9:30 am
Hi all,
I'm trying to build an application on the Raspberry Pi 3B+ to receive CSI-2 raw data coming from a TC358743 chip (B101 module).
<snip>
I would appreciate your expertise in helping me understand how I can distinguish the top and bottom field data coming from the TC358743 chip.
Not easily, and it's not a use case that I have any real intention of solving. All my work on TC358743 is for progressive formats only.

The CSI-2 receiver peripheral has two receive paths, one for anything that matches the programmed data type field, and one for everything that doesn't. However that second path is really intended for the embedded metadata that many sensors kick out before the image data to provide their register set. Because of that, none of the software is set up to listen for the end of frame interrupt on that second pipe, and it just picks up the buffer at the end of the main image. Buffer swapping likewise is set up to swap both buffers as a pair.

rawcam will deliver both buffers to your app, with the MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO flag set on the metadata buffer. Whether the timing works out I have no idea.

There is one config parameter that will affect it which is embedded_data_lines in MMAL_PARAMETER_CAMERA_RX_CONFIG_T. You'll want to set that to the image height (or does the TC358743 send out half height for each field?) otherwise your image will be seriously truncated (the default is 1 line).
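
For illustration, a sketch of how those fields can be set on the rawcam output port, following the usual MMAL get/modify/set pattern (struct and parameter names from mmal_parameters_camera.h; error handling trimmed):

Code: Select all

#include "interface/mmal/mmal.h"

static MMAL_STATUS_T configure_rx(MMAL_PORT_T *output, uint32_t image_height)
{
   MMAL_PARAMETER_CAMERA_RX_CONFIG_T rx_cfg =
      {{MMAL_PARAMETER_CAMERA_RX_CONFIG, sizeof(rx_cfg)}};
   MMAL_STATUS_T status;

   status = mmal_port_parameter_get(output, &rx_cfg.hdr);
   if (status != MMAL_SUCCESS)
      return status;

   rx_cfg.image_id = 0x34;                    /* CSI-2 data type for the image path */
   rx_cfg.embedded_data_lines = image_height; /* lines expected on the second path */

   return mmal_port_parameter_set(output, &rx_cfg.hdr);
}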
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

luiscgalo
Posts: 29
Joined: Fri Jun 22, 2018 9:09 am

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jun 22, 2018 12:44 pm

Hi 6by9,
Thanks for your quick feedback.

Indeed, you are right.
During my tests, I've noticed that I'm currently receiving callbacks (video frames) with their flags set to 0x04, 0x84, 0x04, 0x84 ...
So, if I understood this correctly, it seems that one field is being sent as a simple "frame end" (0x04) and the second field as metadata - with the MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO flag set (0x84).

Maybe I can distinguish whether it is a top or bottom field by reading the MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO flag.
What do you think? :)
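
For reference, those flag values decode against the constants in MMAL's mmal_buffer.h as 0x04 = MMAL_BUFFER_HEADER_FLAG_FRAME_END and 0x84 = MMAL_BUFFER_HEADER_FLAG_FRAME_END | MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO (0x80), so a check could look like this sketch:

Code: Select all

#include "interface/mmal/mmal_buffer.h"

/* returns non-zero for the metadata/side-info buffer of each pair */
static int is_sideinfo_buffer(const MMAL_BUFFER_HEADER_T *buffer)
{
   return (buffer->flags & MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO) != 0;
}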
There is one config parameter that will affect it which is embedded_data_lines in MMAL_PARAMETER_CAMERA_RX_CONFIG_T. You'll want to set that to the image height (or does the TC358743 send out half height for each field?) otherwise your image will be seriously truncated (the default is 1 line).
Well, the buffer size received in each "rawcam" callback is set to 3133440.
According to my math (please correct me if I'm saying anything wrong), this gives a field resolution of 1920x544 pixels x 3 bytes (RGB888) = 3133440.
From this, I can assume that each callback comes with half the vertical resolution (1920x540) of the video feed (FullHD 1920x1080) plus 4 extra lines (likely 540 aligned up to a multiple of 16).

Regarding progressive vs interlaced video:
Yes, I agree with you and I also prefer progressive, but in this application I'm capturing a live video feed from a camcorder which only outputs 1080i50.

During the next days, in my free time, I'll run some more tests to see the influence of the embedded_data_lines parameter on this.

One last question (for now :)): in this situation, what is the purpose of the "image_id" (also set in MMAL_PARAMETER_CAMERA_RX_CONFIG_T)?
What is the influence of this parameter on the "rawcam" callbacks?

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 5944
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jun 22, 2018 1:36 pm

luiscgalo wrote:
Fri Jun 22, 2018 12:44 pm
Indeed, you are right.
During my tests, I've noticed that I'm currently receiving callbacks (video frames) with its flags being set to 0x04, 0x84, 0x04, 0x84 ...
So, if I understood this, it seems that one field is being sent as simple "end frame" (0x04) and the second field as metadata - with MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO flag set (0x84).

Maybe I can distinguish if it is a top or bottom field by reading the MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO flag.
What do you think? :)
If you've set image_id to 0x34, then buffers without MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO should be the top field, and those with MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO should be the bottom field.
Don't enable audio or InfoFrames over CSI2, as otherwise those will also get merged into the MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO buffer.
luiscgalo wrote:One last question (for now :)), in this situation, what is the purpose of the "image_id" (also set on MMAL_PARAMETER_CAMERA_RX_CONFIG_T)?
What is the influence of this parameter for the "rawcam" callbacks?
It's the CSI-2 data type value for the image buffers. Set it to 0x34 in your case to pick up the top field in the image buffers.

I take it you've found https://github.com/6by9/raspi_tc358743/ ... tc358743.c. That's the test app that got developed off this thread and viewtopic.php?f=38&t=120702 for driving the TC358743 (originally it was a branch on my userland repo). It's a reasonable starting point for getting the chip running, and shouldn't require too much tweaking to get interlaced working.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

luiscgalo
Posts: 29
Joined: Fri Jun 22, 2018 9:09 am

Re: Raw sensor access / CSI-2 receiver peripheral

Wed Jun 27, 2018 12:31 pm

Hi again
Thanks 6by9 for your tips.

During the last weekend I played around with your suggestions in order to be able to capture video frames from the TC358743 chip on the Raspberry Pi.
Regarding progressive video frames (tested with a 1080p25 source), everything is working fine.
My problem is adapting the source code to retrieve interlaced field frames from a 1080i50 source.

I still have some doubts regarding the operation of the TC358743 which you may be able to clarify:

1. Even if the video source is progressive, the "rawcam" callback is always triggered two times for each frame, with the flag parameter set to 0x04 and 0x84 (MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO) and with the buffer size set to the full video frame resolution (1920*1080*3 = 6220800).
Of course I understand that when the buffer flag is set to 0x04 it is an image buffer containing the RGB data.
What I do not understand is why the callback is also invoked (when capturing progressive video) with the flag set to 0x84 and the same buffer size.


2. In the previous post you wrote the following:
Don't enable audio or InfoFrames over CSI2 as otherwise those will also get merged into the MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO buffer.
That makes sense, since I want only video data being exchanged on the CSI-2 bus.
However, I don't know which registers I should set on the TC358743 to ensure that no audio or InfoFrames are sent via CSI-2.
Can you please point me to the registers I should change to control this?


3. This third question is quite related to my other doubts... It is regarding the lack of information about the TC358743 chip.
I found some references on the Internet to a Functional Specification document for this chip. That's how I found out that the VPID0 and VPID1 parameters are used to distinguish the interlaced video fields.
However, I don't have that document, and without it it is almost impossible to understand which I2C registers I should set to properly configure the TC358743.
It seems that the Auvidea store where I bought the B101 module is a hardware-only company and does not have/provide that document.
I really need some help understanding the functionality of the TC358743 chip in order to be able to use it in my application...
Do you have any technical information regarding the TC358743 that you may be able to share with me?

Thanks again for your valuable help.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 5944
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Wed Jun 27, 2018 12:54 pm

luiscgalo wrote:
Wed Jun 27, 2018 12:31 pm
<snip>
1. Even if the video source is progressive, the "rawcam" callback is always triggered two times for each frame, with the flag parameter set to 0x04 and 0x84 (MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO) and with the buffer size set to the full video frame resolution (1920*1080*3 = 6220800).
Of course I understand that when the buffer flag is set to 0x04 it is an image buffer containing the RGB data.
What I do not understand is why the callback is also invoked (when capturing progressive video) with the flag set to 0x84 and the same buffer size.
You really need to refer back to the CSI2 spec and understand a bit more as to how the lower level stream is packetised. Google for "MIPI_Alliance_Specification_for_Camera_Serial_Interface_2__CSI_2_.pdf", and then sections 9.8-9.10.
Start of frame is when we receive a "Frame Start Code" short packet.
You'll then get a "Line Start", followed by the line of data with appropriate data_type (image_id) value, and terminated with a "Line End". Repeat for each line of the frame.
Finally you'll get a "Frame End Code" short packet. The receiver doesn't keep a note of which data_type the data was received on, it just returns both buffers saying "I got a frame end, there may be data in here". rawcam returns the buffers to you at this point.
(Technically it does have write pointer registers, so you can tell how much data was written to each, but that then becomes fun if there is any interrupt latency).
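
For illustration, the CSI-2 data type codes relevant here (the short packet codes and the user-defined range are from the MIPI CSI-2 specification):

Code: Select all

enum csi2_data_type {
   CSI2_DT_FRAME_START = 0x00, /* Frame Start Code short packet */
   CSI2_DT_FRAME_END   = 0x01, /* Frame End Code short packet   */
   CSI2_DT_LINE_START  = 0x02, /* Line Start Code short packet  */
   CSI2_DT_LINE_END    = 0x03, /* Line End Code short packet    */
   CSI2_DT_RGB888      = 0x24, /* RGB888 long packet payload    */
   CSI2_DT_RAW10       = 0x2B  /* RAW10 long packet payload     */
   /* 0x30..0x37 are user-defined types; the TC358743's VPID
    * values 0x34/0x35 fall into this range. */
};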
luiscgalo wrote:2. On the previous post you've referred the following:
Don't enable audio or InfoFrames over CSI2 as otherwise those will also get merged into the MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO buffer.
That makes sense since I want only video data being exchanged on the CSI-2 bus.
However I don't know which register I should set on the TC358743 to ensure that no audio or info frames are being sent via CSI-2.
Can you please point me which I should change to control this?


3. This third question is quite related with my other doubts... It is regarding the lack of information about the TC358743 chip.
I found some references on the Internet referring a Functional Specification document for this chip. That's how I've found that VPID0 and VPID1 parameters are used to distinguish the interlaced video fields.
However I don't have that document and without it is almost impossible to understand which I2C registers I should set to properly configure the TC358743.
It seems that the Auvidea store where I bought the B101 module is a hardware-only company and does not have/provides me that document.
I really need some help to understand the functionalities of the TC358743 chip in order to be able to use it on my application...
Do you have any technical information regarding the TC358743 that you maybe able to share with me?
And there is the problem. Toshiba only releases the datasheet under NDA (Non-Disclosure Agreement), and that normally only happens if you promise them sales.
We have an NDA with them, partly for this chip (although we never sold a product using it), and partly for the one used in the 7" DSI display. I can't send that datasheet on though.

You can reference the Linux kernel driver for the chip, which does include a list of defines for the registers.
That driver does have some support for interlaced video (reads VI_STATUS1 and checks bit MASK_S_V_INTERLACE), but it doesn't seem to do that much with it.
For reference, that driver uses I2S for the audio, and reads InfoFrames via I2C, so the register setup doesn't put either down the CSI2 interface. I can't recall if the settings in raspi_tc358743 copied the kernel driver or the firmware driver (which does use audio over CSI2).
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

luiscgalo
Posts: 29
Joined: Fri Jun 22, 2018 9:09 am

Re: Raw sensor access / CSI-2 receiver peripheral

Thu Jun 28, 2018 9:37 am

Thanks again 6by9 for your patience and tips.

Ok, I've followed your suggestions and I'm finally able to receive valid raw RGB data from the TC358743 chip on the Raspberry Pi. :)
I've attached 3 items to this post so you can see my results:
- source-code.zip --> my test source code for setting up the TC358743 and receiving data from the CSI-2 bus, saving one raw RGB frame.
- app-console-output.png --> stdout output from my test app
- raw-image.jpg --> visualization of the raw RGB data generated by my test app on http://rawpixels.net/ (a very useful website for this!)
EDIT: Since forum is not accepting my attachments, I've created a wetransfer link containing my files: https://we.tl/HVjSRUu5GS

Please note that this application is just a prototype, for test purposes only...
So, it now seems that I'm able to capture the odd and even fields of interlaced video successfully (1920x540 each, of a FullHD video with 1920x1080 pixels).

But (there is always a "but" :? ), I'm not able to distinguish whether a received field is top or bottom...
Indeed, from my tests, valid image data is only received when the callback is invoked with buffer flags set to 0x04 (image buffer).
Additionally, the referred callback is invoked two times (buffer flags set to 0x04 and 0x84) at a frequency of 50Hz (20ms period).
This makes sense since we are receiving 25fps interlaced video, which means a field frequency of 50Hz.
I think the picture app-console-output.png explains better what I mean.
Basically, every 20ms, there are two "rawcam" callbacks, one with buffer flags set to 0x04 (this is the one with valid RGB data) and a second one set to 0x84 (I don't know the meaning of the data here...).

Summing up, I'm receiving valid RGB data, interlaced fields of 1920x540px, 50 times per second, but I don't know which one is the top or bottom field.
How can I distinguish the fields?

From my tests, MMAL is ignoring any value that I set on the "image_id" parameter.
I thought this parameter should be set to one of the VPID packet IDs to select which data is received in the "image buffers" versus with "MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO" set.
However, I see no difference when I change the "image_id" value...

I've also experimented with multiple values of the "embedded_data_lines" parameter, without success (I always see the same behavior).

This is my first contact with the TC358743 chip and the Raspberry Pi MMAL library, and I'm probably missing some obvious detail here...

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 5944
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Thu Jun 28, 2018 12:44 pm

For a raw image viewer I find it hard to beat Vooya. It's free for Linux, but x86 only :(

Using a Pi and forcing the output mode via tvservice -e "CEA 20 HDMI" or tvservice -e "CEA 21 HDMI" (1080i50 and 576i respectively) I've done a quick test using https://github.com/6by9/raspi_tc358743. If I change CSI_IMAGE_ID then I no longer get an image captured. I haven't checked that it's going into the MMAL_BUFFER_HEADER_FLAG_CODECSIDEINFO buffer, but I expect so.
I'm not going to pick through your code, but I've pushed a couple of quick changes to https://github.com/6by9/raspi_tc358743/tree/interlaced that mean the standard raspi_tc358743 tells you about interlaced input. Using that I get a flickery image displayed if I set CSI_IMAGE_ID to either 0x34 or 0x35, and nothing on any other value.

What I suspect is happening is that the TC358743 is sending
- Frame Start
- Top field with data type VPID0
- Frame End
- Frame Start
- Bottom field with data type VPID1
- Frame End
<repeat>
As stated, the receiver doesn't track which data types have been received, and the FS/FE packets don't identify that either, therefore you get a pair of buffers back.

In fact looking at the VPU logging I can confirm this. We log whenever we get a FS, FE, and every (height/4) lines.

Code: Select all

1048666.152: unicam_int_callback: Frame Start, time = 1048666115, frame_int_req=1, frame_int_ready=0
1048671.591: unicam_int_callback: bytes_written = 787424, lines_done = 136
1048671.600: unicam_int_callback: Line Interrupt 136
1048676.427: unicam_int_callback: bytes_written = 1570704, lines_done = 272
1048676.440: unicam_int_callback: Line Interrupt 272
1048681.261: unicam_int_callback: bytes_written = 2353792, lines_done = 408
1048681.274: unicam_int_callback: Line Interrupt 408
1048686.069: unicam_int_callback: bytes_written = 3110400, lines_done = 540
1048686.077: unicam_int_callback: Frame End
1048686.137: unicam_int_callback: bytes_written = 0, lines_done = 0
1048686.154: unicam_int_callback: Frame Start, time = 1048686120, frame_int_req=1, frame_int_ready=0
1048706.056: unicam_int_callback: bytes_written = 0, lines_done = 0
1048706.064: unicam_int_callback: Frame End
1048706.125: unicam_int_callback: bytes_written = 0, lines_done = 0
1048706.143: unicam_int_callback: Frame Start, time = 1048706107, frame_int_req=1, frame_int_ready=0
1048711.590: unicam_int_callback: bytes_written = 786912, lines_done = 136
1048711.600: unicam_int_callback: Line Interrupt 136
1048716.428: unicam_int_callback: bytes_written = 1570816, lines_done = 272
1048716.441: unicam_int_callback: Line Interrupt 272
1048721.260: unicam_int_callback: bytes_written = 2353296, lines_done = 408
1048721.269: unicam_int_callback: Line Interrupt 408
1048726.069: unicam_int_callback: bytes_written = 3110400, lines_done = 540
1048726.080: unicam_int_callback: Frame End
1048726.143: unicam_int_callback: bytes_written = 0, lines_done = 0
1048726.160: unicam_int_callback: Frame Start, time = 1048726123, frame_int_req=1, frame_int_ready=0
1048746.058: unicam_int_callback: bytes_written = 0, lines_done = 0
1048746.067: unicam_int_callback: Frame End
1048746.134: unicam_int_callback: bytes_written = 0, lines_done = 0
1048746.152: unicam_int_callback: Frame Start, time = 1048746117, frame_int_req=1, frame_int_ready=0
1048751.590: unicam_int_callback: bytes_written = 787104, lines_done = 136
1048751.599: unicam_int_callback: Line Interrupt 136
1048756.426: unicam_int_callback: bytes_written = 1570400, lines_done = 272
1048756.439: unicam_int_callback: Line Interrupt 272
1048761.260: unicam_int_callback: bytes_written = 2353536, lines_done = 408
1048761.278: unicam_int_callback: Line Interrupt 408
1048766.066: unicam_int_callback: bytes_written = 3110400, lines_done = 540
1048766.074: unicam_int_callback: Frame End
1048766.137: unicam_int_callback: bytes_written = 0, lines_done = 0
1048766.153: unicam_int_callback: Frame Start, time = 1048766120, frame_int_req=1, frame_int_ready=0
1048786.058: unicam_int_callback: bytes_written = 0, lines_done = 0
1048786.069: unicam_int_callback: Frame End
1048786.133: unicam_int_callback: bytes_written = 0, lines_done = 0
1048786.152: unicam_int_callback: Frame Start, time = 1048786115, frame_int_req=1, frame_int_ready=0
The time between Frame Start timestamps is 20ms, so it is at the field rate of 50Hz, and every other set shows 0 lines of data matching the data type having been received.

I may be able to add code that reports the buffer length as 0 if no data at all was received, but that is going to confuse any other component that gets the data (e.g. ISP or video_encode), so I'm not inclined to do so. It doesn't help with the metadata buffer anyway, as I can't rely on the write pointer, and there isn't a generic answer as to how much data to expect.
My suggestion would be to write a magic value (0xdeadbeef is popular) to the start of the buffers before you submit them to be filled. When the buffer is returned, check for that magic value. If it's not there then data was received and should be processed, otherwise ignore the buffer as empty.
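
A sketch of that suggestion (the MMAL calls are the real API; the helper names are mine):

Code: Select all

#include <stdint.h>
#include "interface/mmal/mmal.h"

#define EMPTY_MAGIC 0xdeadbeefu

/* mark a buffer before (re)submitting it to the rawcam output port */
static void submit_marked(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer)
{
   buffer->length = 0;
   ((uint32_t *)buffer->data)[0] = EMPTY_MAGIC;
   mmal_port_send_buffer(port, buffer);
}

/* in the callback: if the magic survived, nothing was written */
static int buffer_has_data(const MMAL_BUFFER_HEADER_T *buffer)
{
   return ((const uint32_t *)buffer->data)[0] != EMPTY_MAGIC;
}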

Having had a discussion with a colleague yesterday, we suspect that embedded_data_lines is actually redundant. The peripheral supports generating an interrupt at the end of the embedded data, and as this isn't conveyed in the CSI2 bitstream the peripheral has to be told how many lines to expect. The interrupt isn't used, and therefore it isn't going to do anything useful.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

Kozuch
Posts: 62
Joined: Sun Aug 10, 2014 10:35 am

Re: Raw sensor access / CSI-2 receiver peripheral

Thu Jun 28, 2018 2:45 pm

Is it possible to control line blanking on the sensor somehow? I mean adding/removing blanking interval lines. The feature would come in handy for frame synchronization, as it may allow bringing multiple cameras into sync.

luiscgalo
Posts: 29
Joined: Fri Jun 22, 2018 9:09 am

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Jul 09, 2018 9:14 am

I had very busy days at work and only had some time for testing this weekend.

Your suggestion of writing a pattern into the callback buffers was the key to solving the problem of identifying the packets containing image data.
I'm now able to capture raw BGR24 data from a 1080i50 source. :)
If someone has the same challenge of capturing interlaced video using the TC358743 on a Raspberry Pi, I can confirm that it is possible, with proper distinction between the top and bottom fields.
Thanks again 6by9 for your valuable help! ;)

Now that the issue of capturing raw video is solved, I have, again, some doubts related to MMAL and the lack of information about it...
It's probably easier if you take a look at the diagram attached to this post.
It represents my proposed flow of image data, from the TC358743 capture using rawcam to my final goal of recording the video using the H264 hardware encoder of the RPi.
issue-mmal.jpg (33.8 KiB) Viewed 1483 times
Currently, I'm able to perform the raw capture until getting a complete BGR24 interlaced frame (1920x1080px) by merging the top and bottom fields retrieved using the rawcam component.
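
(For illustration, that merge amounts to weaving the two 1920x540 fields line by line into the even/odd rows of the full frame. A minimal sketch, assuming tightly packed BGR24 rows; the 544-line buffer height mentioned earlier would need the extra padding rows skipped:)

Code: Select all

#include <stdint.h>
#include <string.h>

#define WIDTH   1920
#define FIELD_H 540
#define STRIDE  (WIDTH * 3)   /* BGR24: 3 bytes per pixel */

/* top field -> even output rows, bottom field -> odd output rows */
static void weave_fields(const uint8_t *top, const uint8_t *bottom,
                         uint8_t *frame)
{
   int y;
   for (y = 0; y < FIELD_H; y++) {
      memcpy(frame + (2 * y)     * STRIDE, top    + y * STRIDE, STRIDE);
      memcpy(frame + (2 * y + 1) * STRIDE, bottom + y * STRIDE, STRIDE);
   }
}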

My problem is related to the color conversion ("vc.ril.isp") and/or the usage of the deinterlacer component ("vc.ril.image_fx") - the boxes in orange on the diagram.

1. My first question is about the "vc.ril.image_fx" component and the type of encoding accepted on its input.
From the little information available on the Internet, I read that it accepts only "opaque" encoding (MMAL_ENCODING_OPAQUE) - but I could be wrong.
What exactly is the opaque encoding? (sorry if this is a very stupid question...)

2. Can I directly provide BGR24 data from the raw video frame to the "vc.ril.image_fx" component input?

3. If not, should I use the "vc.ril.isp" component as an intermediate converter from BGR24 to something accepted by "vc.ril.image_fx"?

In a trial-and-error attempt to convert from BGR24 to a different encoding such as I420, I've implemented simple code that instantiates the "vc.ril.isp" component to perform the conversion.
However, when I run that sample code, the output callback of the ISP is always invoked with the buffer length set to zero (no data written).
I've tried multiple times with different combinations of parameters, without a successful color conversion... :(
I'm probably missing a very simple/stupid thing here...
Since there is almost no documentation about how to use MMAL components, I don't know how to proceed...
Can your expertise point out again what I'm missing here?

Here is the extract of the source code where I'm trying to perform the conversion from BGR24 to I420

Code: Select all

// This is a raw buffer for storing the Full HD interlaced video frame
uint8_t* punRGBFrame = NULL;
uint32_t unRGBFrameSize = 0;

MMAL_COMPONENT_T *pISP = NULL;
MMAL_PORT_T *isp_input = NULL, *isp_output = NULL;
MMAL_POOL_T *pool_in = NULL;
MMAL_POOL_T *pool_out = NULL;

void ConvertFrame() {
	MMAL_BUFFER_HEADER_T *buffer;
	MMAL_STATUS_T status;

	// send data to the input port of ISP component
	buffer = mmal_queue_get(pool_in->queue);
	if (buffer != NULL) {
		mmal_buffer_header_mem_lock(buffer);
		buffer->length = unRGBFrameSize;
		buffer->flags = MMAL_BUFFER_HEADER_FLAG_FRAME;
		buffer->pts = buffer->dts = MMAL_TIME_UNKNOWN;
		memcpy(buffer->data, punRGBFrame, buffer->length);
		mmal_buffer_header_mem_unlock(buffer);

		printf("sending %i bytes\n", (int) buffer->length);
		status = mmal_port_send_buffer(isp_input, buffer);
		if (status != MMAL_SUCCESS) {
			printf("Error sending data to ISP input!\n");
		}
	} else {
		printf("ISP input buffer not available!\n");
	}
}

void input_port_cb(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer) {
	//printf("converter input buffer! SIze=%i\n", buffer->length);
	mmal_buffer_header_release(buffer);
}

void output_port_cb(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer) {
	printf("ISP output buffer! Size=%i\n", buffer->length);

	/*
     Output data from converter will be saved in this section....
	 mmal_buffer_header_mem_lock(buffer);
	 mmal_buffer_header_mem_unlock(buffer);
	 */

	mmal_buffer_header_release(buffer);

	// send next output buffer
	if (port->is_enabled) {
		MMAL_BUFFER_HEADER_T *NewBuffer = mmal_queue_get(pool_out->queue);
		if (NewBuffer) {
			mmal_port_send_buffer(port, NewBuffer);
		}
	}
}

void InitConverter() {
	int32_t unInWidth = 1920;
	int32_t unInHeight = 1080;
    MMAL_STATUS_T status;

	// allocate space for RGB24 buffer
	unRGBFrameSize = unInWidth * unInHeight * 3;
	punRGBFrame = (uint8_t*) malloc(unRGBFrameSize);

	status = mmal_component_create("vc.ril.isp", &pISP);
	if (status != MMAL_SUCCESS) {
		printf("Failed to create image converter!\n");
		return;
	}

	isp_input = pISP->input[0];
	isp_output = pISP->output[0];

	isp_input->format->encoding = MMAL_ENCODING_BGR24;
	isp_input->format->es->video.width = VCOS_ALIGN_UP(unInWidth, 32);
	isp_input->format->es->video.height = VCOS_ALIGN_UP(unInHeight, 16);
	isp_input->format->es->video.crop.x = 0;
	isp_input->format->es->video.crop.y = 0;
	isp_input->format->es->video.crop.width = unInWidth;
	isp_input->format->es->video.crop.height = unInHeight;
	//isp_input->format->es->video.par.num = 1;
	//isp_input->format->es->video.par.den = 1;
	//isp_input->format->es->video.frame_rate.num = 0;
	//isp_input->format->es->video.frame_rate.den = 1;
	//isp_input->format->type = MMAL_ES_TYPE_VIDEO;
	//isp_input->format->flags = MMAL_ES_FORMAT_FLAG_FRAMED;
	//isp_input->format->extradata_size = 0;

	status = mmal_port_format_commit(isp_input);
	if (status != MMAL_SUCCESS) {
		printf("Failed to commit converter input format!\n");
		return;
	}

	isp_input->buffer_size = isp_input->buffer_size_recommended;
	isp_input->buffer_num = isp_input->buffer_num_recommended;
	printf("ISP input buffer size %i bytes\n", isp_input->buffer_size);

	// enable DMA (zero copy)
	/*status = mmal_port_parameter_set_boolean(isp_input, MMAL_PARAMETER_ZERO_COPY, MMAL_TRUE);
	 if (status != MMAL_SUCCESS) {
	 printf("Failed to set zero copy on isp input!\n");
	 return;
	 }*/

	// create pool for input data
	pool_in = mmal_port_pool_create(isp_input, isp_input->buffer_num, isp_input->buffer_size);
	if (pool_in == NULL) {
		printf("Failed to create ISP input pool!\n");
	}

	// Setup ISP output (copy of input format, changing only the encoding)
	mmal_format_copy(isp_output->format, isp_input->format);
	isp_output->format->encoding = MMAL_ENCODING_I420;
	status = mmal_port_format_commit(isp_output);
	if (status != MMAL_SUCCESS) {
		printf("Failed to commit converter output format!\n");
		return;
	}

	isp_output->buffer_size = isp_output->buffer_size_min;
	isp_output->buffer_num = isp_output->buffer_num_min;
	printf("ISP output buffer size %i bytes\n", isp_output->buffer_size);

	/*status = mmal_port_parameter_set_boolean(isp_output, MMAL_PARAMETER_ZERO_COPY, MMAL_TRUE);
	 if (status != MMAL_SUCCESS) {
	 printf("Failed to set zero copy on isp output!\n");
	 return;
	 }*/

	// create pool for output data
	pool_out = mmal_port_pool_create(isp_output, isp_output->buffer_num, isp_output->buffer_size);
	if (pool_out == NULL) {
		printf("Failed to create ISP output pool!\n");
	}

	// Enable ports and ISP component
	status = mmal_port_enable(isp_input, input_port_cb);
	if (status != MMAL_SUCCESS) {
		printf("Error enabling converter input port!\n");
		return;
	}

	status = mmal_port_enable(isp_output, output_port_cb);
	if (status != MMAL_SUCCESS) {
		printf("Error enabling converter output port!\n");
		return;
	}

	status = mmal_component_enable(pISP);
	if (status != MMAL_SUCCESS) {
		printf("Error enabling converter!\n");
		return;
	}

	// send output buffers
	for (int i = 0; i < isp_output->buffer_num; i++) {
		MMAL_BUFFER_HEADER_T *buffer = mmal_queue_get(pool_out->queue);

		if (!buffer) {
			printf("Buffer is NULL!\n");
			exit(1);
		}
		status = mmal_port_send_buffer(isp_output, buffer);
		if (status != MMAL_SUCCESS) {
			printf("mmal_port_send_buffer failed on buffer %p, status %d\n", buffer, status);
			exit(1);
		}
	}

	printf("ISP converter init OK!\n");
}
NOTE: In your raspi_tc358743 application you are able to directly establish a connection between the rawcam output and the ISP input for performing encoding conversion. However, since I need an intermediate stage to collect the top/bottom fields and merge them into a Full HD frame, I think I cannot use the tunnel connection...

Thanks again for your valuable help.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 5944
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Jul 09, 2018 11:43 am

luiscgalo wrote:
Mon Jul 09, 2018 9:14 am
Your suggestion of setting a pattern on the callback buffers was the key to solve the problem of identifying packets containing image data.
I'm now able to capture raw RGB data from a 1080i50 source. :)
If someone has the same challenge of capturing interlaced video using the TC358743 on a Raspberry Pi, I can confirm that is possible with proper distinction between the top and bottom fields.
Well done.
luiscgalo wrote:My problem is related with color conversion ("vc.ril.isp") and/or usage of the deinterlacer component ("vc.ril.image_fx") - boxes in orange on the diagram.

1. My first question is related with "vc.ril.image_fx" component and the type of encoding accepted on the input.
From the few information available on the Internet I read that it accepts only "opaque" encoding (MMAL_ENCODING_OPAQUE) - but I could be wrong.
What is exactly the opaque encoding? (sorry if this is a very stupid question...)
It accepts a number of formats, including opaque.
Opaque is an internal format where the buffer only contains a reference to an existing GPU image object. That is not the case for you.
The raw pixel formats it accepts are I420, SAND (YUVUV128), I422, RGBA, and RGB565, however the deinterlace functions only work on I420 or SAND. IIRC SAND is the more efficient format for the deinterlacer to work with.
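
For illustration, selecting the deinterlacer on image_fx looks roughly like this sketch (struct and enum names are from the MMAL headers; which port accepts the parameter and the effect_parameter values are not verified here, so treat those as placeholders):

Code: Select all

#include "interface/mmal/mmal.h"

static MMAL_STATUS_T select_deinterlace(MMAL_COMPONENT_T *image_fx)
{
   MMAL_PARAMETER_IMAGEFX_PARAMETERS_T fx =
      {{MMAL_PARAMETER_IMAGE_EFFECT_PARAMETERS, sizeof(fx)}};

   fx.effect = MMAL_PARAM_IMAGEFX_DEINTERLACE_ADV; /* or _FAST / _DOUBLE */
   fx.num_effect_params = 0;                       /* defaults; placeholder */

   return mmal_port_parameter_set(image_fx->output[0], &fx.hdr);
}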
luiscgalo wrote:2. Can I directly provide BGR24 data from the raw video frame to the "vc.ril.image_fx" component input?
No.
luiscgalo wrote:3. if not, should I use the "vc.ril.isp" component as an intermediate converter from BGR24 to something accepted by the "vc.ril.image_fx"?
That's a curious one where I don't know how it works in practice. I420 and SAND are both YUV 4:2:0 formats, so the chroma is subsampled in both directions, so effectively it's subsampled across the interlacing. It works somehow, but I don't know the details (what does the decoder do when decoding an interlaced signal? Does one field have no chroma?)
luiscgalo wrote:In an trial and error attempt to convert from BGR24 to a different encoding such as I420 I've implemented a simple code that instantiates the "vc.ril.isp" component to perform the referred conversion.
However, when I run that sample code, the output callback of ISP is always invoked with buffer length set to zero (no data written).
I've tried multiple times with different combinations of parameters without a successful color conversion... :(
I'm probably missing a very simple/stupid thing here...
Since there is almost no documentation about how to use MMAL components, I don't know how to proceed...
Can your expertise point out again what I'm missing here?

Here is the extract of the source code where I'm trying to perform the conversion from BGR24 to I420

Code: Select all

// This is a raw buffer for storing the Full HD interlaced video frame
uint8_t* punRGBFrame = NULL;
uint32_t unRGBFrameSize = 0;

MMAL_COMPONENT_T *pISP = NULL;
MMAL_PORT_T *isp_input = NULL, *isp_output = NULL;
MMAL_POOL_T *pool_in = NULL;
MMAL_POOL_T *pool_out = NULL;

void ConvertFrame() {
	MMAL_BUFFER_HEADER_T *buffer;
	MMAL_STATUS_T status;

	// send data to the input port of ISP component
	buffer = mmal_queue_get(pool_in->queue);
	if (buffer != NULL) {
		mmal_buffer_header_mem_lock(buffer);
		buffer->length = unRGBFrameSize;
		buffer->flags = MMAL_BUFFER_HEADER_FLAG_FRAME;
		buffer->pts = buffer->dts = MMAL_TIME_UNKNOWN;
		memcpy(buffer->data, punRGBFrame, buffer->length);
		mmal_buffer_header_mem_unlock(buffer);

		printf("sending %i bytes\n", (int) buffer->length);
		status = mmal_port_send_buffer(isp_input, buffer);
		if (status != MMAL_SUCCESS) {
			printf("Error sending data to ISP input!\n");
		}
	} else {
		printf("ISP input buffer not available!\n");
	}
}

void input_port_cb(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer) {
	//printf("converter input buffer! SIze=%i\n", buffer->length);
	mmal_buffer_header_release(buffer);
}

void output_port_cb(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer) {
	printf("ISP output buffer! Size=%i\n", buffer->length);

	/*
     Output data from converter will be saved in this section....
	 mmal_buffer_header_mem_lock(buffer);
	 mmal_buffer_header_mem_unlock(buffer);
	 */

	mmal_buffer_header_release(buffer);

	// send next output buffer
	if (port->is_enabled) {
		MMAL_BUFFER_HEADER_T *NewBuffer = mmal_queue_get(pool_out->queue);
		if (NewBuffer) {
			mmal_port_send_buffer(port, NewBuffer);
		}
	}
}

void InitConverter() {
	int32_t unInWidth = 1920;
	int32_t unInHeight = 1080;
    MMAL_STATUS_T status;

	// allocate space for RGB24 buffer
	unRGBFrameSize = unInWidth * unInHeight * 3;
	punRGBFrame = (uint8_t*) malloc(unRGBFrameSize);

	status = mmal_component_create("vc.ril.isp", &pISP);
	if (status != MMAL_SUCCESS) {
		printf("Failed to create image converter!\n");
		return;
	}

	isp_input = pISP->input[0];
	isp_output = pISP->output[0];

	isp_input->format->encoding = MMAL_ENCODING_BGR24;
	isp_input->format->es->video.width = VCOS_ALIGN_UP(unInWidth, 32);
	isp_input->format->es->video.height = VCOS_ALIGN_UP(unInHeight, 16);
	isp_input->format->es->video.crop.x = 0;
	isp_input->format->es->video.crop.y = 0;
	isp_input->format->es->video.crop.width = unInWidth;
	isp_input->format->es->video.crop.height = unInHeight;
	//isp_input->format->es->video.par.num = 1;
	//isp_input->format->es->video.par.den = 1;
	//isp_input->format->es->video.frame_rate.num = 0;
	//isp_input->format->es->video.frame_rate.den = 1;
	//isp_input->format->type = MMAL_ES_TYPE_VIDEO;
	//isp_input->format->flags = MMAL_ES_FORMAT_FLAG_FRAMED;
	//isp_input->format->extradata_size = 0;

	status = mmal_port_format_commit(isp_input);
	if (status != MMAL_SUCCESS) {
		printf("Failed to commit converter input format!\n");
		return;
	}

	isp_input->buffer_size = isp_input->buffer_size_recommended;
	isp_input->buffer_num = isp_input->buffer_num_recommended;
	printf("ISP input buffer size %i bytes\n", isp_input->buffer_size);

	// enable DMA (zero copy)
	/*status = mmal_port_parameter_set_boolean(isp_input, MMAL_PARAMETER_ZERO_COPY, MMAL_TRUE);
	 if (status != MMAL_SUCCESS) {
	 printf("Failed to set zero copy on isp input!\n");
	 return;
	 }*/

	// create pool for input data
	pool_in = mmal_port_pool_create(isp_input, isp_input->buffer_num, isp_input->buffer_size);
	if (pool_in == NULL) {
		printf("Failed to create ISP input pool!\n");
	}

	// Setup ISP output (copy of input format, changing only the encoding)
	mmal_format_copy(isp_output->format, isp_input->format);
	isp_output->format->encoding = MMAL_ENCODING_I420;
	status = mmal_port_format_commit(isp_output);
	if (status != MMAL_SUCCESS) {
		printf("Failed to commit converter output format!\n");
		return;
	}

	isp_output->buffer_size = isp_output->buffer_size_min;
	isp_output->buffer_num = isp_output->buffer_num_min;
	printf("ISP output buffer size %i bytes\n", isp_output->buffer_size);

	/*status = mmal_port_parameter_set_boolean(isp_output, MMAL_PARAMETER_ZERO_COPY, MMAL_TRUE);
	 if (status != MMAL_SUCCESS) {
	 printf("Failed to set zero copy on isp output!\n");
	 return;
	 }*/

	// create pool for output data
	pool_out = mmal_port_pool_create(isp_output, isp_output->buffer_num, isp_output->buffer_size);
	if (pool_out == NULL) {
		printf("Failed to create ISP output pool!\n");
	}

	// Enable ports and ISP component
	status = mmal_port_enable(isp_input, input_port_cb);
	if (status != MMAL_SUCCESS) {
		printf("Error enabling converter input port!\n");
		return;
	}

	status = mmal_port_enable(isp_output, output_port_cb);
	if (status != MMAL_SUCCESS) {
		printf("Error enabling converter output port!\n");
		return;
	}

	status = mmal_component_enable(pISP);
	if (status != MMAL_SUCCESS) {
		printf("Error enabling converter!\n");
		return;
	}

	// send output buffers
	for (int i = 0; i < isp_output->buffer_num; i++) {
		MMAL_BUFFER_HEADER_T *buffer = mmal_queue_get(pool_out->queue);

		if (!buffer) {
			printf("Buffer is NULL!\n");
			exit(1);
		}
		status = mmal_port_send_buffer(isp_output, buffer);
		if (status != MMAL_SUCCESS) {
			printf("mmal_port_send_buffer failed on buffer %p, status %d\n", buffer, status);
			exit(1);
		}
	}

	printf("ISP converter init OK!\n");
}
I'll have to try your code and see what happens. Nothing looks wrong. Are there any flags set on the buffers? Are you setting the length correctly on the input buffer?
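To be clear about what I mean by "setting the length", here is a quick hand-written sketch (reusing the names from your code above, so treat it as illustrative rather than tested):

Code: Select all

	mmal_buffer_header_mem_lock(buffer);
	memcpy(buffer->data, punRGBFrame, unRGBFrameSize);
	/* length must describe the payload actually placed in the buffer */
	buffer->length = unRGBFrameSize;
	mmal_buffer_header_mem_unlock(buffer);
	mmal_port_send_buffer(isp_input, buffer);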
luiscgalo wrote:NOTE: In your raspi_tc358743 application you are able to directly establish a connection between the RawCam output and ISP input for performing encoding conversion. However, since I need an intermediate stage to collect the top/bottom fields and merge them into a Full HD frame, I think I cannot use the tunnel connection...
You certainly can't use a mmal_connection with the MMAL_CONNECTION_FLAG_TUNNELLING flag set, as that keeps everything on the GPU.
Actually, as you need to combine two buffers into one, there is no way of doing it with a connection at all.

I'm not sure that it helps in your case, but there are a set of patches about to be merged that should increase the number of formats that video_encode can accept, as they integrate the ISP into the front end. I'd got 1080P50 UYVY with (I believe) no frame drops out of that, and I think with some headroom in hand (I'm hoping 1080P60 may be possible). That was with some overclocking though.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

luiscgalo
Posts: 29
Joined: Fri Jun 22, 2018 9:09 am

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Jul 09, 2018 12:16 pm

Thanks for your quick feedback and useful clarifications.
6by9 wrote: It accepts a number of formats, including opaque.
Opaque is an internal format where the buffer only contains a reference to an existing GPU image object. That is not the case for you.
The raw pixel formats it accepts are I420, SAND (YUVUV128), I422, RGBA, and RGB565; however, the deinterlace functions only work on I420 or SAND. IIRC SAND is the more efficient format for the deinterlacer to work with.
Ok, now I understand what an "opaque" buffer is. ;)

So it seems that the idea of putting "vc.ril.isp" between the BGR24 interlaced frame and the "vc.ril.image_fx" deinterlacer makes sense. :)
It will be a scenario similar to the diagram in my previous post, where the ISP converts from BGR24 to I420 or SAND to feed the deinterlacer component.
6by9 wrote: That's a curious one that I don't know how it works in practice. I420 and SAND are both YUV 4:2:0 formats, so the chroma is subsampled in both directions, so effectively it's subsampled across the interlacing. It works somehow, but don't know the details (What does the decoder do when decoding an interlaced signal? Does one frame have no chroma?)
Well, in my implementation, the top/bottom fields coming from the TC358743 are basically BGR24 images with 1920x540 pixels (half vertical resolution).
When a new pair of top and bottom fields is received, I merge the field data (odd and even lines) into a single BGR24 buffer with 1920x1080 pixels. That's the buffer I'm trying to convert using the ISP component.
Regarding the conversion itself from BGR/RGB24 to 4:2:0 on the RPi, I'm like you... I don't know the exact details about it :)
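A stripped-down sketch of the merge step I described above (the names are illustrative, not the exact code of my application):

Code: Select all

#include <stdint.h>
#include <string.h>

#define FRAME_W 1920
#define FIELD_H 540
#define STRIDE  (FRAME_W * 3)	/* bytes per BGR24 row */

/* weave 1920x540 top and bottom fields into one line-interleaved frame */
static void merge_fields(const uint8_t *top, const uint8_t *bottom, uint8_t *frame)
{
	int y;
	for (y = 0; y < FIELD_H; y++) {
		memcpy(frame + (2 * y) * STRIDE, top + y * STRIDE, STRIDE);
		memcpy(frame + (2 * y + 1) * STRIDE, bottom + y * STRIDE, STRIDE);
	}
}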
6by9 wrote: I'll have to try your code and see what happens. Nothing looks wrong. Are there any flags set on the buffers? Are you setting the length correctly on the input buffer?
I really need to understand why my small ISP example is not working properly.
Like I said in the previous post, the output callback is invoked when I call my "ConvertFrame()" function, but the buffer length is always set to zero...

Basically, in this example, calling the "InitConverter()" function allocates a buffer of 1920*1080*3 = 6220800 bytes.
That's the buffer where I store the BGR24 Full HD interlaced image built from the merged bottom/top fields (without any extra data - just 1920*1080 pixels, 3 bytes each).
Then, within the same function, I set up an instance of the "vc.ril.isp" component to convert from BGR24 to I420.
After this, you only need to call the "ConvertFrame()" function and you will see that the "output_port_cb" callback is triggered - but with the buffer length set to zero...
Theoretically this small demo code should work, but...

Tonight I will run some more tests with the "vc.ril.isp" component to check whether I'm able to convert Full HD image data from BGR24 to I420.
6by9 wrote: I'm not sure that it helps in your case, but there are a set of patches about to be merged that should increase the number of formats that video_encode can accept, as they integrate the ISP into the front end. I'd got 1080P50 UYVY with (I believe) no frame drops out of that, and I think with some headroom in hand (I'm hoping 1080P60 may be possible). That was with some overclocking though.
Thanks for the info! ;)
Yes, I think that both the ISP and the H264 encoder blocks are able to work at 1080p50.
If needed, I can overclock the GPU a little to achieve the target frame rate.

6by9
Raspberry Pi Engineer & Forum Moderator
Raspberry Pi Engineer & Forum Moderator
Posts: 5944
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Jul 09, 2018 3:24 pm

luiscgalo wrote:
Mon Jul 09, 2018 12:16 pm
I really need to understand why my small ISP example is not working properly.
Like I said in the previous post, the output callback is invoked when I call my "ConvertFrame()" function, but the buffer length is always set to zero...

Basically, in this example, calling the "InitConverter()" function allocates a buffer of 1920*1080*3 = 6220800 bytes.
Ah, that's probably the issue. width and height need to be multiples of 32 and 16 respectively; crop.width and crop.height denote the active pixels in the frame. (There are a couple of components that don't mandate these criteria, but this is the general rule.)
1080 is not a multiple of 16, so you need to allocate buffers of 1920*1088*3 = 6266880 bytes.

I'm surprised it allowed the mmal_port_format_commit if you hadn't met that condition.
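For illustration, the padded allocation size can be computed like this (ALIGN_UP here simply mirrors what VCOS_ALIGN_UP does):

Code: Select all

#include <stdio.h>

#define ALIGN_UP(x, a) (((x) + (a) - 1) & ~((a) - 1))

int main(void)
{
	unsigned int width = 1920, height = 1080;
	unsigned int stride = ALIGN_UP(width, 32) * 3;	/* bytes per padded BGR24 row */
	unsigned int rows = ALIGN_UP(height, 16);	/* rounds 1080 up to 1088 */
	printf("buffer size = %u bytes\n", stride * rows);	/* 6266880 */
	return 0;
}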
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

luiscgalo
Posts: 29
Joined: Fri Jun 22, 2018 9:09 am

Re: Raw sensor access / CSI-2 receiver peripheral

Wed Jul 11, 2018 9:26 am

6by9, your suspicion was right.
My problem was setting a buffer length which did not correspond to the expected data size (1920*1080*3 instead of the expected 1920*1088*3).

Right now almost all components of my "data flow diagram" are already working (the blocks in green on the diagram)! :)
"Rawcam" is capturing 1080i50 interlaced fields from the TC358743 and I process them to produce Full HD interlaced video at 25fps.
The Full HD frame is then transferred to the "ISP" block, which converts from BGR24 to I420.
After this step, the "ISP" has a tunnel connection to the "image_fx" block for deinterlacing the video (25i to 50p).
The chain of processing blocks is working smoothly without losing any frames.
Image: data-flow.jpg (data flow diagram of the processing chain)
The last part will be integrating the H264 encoder to be able to record the 1080p50 video.

During my tests I had some doubts about the deinterlacer within the "image_fx" block:
1. What's the difference between the three available deinterlacing methods for MMAL_PARAMETER_IMAGEFX_PARAMETERS_T?
MMAL_PARAM_IMAGEFX_DEINTERLACE_DOUBLE --> I'm using this since it doubles the frame rate, 25i to 50p, which is my goal
MMAL_PARAM_IMAGEFX_DEINTERLACE_FAST --> if I set this I get an output framerate of 25fps (is it just standard deinterlacing without doubling the input framerate?)
MMAL_PARAM_IMAGEFX_DEINTERLACE_ADV --> in my application, if I set this method, the "image_fx" block does not work (it stays frozen without processing any data)

2. Since the setting referred to above is the only parameter required to enable the deinterlacer, how does "image_fx" know details such as whether the input frame is interlaced top field first or bottom field first?

3. Are there any other settings for the "image_fx" block regarding the deinterlacing functionality? There is almost no information available on the Internet about this... :(

Once again, thanks for your support and patience, 6by9.
Without your expertise, the development of my prototype would be much more difficult... ;)

6by9
Raspberry Pi Engineer & Forum Moderator
Raspberry Pi Engineer & Forum Moderator
Posts: 5944
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Wed Jul 11, 2018 9:57 am

luiscgalo wrote:
Wed Jul 11, 2018 9:26 am
6by9, your suspicion was right.
My problem was setting a buffer length which did not correspond to the expected data size (1920*1080*3 instead of the expected 1920*1088*3).
Great. I like simple ones :-)
I guess it would make more sense to check that alloc_size is correct, with length only needing to cover the actual data. Yet another thing to tinker with...
luiscgalo wrote:The last part will be integrating the H264 encoder to be able to record the 1080p50 video.

During my tests I had some doubts about the deinterlacer within the "image_fx" block:
1. What's the difference between the three available deinterlacing methods for MMAL_PARAMETER_IMAGEFX_PARAMETERS_T?
MMAL_PARAM_IMAGEFX_DEINTERLACE_DOUBLE --> I'm using this since it doubles the frame rate, 25i to 50p, which is my goal
MMAL_PARAM_IMAGEFX_DEINTERLACE_FAST --> if I set this I get an output framerate of 25fps (is it just standard deinterlacing without doubling the input framerate?)
MMAL_PARAM_IMAGEFX_DEINTERLACE_ADV --> in my application, if I set this method, the "image_fx" block does not work (it stays frozen without processing any data)
Double is line doubling. It takes a single frame and doubles the odd lines into the evens or vice versa, depending on whether the frame is signalled as top or bottom field.
Fast is a more complex algorithm, but I can't find any real description of exactly what it's doing.
Advanced uses the QPUs from the 3D block to do some of the processing. This will fail if you have the vc4-kms-v3d OpenGL driver loaded, as the QPUs are then controlled by Linux rather than the VPU.
luiscgalo wrote: 2. Since the setting referred to above is the only parameter required to enable the deinterlacer, how does "image_fx" know whether the input frame is interlaced top field first or bottom field first?
Buffer flags. https://github.com/raspberrypi/userland ... fer.h#L148
MMAL_BUFFER_HEADER_VIDEO_FLAG_INTERLACED must be set or image_fx will assume the buffer is progressive.
MMAL_BUFFER_HEADER_VIDEO_FLAG_TOP_FIELD_FIRST tells it which field has been sent/received first. If MMAL_BUFFER_HEADER_VIDEO_FLAG_INTERLACED is set but MMAL_BUFFER_HEADER_VIDEO_FLAG_TOP_FIELD_FIRST isn't, then it's bottom field first.
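In code terms, something like this on each frame you submit (a sketch; 'buffer' is assumed to come from your input pool and 'port' to be the image_fx input port):

Code: Select all

	/* mark the merged frame as interlaced, top field first */
	buffer->flags |= MMAL_BUFFER_HEADER_VIDEO_FLAG_INTERLACED
	              |  MMAL_BUFFER_HEADER_VIDEO_FLAG_TOP_FIELD_FIRST;
	mmal_port_send_buffer(port, buffer);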

The flags should technically be in buffer->type->video.flags, but buffer->flags is the way it has always been used. I've only patched it in the last couple of weeks to look in buffer->type->video.flags as someone else complained that it contradicted the docs. It's in the rpi-update firmware from 7th July 2018.
You'll probably want that firmware anyway as it should have a significant improvement in video_encode performance when not using opaque sources.
luiscgalo wrote: 3. Are there any other settings for the "image_fx" block regarding the deinterlacing functionality? There is almost no information available on the Internet about this... :(
The main people using this are the Kodi developers. dom / popcornmix is also a Pi employee and is the one typically tweaking this. Either here or the Kodi forums are the places that any details are likely to be "documented".

The only parameters that the algorithms can take are the following.
All algorithms support a first int, which is an override for the buffer flags:
0 - The data is not interlaced, it is progressive scan
1 - The data is interlaced, fields sent separately in temporal order, with upper field first
2 - The data is interlaced, fields sent separately in temporal order, with lower field first
3 - The data is interlaced, two fields sent together line interleaved, with the upper field temporally earlier
4 - The data is interlaced, two fields sent together line interleaved, with the lower field temporally earlier
5 - The stream may contain a mixture of progressive and interlaced frames (all bets are off).

Fast and advanced support extra params. The second int overrides the default frame interval. The third int halves the frame rate if you're hitting processing limits.
Advanced adds a fourth int called use_qpus. As I read the code, if this isn't set, or if the resolution exceeds SD (720x576), then it drops back to fast anyway. The logic isn't that clear.
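As a hedged sketch of how those parameters could be set (assuming 'imagefx' is your created vc.ril.image_fx component, and using 3 = interleaved fields with the upper field temporally earlier, per the list above):

Code: Select all

	MMAL_PARAMETER_IMAGEFX_PARAMETERS_T cfg = {
		{ MMAL_PARAMETER_IMAGE_EFFECT_PARAMETERS, sizeof(cfg) },
		MMAL_PARAM_IMAGEFX_DEINTERLACE_DOUBLE,
		4,		/* number of effect parameters supplied */
		{ 3, 0, 0, 0 }	/* [0] interlace override, [1] frame interval,
				   [2] half framerate, [3] use_qpus */
	};
	MMAL_STATUS_T status = mmal_port_parameter_set(imagefx->output[0], &cfg.hdr);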
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

luiscgalo
Posts: 29
Joined: Fri Jun 22, 2018 9:09 am

Re: Raw sensor access / CSI-2 receiver peripheral

Thu Jul 12, 2018 9:13 am

Thanks for the clarifications about the deinterlacing mechanism using the "image_fx" component.
Finally I was able to understand the meaning of the values for effect_parameter[0...3] of MMAL_PARAMETER_IMAGEFX_PARAMETERS_T :)

Yesterday I modified my source code, and now the processing chain "rawcam" -> "top/bottom fields merge" -> "ISP" -> "image_fx" is working perfectly, producing 1080p50 video. Exactly what I was looking for! ;)

I've also updated the Raspberry Pi firmware to the latest version (rpi-update command), following your recommendation.

Now, my final challenge is integrating the last processing block in the chain, the H264 encoder.
Yesterday, after fine-tuning the deinterlacer based on your tips, I tried a simple tunnel connection between the "image_fx" output and the H264 encoder input, saving the encoded data to a file (with a very simple configuration of the H264 encoder, following some examples from the Internet).
The result is quite interesting:
1. The quality of the encoded frames is very good! It is worth mentioning that the deinterlacing mechanism also does a very good job.

2. It is interesting that I'm producing video at 50fps, but when played back in VLC it is presented at 25fps, which corresponds to half of the target speed (it looks like slow-motion video :D ). Is this issue related to some flag of the H264 encoder on the RPi, or to the VLC player itself?

3. Finally, my major concern... By placing the H264 encoder in the data processing chain I have some frames being dropped (when I try to get an available input buffer to send a new BGR24 image to the ISP block, none is available). This happens at a variable rate, but on average about 1 frame is lost in every 8..10 encoded frames.
From my initial tests, I thought that the H264 encoder would be capable of encoding 1920x1080p50 (the input data is in I420 format).
Is there any special requirement to enable 1080p50 performance on the H264 encoder?
I've tried to overclock the GPU a little, to 400MHz, with similar results...

Note that this problem of dropping frames only occurs if I add the H264 video encode block to the processing chain...
Probably this performance issue is related to some configuration of the H264 encoder. What do you think?

I'm almost there, but there are always new challenges when adding new MMAL blocks.
That's the fun part of playing around with this kind of thing :)

6by9
Raspberry Pi Engineer & Forum Moderator
Raspberry Pi Engineer & Forum Moderator
Posts: 5944
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Thu Jul 12, 2018 11:17 am

luiscgalo wrote:
Thu Jul 12, 2018 9:13 am
Thanks for the clarifications about the deinterlacing mechanism using the "image_fx" component.
Finally I was able to understand the meaning of the values for effect_parameter[0...3] of MMAL_PARAMETER_IMAGEFX_PARAMETERS_T :)

Yesterday I modified my source code, and now the processing chain "rawcam" -> "top/bottom fields merge" -> "ISP" -> "image_fx" is working perfectly, producing 1080p50 video. Exactly what I was looking for! ;)

I've also updated the Raspberry Pi firmware to the latest version (rpi-update command), following your recommendation.

Now, my final challenge is integrating the last processing block in the chain, the H264 encoder.
Yesterday, after fine-tuning the deinterlacer based on your tips, I tried a simple tunnel connection between the "image_fx" output and the H264 encoder input, saving the encoded data to a file (with a very simple configuration of the H264 encoder, following some examples from the Internet).
The result is quite interesting:
1. The quality of the encoded frames is very good! It is worth mentioning that the deinterlacing mechanism also does a very good job.

2. It is interesting that I'm producing video at 50fps, but when played back in VLC it is presented at 25fps, which corresponds to half of the target speed (it looks like slow-motion video :D ). Is this issue related to some flag of the H264 encoder on the RPi, or to the VLC player itself?
You're encoding to a raw H264 elementary stream. Elementary streams have no frame rate or timestamp information in them; that is effectively metadata that gets stored alongside the frame in a container. Ideally you'd integrate libavcodec or similar into your app - it's something that I've had on my jobs list for raspivid for a while but not done.
Raspivid does support saving the timestamps to a text file, and mkvmerge can then merge that with the ES to create an MKV with correct timestamps. That'll be the quickest path for prototyping.
I always have to refer to the commit text for the syntax:

Code: Select all

commit ab71f205313cdecb10792c5d8ecd6122943a148d
Author: Ethanol100 <[email protected]>
Date:   Wed Feb 4 22:12:49 2015 +0100

    Raspivid: Add -pts to save timecodes to file
    
    This can be used with mkvmerge:
    
        raspivid -pts timecodes.txt -o video.h264
        mkvmerge -o video.mkv --timecodes 0:timecodes.txt video.h264
    
    Original commit by ethanol100. Minor mods by 6by9.
luiscgalo wrote: 3. Finally, my major concern... By placing the H264 encoder in the data processing chain I have some frames being dropped (when I try to get an available input buffer to send a new BGR24 image to the ISP block, none is available). This happens at a variable rate, but on average about 1 frame is lost in every 8..10 encoded frames.
From my initial tests, I thought that the H264 encoder would be capable of encoding 1920x1080p50 (the input data is in I420 format).
Is there any special requirement to enable 1080p50 performance on the H264 encoder?
I've tried to overclock the GPU a little, to 400MHz, with similar results...

Note that this problem of dropping frames only occurs if I add the H264 video encode block to the processing chain...
Probably this performance issue is related to some configuration of the H264 encoder. What do you think?
The codec block is only specified for 1080P30, so anything above that will require overclocking. Turn it up as far as you can without hitting stability issues.
I haven't got my overclocked Pi to hand to give the full settings I'm using, but you need to overclock the SDRAM as well as the VideoCore clocks.

You are also doubling the usage of the ISP, with the RGB->I420 conversion and then again as part of video encode. That should be good for about 180MPix/s, but 1920*1080*50 * 2 conversions end up at about 207MPix/s.
I know you have your pipeline working as is, but it might be worthwhile trying UYVY from the TC358743 instead of RGB; the same merge code should work fine, and the ISP will take it happily. It reduces the memory bandwidth fairly significantly (16bpp instead of 24bpp), so it may give you that little bit more headroom.
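The arithmetic, in the bc style used earlier in this thread:

Code: Select all

$ echo "1920*1080*50*2/10^6" | bc -ql
207.36000000000000000000
$ 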
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

luiscgalo
Posts: 29
Joined: Fri Jun 22, 2018 9:09 am

Re: Raw sensor access / CSI-2 receiver peripheral

Sat Jul 14, 2018 11:25 pm

Today I've been performing some tests with some overclocking of the GPU and SDRAM, and I'm able to encode 1080p50 without losing frames. :)
6by9 wrote: You're encoding to a raw H264 elementary stream.
Of course, you are right! There is no framerate information in the H264 file. Sorry, a basic mistake on my side...

However, I've noticed that the movements in the recorded video are not smooth.
In the video, the motion sometimes goes back and forth...

After noticing this, I made some basic tests to try to diagnose the root cause of this issue.
At some point, I set the "half framerate" setting of the image_fx component and the problem of back-and-forth motion was gone!
Of course, with this change the final format is 1080p25, not 1080p50.

Summing up, it seems that the bottleneck of my processing chain is the deinterlacer (image_fx component).
Do you have any tip that could help me improve the performance of image_fx?

By the way, I've tried two of your suggestions from previous posts:

1. Using SAND encoding on image_fx instead of I420 (you said it is more efficient for the deinterlacer): this was unsuccessful because, during my tests, the ISP block was not able to convert from BGR24 to SAND format (it freezes the CPU/GPU and the kernel panic LED blink is triggered).

2. Trying to capture raw video in I422 instead of BGR24: it seems a very nice idea and I tried a lot today, without success... From what I read in the TC358743 specs, there are two possible formats for sending I422 video via the CSI port, 12 or 8 bits.

From my tests, the 8-bit format is the easiest one to work with, but there is a "small" (big! :P) detail - I lose the capability of distinguishing the bottom/top fields of the interlaced video.

The 12-bit format is quite strange and unclear to me...
I can detect the bottom/top fields, but I don't know how to receive and process the raw data.
I've tried to receive raw data on the rawcam component with MMAL_ENCODING_BAYER_SBGGR12P encoding (instead of BGR24), saving one field to a file.
However, I was not able to read/decode any clear image from the received data.
Even if it is possible to receive I422 12-bit from the TC358743, which format should be set as input to the ISP block?

Thanks again for your help, 6by9 ;)
