6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 5372
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Sun Jul 15, 2018 8:29 am

luiscgalo wrote:
Sat Jul 14, 2018 11:25 pm
However, I've noticed that the movements on the recorded video are not smooth.
On the video, the movements sometimes go back and forward...

After noticing this, I've made some basic tests to try to diagnose the root cause of this issue.
At some point, I've set the "half framerate" setting of the image_fx component and the problem of back and forward movements was gone!
Of course, with this change the final format will be 1080p25, not 1080p50.
Wrong field order?
luiscgalo wrote:To summarise, it seems that the bottleneck of my processing toolchain is the deinterlacer (image_fx component).
Do you have any tip that could help me improve the performance of the image_fx?

By the way, I've tried two of your suggestions on previous posts:

1. Using SAND encoding on image_fx instead of I420 (you've said that it is more efficient for the deinterlacer): it was unsuccessful because, during my tests, the ISP block was not able to convert from BGR24 to SAND format (it freezes the CPU/GPU and the kernel panic LED blink is triggered)
That sounds a touch excessive! I should have an easy setup to test this, so will do so on Monday.
luiscgalo wrote:2. Trying to capture raw video in I422 instead of BGR24: It seems a very nice idea and I've tried a lot today, without success... From what I read on TC358743 specs, there are two possible formats for sending I422 video, 12 or 8 bits via CSI port.

From my tests, the 8-bit format is the easiest one to work with, but there is a "small" (big! :P) detail - I lose the capability of distinguishing the bottom/top fields of the interlaced video.
Ah, just noticed in the datasheet
Limitation: YCbCr422 16bpp interlaced output video can Not be supported.
That would do it. You'd better stick to RGB24 then.
luiscgalo wrote:Regarding the 12-bit format, it is quite strange and unclear to me...
I can detect the bottom/top fields but I don't know how to receive and process the received raw data.
I've tried to receive raw data on the rawcam component with MMAL_ENCODING_BAYER_SBGGR12P encoding (instead of BGR24), saving one field to a file.
However I was not able to read/decode any clear image from the received data.
Even if it is possible to receive I422 12bit from the TC358743, which format should be set as input to the ISP block?
BAYER_SBGGR12P is a very odd packed format used by camera sensors. https://www.linuxtv.org/downloads/v4l-d ... gb12p.html is the easiest thing to point you at as a description, but suffice to say it is not what you want.
The YCbCr422 format is MMAL_ENCODING_UYVY, and would be on image ID 0x1E.
The YCbCr444 or YCbCr422 24bpp formats gain you nothing over RGB24, so you may as well ignore them. (In theory they might save a colour space conversion, but AIUI most HDMI sources will generate RGB rather than YCbCr data).
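For completeness, if the chip could actually produce that, pointing rawcam at it would look something along these lines (a rough, untested sketch; the MMAL_PARAMETER_CAMERA_RX_CONFIG_T fields are the ones raspiraw uses, and "rawcam" is assumed to be your already created vc.ril.rawcam component):

Code: Select all

/* Untested sketch: tell rawcam to expect CSI-2 data type 0x1E (YCbCr422 8-bit). */
MMAL_PARAMETER_CAMERA_RX_CONFIG_T rx_cfg = {{MMAL_PARAMETER_CAMERA_RX_CONFIG, sizeof(rx_cfg)}};

mmal_port_parameter_get(rawcam->output[0], &rx_cfg.hdr);
rx_cfg.image_id = 0x1E;                             /* YCbCr422 8-bit data type */
rx_cfg.unpack = MMAL_CAMERA_RX_CONFIG_UNPACK_NONE;  /* no raw Bayer unpacking wanted */
rx_cfg.pack = MMAL_CAMERA_RX_CONFIG_PACK_NONE;
if (mmal_port_parameter_set(rawcam->output[0], &rx_cfg.hdr) != MMAL_SUCCESS)
    printf("Failed to set rawcam RX config\n");
/* ...and the rawcam output port format would then be MMAL_ENCODING_UYVY rather than BGR24. */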
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

luiscgalo
Posts: 25
Joined: Fri Jun 22, 2018 9:09 am

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Jul 16, 2018 11:06 am

6by9 wrote: Wrong field order?
Hum, I don't think so because if I set the "img_fx_param.effect_parameter[2]=1" to half the frame rate of the deinterlacer (25fps instead of 50fps) the final video is smooth, without "back and forward" issues.

Yesterday, I've performed some additional tests to exclude any potential problems related to the ISP and/or video_encode components: basically I've connected the ISP output to the video_encode input directly, bypassing the deinterlacer (rawcam -> top/bottom field merge [25fps] -> ISP -> video_encode).
In this first test, the output video was 1080p25 and smooth - of course, without deinterlacing, you can see the characteristic pattern when there are pan movements in the video. However, ignoring that fact, the video was smooth, without back and forward issues.

On a second attempt, in order to reproduce the real case scenario of trying to encode 1080p50, I've changed a little bit my code to trigger ISP conversions at field rate (not frame rate).
Basically, I have produced a new 1920x1080 frame every time a new interlaced field arrives (rawcam -> top/bottom field as FullHD frame [50fps] -> ISP -> video_encode).
The result of this test was also a success. The RPi was capable of making ISP conversions and video encoding of FullHD frames at 50fps.

My conclusion is that, for some reason unknown to me, the deinterlacer component (image_fx) is not capable of properly converting the FullHD video feed from 25i to 50p...
6by9 wrote: That sounds a touch excessive! I should have an easy setup to test this, so will do so on Monday.
Yes! :D
I don't know why but even a simple test with ISP component converting from BGR24 to SAND has that catastrophic side effect :(
6by9 wrote: That would do it. You'd better stick to RGB24 then.
I agree with you.
I think it is better to stay with BGR24 since it is well supported and the ISP block can perform successful conversions in real time.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 5372
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Jul 16, 2018 11:21 am

luiscgalo wrote:
Mon Jul 16, 2018 11:06 am
6by9 wrote: Wrong field order?
Hum, I don't think so because if I set the "img_fx_param.effect_parameter[2]=1" to half the frame rate of the deinterlacer (25fps instead of 50fps) the final video is smooth, without "back and forward" issues.
It depends on exactly which two fields it chooses to put together to make up the 25fps out.
The other possibility is that it is producing the frames out of order but with correct timestamps. That works fine when passing the frames via a scheduler to display them, but not when encoding.
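An easy way to check for that would be to log the PTS order in your image_fx output callback. A rough sketch (last_pts being a new file-scope variable; the rest of output_port_cb stays as you have it):

Code: Select all

static int64_t last_pts = MMAL_TIME_UNKNOWN;

void output_port_cb(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer) {
	/* Flag any buffer whose timestamp goes backwards relative to the previous one. */
	if (buffer->pts != MMAL_TIME_UNKNOWN) {
		if (last_pts != MMAL_TIME_UNKNOWN && buffer->pts < last_pts)
			printf("Out of order buffer: pts %lld after %lld\n",
			       (long long)buffer->pts, (long long)last_pts);
		last_pts = buffer->pts;
	}

	/* ...existing buffer handling and mmal_buffer_header_release() as before... */
}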
luiscgalo wrote:Yesterday, I've performed some additional tests to exclude any potential problems related to the ISP and/or video_encode components: basically I've connected the ISP output to the video_encode input directly, bypassing the deinterlacer (rawcam -> top/bottom field merge [25fps] -> ISP -> video_encode).
In this first test, the output video was 1080p25 and smooth - of course, without deinterlacing, you can see the characteristic pattern when there are pan movements in the video. However, ignoring that fact, the video was smooth, without back and forward issues.

On a second attempt, in order to reproduce the real case scenario of trying to encode 1080p50, I've changed a little bit my code to trigger ISP conversions at field rate (not frame rate).
Basically, I have produced a new 1920x1080 frame every time a new interlaced field arrives (rawcam -> top/bottom field as FullHD frame [50fps] -> ISP -> video_encode).
The result of this test was also a success. The RPi was capable of making ISP conversions and video encoding of FullHD frames at 50fps.

My conclusion is that, for some reason unknown to me, the deinterlacer component (image_fx) is not capable of properly converting the FullHD video feed from 25i to 50p...
For 1080i25 I think you have to use the QPU version. XBMC/Kodi is the main user, and has code

Code: Select all

    OMX_CONFIG_IMAGEFILTERPARAMSTYPE image_filter;
    OMX_INIT_STRUCTURE(image_filter);

    image_filter.nPortIndex = m_omx_image_fx.GetOutputPort();
    image_filter.nNumParams = 4;
    image_filter.nParams[0] = 3;
    image_filter.nParams[1] = 0;
    image_filter.nParams[2] = half_framerate;
    image_filter.nParams[3] = 1; // qpu
    if (!advanced_deinterlace)
      image_filter.eImageFilter = OMX_ImageFilterDeInterlaceFast;
    else
      image_filter.eImageFilter = OMX_ImageFilterDeInterlaceAdvanced;

omx_err = m_omx_image_fx.SetConfig(OMX_IndexConfigCommonImageFilterParameters, &image_filter);
That maps directly to MMAL's MMAL_PARAMETER_IMAGE_EFFECT_PARAMETERS.
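On the MMAL side that would translate to something like the following (a sketch only, reusing the MMAL_PARAMETER_IMAGEFX_PARAMETERS_T setup from your code, with deint_output being the image_fx output port):

Code: Select all

MMAL_PARAMETER_IMAGEFX_PARAMETERS_T img_fx_param;
memset(&img_fx_param, 0, sizeof(img_fx_param));
img_fx_param.hdr.id = MMAL_PARAMETER_IMAGE_EFFECT_PARAMETERS;
img_fx_param.hdr.size = sizeof(img_fx_param);
img_fx_param.effect = MMAL_PARAM_IMAGEFX_DEINTERLACE_ADV; /* advanced deinterlace */
img_fx_param.num_effect_params = 4;
img_fx_param.effect_parameter[0] = 3; /* interlaced, both fields in one buffer, top field first */
img_fx_param.effect_parameter[1] = 0; /* frame period - left at 0 as in your code */
img_fx_param.effect_parameter[2] = 0; /* 0 = full output rate (50p), 1 = half rate (25p) */
img_fx_param.effect_parameter[3] = 1; /* use the QPU version */
if (mmal_port_parameter_set(deint_output, &img_fx_param.hdr) != MMAL_SUCCESS)
	printf("Failed to configure image_fx deinterlace parameters\n");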
luiscgalo wrote:
6by9 wrote: That sounds a touch excessive! I should have an easy setup to test this, so will do so on Monday.
Yes! :D
I don't know why but even a simple test with ISP component converting from BGR24 to SAND has that catastrophic side effect :(
It's probably trampling over memory it shouldn't be. SAND is a funny format so it's possible that a pointer has just been missed.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 5372
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Jul 16, 2018 12:42 pm

6by9 wrote:
Mon Jul 16, 2018 11:21 am
luiscgalo wrote:
6by9 wrote: That sounds a touch excessive! I should have an easy setup to test this, so will do so on Monday.
Yes! :D
I don't know why but even a simple test with ISP component converting from BGR24 to SAND has that catastrophic side effect :(
It's probably trampling over memory it shouldn't be. SAND is a funny format so it's possible that a pointer has just been missed.
OK, so my quickest test (yavta) is actually UYVY to SAND, and that is working fine going from ISP to video_render and video_encode. That is against my current development firmware rather than the released one.
Input and output are pretty much disassociated, therefore seeing as you've had RGB in working fine, I'm fairly happy that the current code is OK.
Can you share your setup code? Otherwise I could be hunting for a fair while.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

luiscgalo
Posts: 25
Joined: Fri Jun 22, 2018 9:09 am

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Jul 16, 2018 2:06 pm

Sure, I can share my code with you.
Currently I'm at my workplace and have no access to the latest version of my source code on my "cloud" share.
However, I have some code from last week where the BGR24 to I420 conversion is made, including the deinterlacing. (top/bottom field merge -> ISP -> image_fx)
Basically, the only part missing here is the last connection with the video_encode component.

Code: Select all

#include "converter.h"

#include "interface/mmal/mmal.h"
#include "interface/mmal/mmal_buffer.h"
#include "interface/mmal/util/mmal_util.h"
#include "interface/mmal/util/mmal_util_params.h"
#include "interface/mmal/util/mmal_connection.h"
#include "interface/mmal/util/mmal_default_components.h"

uint8_t* punRGBFrame = NULL;
uint32_t unRGBFrameSize = 0;

MMAL_COMPONENT_T *pISP = NULL;
MMAL_PORT_T *isp_input = NULL;
MMAL_PORT_T *isp_output = NULL;
MMAL_POOL_T *pool_in = NULL;

MMAL_COMPONENT_T *pImageFx = NULL;
MMAL_PORT_T *deint_input = NULL;
MMAL_PORT_T *deint_output = NULL;
MMAL_POOL_T *pool_out = NULL;

MMAL_CONNECTION_T* conn_isp_deint = NULL;

void ConvertFrame() {
	MMAL_BUFFER_HEADER_T *buffer;
	MMAL_STATUS_T status;

	// send data to the input port of ISP component
	buffer = mmal_queue_get(pool_in->queue);
	if (buffer != NULL) {
		mmal_buffer_header_mem_lock(buffer);
		buffer->flags |= MMAL_BUFFER_HEADER_VIDEO_FLAG_INTERLACED | MMAL_BUFFER_HEADER_VIDEO_FLAG_TOP_FIELD_FIRST;
		buffer->length = unRGBFrameSize;  //buffer->alloc_size; 
		buffer->pts = buffer->dts = MMAL_TIME_UNKNOWN;
		memcpy(buffer->data, punRGBFrame, unRGBFrameSize);
		mmal_buffer_header_mem_unlock(buffer);

		printf("Sending frame to ISP input (%d bytes)\n", buffer->length);
		status = mmal_port_send_buffer(isp_input, buffer);
		if (status != MMAL_SUCCESS) {
			printf("Error sending data to ISP input!\n");
		}
	} else {
		printf("\t\tISP input buffer not available!\n");
	}
}

void ConvertVideoField(SImageData sField) {
	uint16_t i, unFieldOffset;
	uint32_t unSourcePos, unDestPos;

	uint8_t* punSourceData = sField.ptrData;
	const uint32_t unWidthStep = 1920 * 3;

	if (sField.unField == FIELD_TOP) {
		//printf("Top field...\n");
		unFieldOffset = 0;
	} else {
		//printf("Bottom field...\n");
		unFieldOffset = 1;
	}

    // merge individual field into odd/even rows of the frame
	unDestPos = 0;
	unSourcePos = 0;
	for (i = 0; i < 540; i++) {
		unSourcePos = i * unWidthStep;
		unDestPos = (i * 2 + unFieldOffset) * unWidthStep;
		memcpy(&punRGBFrame[unDestPos], &punSourceData[unSourcePos], unWidthStep);
	}

	if (sField.unField == FIELD_BOTTOM) {
		ConvertFrame();
	}
}

void input_port_cb(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer) {
	mmal_buffer_header_release(buffer);
}

FILE * pFile = NULL;
uint16_t nCnt = 0;

void output_port_cb(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer) {
	printf("image_fx output buffer! Size=%i\n", buffer->length);

	mmal_buffer_header_mem_lock(buffer);
    // save one raw I420 frame
	if ((pFile == NULL) && (nCnt==150)) {
		pFile = fopen("out.i420", "wb");
		printf("Writing frame! Bytes=%i\n", buffer->length);
		fwrite(buffer->data, 1, buffer->length, pFile);
		fclose(pFile);
	}
	nCnt++;
	mmal_buffer_header_mem_unlock(buffer);

	mmal_buffer_header_release(buffer);

	// send next output buffer
	if (port->is_enabled) {
		MMAL_BUFFER_HEADER_T *NewBuffer = mmal_queue_get(pool_out->queue);
		if (NewBuffer) {
			mmal_port_send_buffer(port, NewBuffer);
		}
	}
}

void InitConverter() {
	int32_t unInWidth = 1920;
	int32_t unInHeight = 1080;
    MMAL_STATUS_T status;

	// allocate space for RGB24 buffer
	unRGBFrameSize = VCOS_ALIGN_UP(unInWidth, 32) * VCOS_ALIGN_UP(unInHeight, 16) * 3;
	punRGBFrame = (uint8_t*) malloc(unRGBFrameSize);

	// ###################################### ISP converter ##################################################

	status = mmal_component_create("vc.ril.isp", &pISP);
	if (status != MMAL_SUCCESS) {
		printf("Failed to create ISP!\n");
		return;
	}

	isp_input = pISP->input[0];
	isp_output = pISP->output[0];

	isp_input->format->encoding = MMAL_ENCODING_BGR24;
	isp_input->format->es->video.width = VCOS_ALIGN_UP(unInWidth, 32);
	isp_input->format->es->video.height = VCOS_ALIGN_UP(unInHeight, 16);
	isp_input->format->es->video.crop.x = 0;
	isp_input->format->es->video.crop.y = 0;
	isp_input->format->es->video.crop.width = unInWidth;
	isp_input->format->es->video.crop.height = unInHeight;
	isp_input->format->es->video.frame_rate.num = 0;
	isp_input->format->es->video.frame_rate.den = 1;
	status = mmal_port_format_commit(isp_input);
	if (status != MMAL_SUCCESS) {
		printf("Failed to commit converter input format!\n");
		return;
	}

	isp_input->buffer_size = isp_input->buffer_size_recommended;
	isp_input->buffer_num = isp_input->buffer_num_recommended;
	printf("ISP input buffer size %i bytes\n", isp_input->buffer_size);

	// create pool for input data
	pool_in = mmal_port_pool_create(isp_input, isp_input->buffer_num, isp_input->buffer_size);
	if (pool_in == NULL) {
		printf("Failed to create ISP input pool!\n");
	}

	// Setup ISP output (copy of input format, changing only the output encoding)
	mmal_format_copy(isp_output->format, isp_input->format);
	isp_output->format->encoding = MMAL_ENCODING_I420;
    //isp_output->format->encoding = MMAL_ENCODING_YUVUV128;  /// ---> SAND format
	status = mmal_port_format_commit(isp_output);
	if (status != MMAL_SUCCESS) {
		printf("Failed to commit converter output format!\n");
		return;
	}

	isp_output->buffer_size = isp_output->buffer_size_recommended;
	isp_output->buffer_num = isp_output->buffer_num_recommended;

	// Enable ISP input port
	status = mmal_port_enable(isp_input, input_port_cb);
	if (status != MMAL_SUCCESS) {
		printf("Error enabling converter input port!\n");
		return;
	}

	// ################################## deinterlacer ######################################################
	status = mmal_component_create("vc.ril.image_fx", &pImageFx);
	if (status != MMAL_SUCCESS) {
		printf("Failed to create image_fx!\n");
		return;
	}

	deint_input = pImageFx->input[0];
	deint_output = pImageFx->output[0];

	MMAL_PARAMETER_IMAGEFX_PARAMETERS_T img_fx_param;
	memset(&img_fx_param, 0, sizeof(MMAL_PARAMETER_IMAGEFX_PARAMETERS_T));
	img_fx_param.hdr.id = MMAL_PARAMETER_IMAGE_EFFECT_PARAMETERS;
	img_fx_param.hdr.size = sizeof(MMAL_PARAMETER_IMAGEFX_PARAMETERS_T);
	img_fx_param.effect = MMAL_PARAM_IMAGEFX_DEINTERLACE_FAST;

	img_fx_param.num_effect_params = 4;
	img_fx_param.effect_parameter[0] = 3; // interlaced data / two fields sent together / TFF
	img_fx_param.effect_parameter[1] = 0; // frame period (1000000 / fps);
	img_fx_param.effect_parameter[2] = 0; // half frame rate
	img_fx_param.effect_parameter[3] = 0; // use QPU? - not used, only for advanced deinterlacing

	if (mmal_port_parameter_set(deint_output, &img_fx_param.hdr) != MMAL_SUCCESS) {
		printf("Failed to configure deinterlacer output port (mode)\n");
		return;
	}

	// Setup image_fx input format
	mmal_format_copy(deint_input->format, isp_output->format);
	status = mmal_port_format_commit(deint_input);
	if (status != MMAL_SUCCESS) {
		printf("Failed to commit image_fx input format!\n");
		return;
	}

	// Setup image_fx output format
	mmal_format_copy(deint_output->format, deint_input->format);
	status = mmal_port_format_commit(deint_output);
	if (status != MMAL_SUCCESS) {
		printf("Failed to commit image_fx output format!\n");
		return;
	}

	deint_output->buffer_size = deint_output->buffer_size_recommended;
	deint_output->buffer_num = deint_output->buffer_num_recommended;

	// create pool for deinterlacer output data
	pool_out = mmal_port_pool_create(deint_output, deint_output->buffer_num, deint_output->buffer_size);
	if (pool_out == NULL) {
		printf("Failed to create deinterlacer output pool!\n");
	}

	// Enable output port
	status = mmal_port_enable(deint_output, output_port_cb);
	if (status != MMAL_SUCCESS) {
		printf("Error enabling deinterlacer output port!\n");
		return;
	}

	printf("Create connection ISP output to image_fx input...\n");
	status = mmal_connection_create(&conn_isp_deint, isp_output, deint_input, MMAL_CONNECTION_FLAG_TUNNELLING);
	if (status != MMAL_SUCCESS) {
		printf("Failed to create connection status %d: ISP->image_fx\n", status);
		return;
	}

	status = mmal_connection_enable(conn_isp_deint);
	if (status != MMAL_SUCCESS) {
		printf("Failed to enable connection ISP->image_fx\n");
		return;
	}

	// ###################################### generic initialization ##################################################

	status = mmal_component_enable(pISP);
	if (status != MMAL_SUCCESS) {
		printf("Error enabling converter!\n");
		return;
	}

	status = mmal_component_enable(pImageFx);
	if (status != MMAL_SUCCESS) {
		printf("Error enabling converter!\n");
		return;
	}

	// send output buffers
	for (uint8_t i = 0; i < deint_output->buffer_num; i++) {
		MMAL_BUFFER_HEADER_T *buffer = mmal_queue_get(pool_out->queue);

		if (!buffer) {
			printf("Buffer is NULL!\n");
			exit(1);
		}
		status = mmal_port_send_buffer(deint_output, buffer);
		if (status != MMAL_SUCCESS) {
			printf("mmal_port_send_buffer failed on buffer %p, status %d\n", buffer, status);
			exit(1);
		}
	}

	printf("ISP converter init OK!\n");
}

void CloseConverter() {
	MMAL_STATUS_T status;

	printf("Closing converter...\n");

	// disable and destroy connection between ISP and deinterlacer
	mmal_connection_disable(conn_isp_deint);
	mmal_connection_destroy(conn_isp_deint);

	// disable ports
	mmal_port_disable(isp_input);
	mmal_port_disable(deint_output);

	// disable ISP
	status = mmal_component_disable(pISP);
	if (status != MMAL_SUCCESS) {
		printf("Failed to disable ISP component!\n");
	}

	// disable deinterlacer
	status = mmal_component_disable(pImageFx);
	if (status != MMAL_SUCCESS) {
		printf("Failed to disable ISP component!\n");
	}

	// destroy components
	mmal_component_destroy(pISP);
	mmal_component_destroy(pImageFx);
}
This source code belongs to my "converter.c" source file.
During initialization you should call "InitConverter".
When the rawcam component is working (that source code is in another file), each received top/bottom field triggers the "ConvertVideoField" function.
The SImageData argument is just a custom struct carrying the buffer pointer, length and field type:

Code: Select all

typedef struct {
	uint8_t* ptrData;
	uint32_t unDataLength;
	uint8_t unField;
} SImageData;
Then the output I420 deinterlaced image data is available when the "output_port_cb" callback is invoked.

If I'm not missing any detail, the only major difference to my latest version of the code is the connection with the video_encode block.
If you want, tonight I can share with you my latest version of the code. :)

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 5372
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Jul 16, 2018 2:28 pm

OK, that looks reasonable. It's possible that it's resolution dependent. I'll have to look another time.

FYI, my overclocking settings are:

Code: Select all

over_voltage=4
gpu_freq=480
sdram_freq=550
sdram_schmoo=0x02000020
over_voltage_sdram_p=6
over_voltage_sdram_i=4
over_voltage_sdram_c=4
force_turbo=1
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

grimepoch
Posts: 82
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jul 20, 2018 7:48 pm

Whew, after more boards and time than I'd like to admit, I have raspiraw receiving data consistently from an ADV7282A-M chip. I modified the register settings being transferred to do an NTSC 480p test pattern. For reference, here is what I used:

Code: Select all

:Color Bars 480p MIPI Out:
delay 10 ;
42 0F 00 ; Exit power down mode
42 00 04 ; ADI Required Write 
42 0C 37 ; Force Free-run mode 
42 02 54 ; Force standard to NTSC-M 
42 14 11 ; Set Free-run pattern to color bars
42 03 4E ; ADI Required Write
42 04 57 ; Enable Intrq pin
42 13 00 ; Enable INTRQ output driver
42 17 41 ; select SH1
42 1D C0 ; Tri-State LLC output driver
42 52 CB ; ADI Required Write
42 80 51 ; ADI Required Write
42 81 51 ; ADI Required Write
42 82 68 ; ADI Required Write
42 5D 3C ; Enable Diagnostic pin 1 - Level=1.125V
42 5E 3C ; Enable Diagnostic pin 2 - Level=1.125V
42 FD 84 ; Set VPP Map Address
84 A3 00 ; ADI Required Write
84 5B 00 ; Advanced Timing Enabled
84 55 80 ; Enable I2P
42 FE 88 ; Set CSI Map Address
88 01 20 ; ADI Required Write
88 02 28 ; ADI Required Write
88 03 38 ; ADI Required Write
88 04 30 ; ADI Required Write
88 05 30 ; ADI Required Write
88 06 80 ; ADI Required Write
88 07 70 ; ADI Required Write
88 08 50 ; ADI Required Write
88 DE 02 ; Power up D-PHY
88 D2 F7 ; ADI Required Write
88 D8 65 ; ADI Required Write
88 E0 09 ; ADI Required Write
88 2C 00 ; ADI Required Write
88 1D 80 ; ADI Required Write
88 00 00 ; Power up MIPI CSI-2 Tx --done--
End
I tried using dcraw to translate the output, but it just says it cannot decode them. Wondering what I could possibly be doing wrong.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 5372
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jul 20, 2018 8:21 pm

grimepoch wrote:
Fri Jul 20, 2018 7:48 pm
Whew, after more boards and time than I'd like to admit, I have raspiraw receiving data consistently from an ADV7282A-M chip. I modified the register settings being transferred to do an NTSC 480p test pattern. For reference, here is what I used:

Code: Select all

:Color Bars 480p MIPI Out:
delay 10 ;
42 0F 00 ; Exit power down mode
42 00 04 ; ADI Required Write 
42 0C 37 ; Force Free-run mode 
42 02 54 ; Force standard to NTSC-M 
42 14 11 ; Set Free-run pattern to color bars
42 03 4E ; ADI Required Write
42 04 57 ; Enable Intrq pin
42 13 00 ; Enable INTRQ output driver
42 17 41 ; select SH1
42 1D C0 ; Tri-State LLC output driver
42 52 CB ; ADI Required Write
42 80 51 ; ADI Required Write
42 81 51 ; ADI Required Write
42 82 68 ; ADI Required Write
42 5D 3C ; Enable Diagnostic pin 1 - Level=1.125V
42 5E 3C ; Enable Diagnostic pin 2 - Level=1.125V
42 FD 84 ; Set VPP Map Address
84 A3 00 ; ADI Required Write
84 5B 00 ; Advanced Timing Enabled
84 55 80 ; Enable I2P
42 FE 88 ; Set CSI Map Address
88 01 20 ; ADI Required Write
88 02 28 ; ADI Required Write
88 03 38 ; ADI Required Write
88 04 30 ; ADI Required Write
88 05 30 ; ADI Required Write
88 06 80 ; ADI Required Write
88 07 70 ; ADI Required Write
88 08 50 ; ADI Required Write
88 DE 02 ; Power up D-PHY
88 D2 F7 ; ADI Required Write
88 D8 65 ; ADI Required Write
88 E0 09 ; ADI Required Write
88 2C 00 ; ADI Required Write
88 1D 80 ; ADI Required Write
88 00 00 ; Power up MIPI CSI-2 Tx --done--
End
Can I push you in the direction of the recently merged V4L2 driver and overlay for the ADV7282-M?

Code: Select all

dtoverlay=adv7282m
should do it on a Pi 3 or 3B+

Code: Select all

dtoverlay=adv7282m,i2c_pins_28_29=1
should do it on a Pi 0, 0W, A+, B+, or Pi2.
Sorry, I haven't the necessary device tree bits for an original A or B as it gets a touch complicated.
You should get /dev/video0. Setting the colour format ("v4l2-ctl -s pal-b" or "v4l2-ctl -s ntsc") will select the standard and set the resolution (720x576 or 720x480).
As a basic app to exercise it, https://github.com/6by9/yavta takes one of the standard V4L2 test apps and adds in some of the Pi video acceleration via MMAL.

Code: Select all

./yavta --capture=1000 -n 3 --encode-to=file.h264 -f UYVY -m -T /dev/video0
should select the auto-detected video standard, and then display the video and encode it to file.h264.

I have it set up and working on my desk, and have tested it with CVBS, S-Video, and Component input, although admittedly always PAL input. The one quirk is that V4L2 is missing a sensible input selection mechanism. The default is CVBS on AIN1. "echo 8 > /sys/module/adv7180/parameters/dbg_input" selects S-Video on AIN1 & 2. For other options look at the driver source.
I'm also aware of one other person who has made up a dev board using the ADV7280A-M and has it working through this driver.
grimepoch wrote:I tried using dcraw to translate the output, but it just says it cannot decode them. Wondering what I could possibly be doing wrong.
dcraw is designed around Bayer sensors. The ADV7282 is producing YUV422 as UYVY. The conversion from YUV to RGB is relatively simple mathematically, with no need for demosaicing, sharpening, white balance compensation, or lens shading correction. dcraw is totally the wrong tool.
If you're using an x86 Linux machine then I can recommend Vooya for viewing YUV or RGB images. It's available for Windows and MacOS too, but I haven't tried it, and they charge a small amount for it. Sadly it's not available for ARM Linux.
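If you just want a quick look at a captured UYVY frame without installing anything, the maths really is small. A minimal sketch of a UYVY-to-PPM converter (my own BT.601 limited-range approximation, not anything from the Pi firmware):

Code: Select all

#include <stdint.h>
#include <stdio.h>
#include <stdlib.h>

static uint8_t clamp8(int v) { return v < 0 ? 0 : (v > 255 ? 255 : (uint8_t)v); }

/* Convert a packed UYVY image (2 bytes/pixel) to a binary PPM.
 * Usage: uyvy2ppm in.raw width height out.ppm */
int main(int argc, char *argv[])
{
	if (argc != 5) { fprintf(stderr, "usage: %s in.raw width height out.ppm\n", argv[0]); return 1; }
	int width = atoi(argv[2]), height = atoi(argv[3]);
	FILE *in = fopen(argv[1], "rb"), *out = fopen(argv[4], "wb");
	if (!in || !out) { perror("fopen"); return 1; }

	uint8_t *line = malloc(width * 2);
	fprintf(out, "P6\n%d %d\n255\n", width, height);

	for (int y = 0; y < height; y++) {
		if (fread(line, 1, width * 2, in) != (size_t)(width * 2)) break;
		for (int x = 0; x < width; x += 2) {
			int u  = line[x * 2 + 0] - 128;   /* UYVY byte order: U Y0 V Y1 */
			int y0 = line[x * 2 + 1] - 16;
			int v  = line[x * 2 + 2] - 128;
			int y1 = line[x * 2 + 3] - 16;
			for (int i = 0; i < 2; i++) {
				int yy = i ? y1 : y0;
				uint8_t rgb[3];
				rgb[0] = clamp8((298 * yy + 409 * v + 128) >> 8);
				rgb[1] = clamp8((298 * yy - 100 * u - 208 * v + 128) >> 8);
				rgb[2] = clamp8((298 * yy + 516 * u + 128) >> 8);
				fwrite(rgb, 1, 3, out);
			}
		}
	}
	free(line); fclose(in); fclose(out);
	return 0;
}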
Last edited by 6by9 on Fri Jul 20, 2018 8:28 pm, edited 1 time in total.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

grimepoch
Posts: 82
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jul 20, 2018 8:26 pm

I don't mind going in that direction, if it can handle an interlaced input signal as a next step. Right now I am just using progressive to test, using the internal line doubler, but the end use case is interlaced NTSC and PAL input.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 5372
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jul 20, 2018 8:39 pm

grimepoch wrote:
Fri Jul 20, 2018 8:26 pm
I don't mind going in that direction, if it can handle an interlaced input signal as a next step. Right now I am just using progressive to test, using the internal line doubler, but the end use case is interlaced NTSC and PAL input.
No, I am deliberately ignoring interlaced - it's a pain on many devices, and I'm not sure it's an issue that can be solved on the ADV7282M.

With interlaced video you need to be able to identify whether the frame you've just received is the top or bottom field. Over the BT656 interface there is an F line. Over CSI-2 there's no such thing. I've seen some devices that alter the CSI-2 data_type field, some send a signal in the line number field of the line start/end short packets, and some seem to ignore the problem. The ADV7282 appears to fall into that last category as I can see no reference in the datasheet to suitable signalling. I haven't experimented myself to see if there is any difference.

What have you got against the internal I2P block?
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 5372
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jul 20, 2018 8:49 pm

It may just work, but I'm not convinced.
The adv7180 driver advertises that it does "V4L2_FIELD_INTERLACED", for which the definition is "Images contain both fields, interleaved line by line. The temporal order of the fields (whether the top or bottom field is first transmitted) depends on the current video standard. M/NTSC transmits the bottom field first, all other standards the top field first."
Then again, unless the ADV7282 is buffering the whole of the first field (very unlikely) it has no way of producing such a line interlaced frame.

Currently the Unicam (CSI2 receiver) V4L2 driver deliberately selects a non-interlaced format.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

grimepoch
Posts: 82
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jul 20, 2018 8:59 pm

The biggest problem I have with the I2P block is that it just interpolates the current half frame into a full frame. I'd rather get 2 half frames and stitch them.

This is from an engineer who supports the ADV7282M:

"in interlaced mode odd frames have one extra line than even frames."

I tried doing an 'rpi-update' to get the latest updates; however, that seems to break my PCM drivers. Oops. :) (And loading the overlay did not show the /dev/video device yet.)

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 5372
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jul 20, 2018 9:09 pm

grimepoch wrote:
Fri Jul 20, 2018 8:59 pm
This is from an engineer that supports the ADV7282M

"in interlaced mode odd frames have one extra line than even frames."
Your issue is going to be detecting that.
rawcam (as used by raspiraw) will always tell you that the buffer was filled as it got a frame end interrupt.
Similarly the V4L2 driver will give you a struct v4l2_buffer with a bytesused field set to the frame size.
The peripheral does have a write pointer register, but that is generally only used for debug.

I think your only hope is to write a magic value to the start of the last line. If your buffer comes back with that intact, then it was the short field, otherwise it is the long one. It's pretty grotty, but it's the only solution I can think of off the top of my head.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

grimepoch
Posts: 82
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jul 20, 2018 9:30 pm

Would that work for the V4L2 driver, or just the raw cam method? Makes sense what you are suggesting. Since this is an embedded system, I am fine with using a magic key.

For now I am reconstructing/converting the frame inside a shader, so I have quite a bit of control over what is happening on that part of the process.

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 5372
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jul 20, 2018 9:41 pm

grimepoch wrote:
Fri Jul 20, 2018 9:30 pm
Would that work for the V4L2 driver, or just the raw cam method? Makes sense what you are suggesting. Since this is an embedded system, I am fine with using a magic key.
Apart from the V4L2 driver actively blocking you from selecting the interlaced mode, it should work in either.
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

grimepoch
Posts: 82
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jul 20, 2018 9:54 pm

Not sure I understand. If the V4L2 driver blocks me from selecting interlaced mode, how would I select that? Or would I be using it with the driver thinking I am doing progressive, and just handling the half-frame buffers that get returned myself?

Also not sure why I don't see /dev/video with the latest 4.14.56-v7+ build. Should I have used something different to pull the latest V4L2 changes? (I am using a Compute Module 3 Lite btw.)

grimepoch
Posts: 82
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jul 20, 2018 9:56 pm

This is what is see in dmesg:

[ 4.541221] adv7180 0-0021: chip found @ 0x21 (bcm2835 I2C adapter)
[ 4.590683] adv7180: probe of 0-0021 failed with error -5

This is what I added in my config.txt:

# Test for video in
dtoverlay=adv7282m

grimepoch
Posts: 82
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jul 20, 2018 10:17 pm

While I cannot find any reference for what -5 is, I am thinking this is a problem with pins 0 and 1 on the Pi. They come up active, as part of I2C, by default. In my code, I am able to disable them before I use the I2C bus. But when this comes up, they are likely on.

So I need to figure out how to shut those off BEFORE it attempts to communicate with the V4L2 driver.

grimepoch
Posts: 82
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Fri Jul 20, 2018 10:50 pm

Confirmed. Setting the pin settings in the config.txt before defining the overlay gets video0 showing!

Still unclear on using it interlaced - whether there is a way for me to force the V4L2 driver to allow that and let me decode the frames as we discussed.

I guess sending it commands the way you recommend for switching the input (which I need to do anyway, as I am using multiple input configurations) might work.
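Something like this little helper is what I have in mind for switching inputs from my application (just a sketch; the sysfs path and the value 8 for S-Video on AIN1 & 2 are the ones mentioned above, other values I'd still need to dig out of the adv7180 driver source):

Code: Select all

#include <stdio.h>

/* Select the analogue input via the adv7180 debug module parameter. */
static int select_adv7180_input(int value)
{
	FILE *f = fopen("/sys/module/adv7180/parameters/dbg_input", "w");
	if (!f) {
		perror("dbg_input");
		return -1;
	}
	fprintf(f, "%d\n", value);
	fclose(f);
	return 0;
}

/* e.g. select_adv7180_input(8);   S-Video on AIN1 & 2 */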

grimepoch
Posts: 82
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Sat Jul 21, 2018 9:00 pm

Works! I am able to use all the interfaces I need. What I don't understand is that I am not doing anything special and the interlaced signal looks FANTASTIC. It doesn't look like it is using the I2P, yet I have no idea how it is working given your thoughts on the topic.

grimepoch
Posts: 82
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Sun Jul 22, 2018 4:11 am

After much further digging, it is in fact the I2P processor. Once I had something with horizontal lines coming in, you could see the problem. This destroyed text rendering of course.

Using i2cset while it was running, I turned off the I2P, which then just filled half frames when displaying with MMAL.

Not sure what the proper way to make changes to the settings is. Right now I'm working on extracting the frame data to take a look at, so I can look at injecting my own special key on that extra line (or finding a better way to do it).

6by9
Raspberry Pi Engineer & Forum Moderator
Posts: 5372
Joined: Wed Dec 04, 2013 11:27 am
Location: ZZ9 Plural Z Alpha, aka just outside Cambridge.

Re: Raw sensor access / CSI-2 receiver peripheral

Sun Jul 22, 2018 8:37 pm

CSI2 receiver changes required at https://github.com/raspberrypi/linux/bl ... cam.c#L749

If the driver could determine which field is which, then the struct v4l2_buffer should be returned with the field field being set appropriately. As I've said, in this case we can't, so the driver just copies the set field mode into the buffer.

Within yavta, ioctl(dev->fd, VIDIOC_DQBUF, &buf); dequeues the buffer from V4L2 before passing it to MMAL. VIDIOC_QBUF is the partner function that is returning a buffer to V4L2. So that gives you your two points to try checking and inserting magic sequences.

If I've read the code correctly, then dev->buffers[buf->index].mem[0] should be an mmap of the buffer, so you don't need to do too much work. dev->plane_fmt[0].bytesperline should be the stride for each line, so ((uint8_t*)dev->buffers[buf->index].mem[0])[dev->plane_fmt[0].bytesperline * y + x*2] should allow you to find any point in the image.
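As a sketch of how the magic-value trick could hang off those two points (untested; struct device and the member names are yavta's internals as referenced above, and FIELD_MAGIC is an arbitrary sentinel):

Code: Select all

#include <stdint.h>

#define FIELD_MAGIC 0xDEADBEEFu

/* Just before VIDIOC_QBUF: stamp the first 4 bytes of the buffer's last line.
 * buf_height is the number of lines the buffer was sized for (the long field). */
static void stamp_last_line(struct device *dev, unsigned int index, unsigned int buf_height)
{
	uint8_t *base = (uint8_t *)dev->buffers[index].mem[0];
	uint32_t *last = (uint32_t *)(base + dev->plane_fmt[0].bytesperline * (buf_height - 1));
	*last = FIELD_MAGIC;
}

/* Just after VIDIOC_DQBUF: if the sentinel survived, the receiver wrote one line
 * less than the buffer holds, i.e. this was the short field. */
static int was_short_field(struct device *dev, unsigned int index, unsigned int buf_height)
{
	const uint8_t *base = (const uint8_t *)dev->buffers[index].mem[0];
	const uint32_t *last = (const uint32_t *)(base + dev->plane_fmt[0].bytesperline * (buf_height - 1));
	return *last == FIELD_MAGIC;
}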
Software Engineer at Raspberry Pi Trading. Views expressed are still personal views.
Please don't send PMs asking for support - use the forum.
I'm not interested in doing contracts for bespoke functionality - please don't ask.

grimepoch
Posts: 82
Joined: Thu May 12, 2016 1:57 am

Re: Raw sensor access / CSI-2 receiver peripheral

Mon Jul 23, 2018 6:52 pm

Whew, okay, I'll look into what you've described, makes sense.

One thing I am trying to find more reference material on is using MMAL. In this case MMAL is being used to render to the screen, correct? I guess what I need to dig deeper on is how I want to get this data to the GPU for GLSL use. I have some USB webcam code I used that I believe also used MMAL, so I will examine how that behaved as well and see what it did. I believe right now it uses glTexImage2D to stuff the image data into a texture.

Do you know of a good reference source that really talks about MMAL?

luiscgalo
Posts: 25
Joined: Fri Jun 22, 2018 9:09 am

Re: Raw sensor access / CSI-2 receiver peripheral

Wed Jul 25, 2018 8:39 am

Hi all,
After some days of inactivity on my project, I finally had some time to continue investigating the issue with the deinterlacer.

I've performed multiple tests and I think that the problem is indeed related to the deinterlacer, or to misuse and/or incorrect configuration on my side...

Basically, if I set the "img_fx_param.effect_parameter[2] = 1" to half the output framerate (25fps instead of 50fps), the output video is nice and smooth, but at 25fps of course.
When I try to output 50fps, the output is almost ok... It seems that the deinterlacer/encoder chain struggles at times and the video is cyclically not smooth.

Better than explaining my problem in words is to provide you with some example videos captured by my application.
I've created a WeTransfer link where you can download two H264 files, one for the 25fps test (nice and smooth) and a second one for the 50fps test (where you can see the issue that I'm describing):
https://we.tl/wArhB50GGs

The source code that I'm using to capture the files is the following (my latest version, excluding the rawcam and TC358743 parts since they are already working fine)

Code: Select all

#include "converter.h"

#include <stdio.h>
#include "interface/mmal/mmal.h"
#include "interface/mmal/mmal_buffer.h"
#include "interface/mmal/util/mmal_util.h"
#include "interface/mmal/util/mmal_util_params.h"
#include "interface/mmal/util/mmal_connection.h"
#include "interface/mmal/util/mmal_default_components.h"

/**
 * Flow of image processing: RawFrame -> ISP -> ImageFx -> Encoder
 */

// ISP variables (converter from BGR24 to I420)
MMAL_COMPONENT_T *pISP = NULL;
MMAL_PORT_T *isp_input = NULL;
MMAL_PORT_T *isp_output = NULL;
MMAL_POOL_T *isp_pool_in = NULL;

// Image Fx variables (deinterlacer)
MMAL_COMPONENT_T *pImageFx = NULL;
MMAL_PORT_T *deint_input = NULL;
MMAL_PORT_T *deint_output = NULL;

// H264 Encoder variables
MMAL_COMPONENT_T *pEncoder = NULL;
MMAL_PORT_T *enc_input = NULL;
MMAL_PORT_T *enc_output = NULL;
MMAL_POOL_T *enc_pool_out = NULL;

// Connections between components
MMAL_CONNECTION_T* conn_isp_deint = NULL;
MMAL_CONNECTION_T* conn_deint_enc = NULL;

// hard-coded constants for test purposes only
#define VIDEO_WIDTH 	1920
#define VIDEO_HEIGHT	1080

int64_t pts = 0;
FILE * pFile2 = NULL;

/**
 * Function called when a new raw (BGR24) interlaced frame is ready to be converted
 */
void ConvertFrame(uint8_t* punBuffer, const uint32_t unBufferSize) {
	MMAL_BUFFER_HEADER_T *buffer;

	// send data to the input port of ISP component
	buffer = mmal_queue_get(isp_pool_in->queue);
	if (buffer != NULL) {
		mmal_buffer_header_mem_lock(buffer);
		buffer->flags = MMAL_BUFFER_HEADER_VIDEO_FLAG_INTERLACED | MMAL_BUFFER_HEADER_VIDEO_FLAG_TOP_FIELD_FIRST;
		buffer->length = unBufferSize;

		//buffer->pts = buffer->dts = MMAL_TIME_UNKNOWN;
		buffer->pts = buffer->dts = pts;
		pts += 1000000 / 50;

		memcpy(buffer->data, punBuffer, unBufferSize);
		mmal_buffer_header_mem_unlock(buffer);

		//printf("BGR24 frame -> ISP input (%d bytes)\n", buffer->length);
		if (mmal_port_send_buffer(isp_input, buffer) != MMAL_SUCCESS) {
			printf("Error sending data to ISP input!\n");
		}
	} else {
		printf("\t\tISP input buffer not available!\n");
	}
}

/*
 * ISP input port callback function
 */
void isp_input_cb(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer) {
	mmal_buffer_header_release(buffer);
}

/*
 * H264 encoder output port callback function
 */
void enc_output_callback(MMAL_PORT_T *port, MMAL_BUFFER_HEADER_T *buffer) {
	printf("Encoder out data (%d bytes)!\n", buffer->length);

	if (pFile2 == NULL) {
		pFile2 = fopen("capture.h264", "wb");
	}

	// write the encoded data (the first buffers carry the SPS/PPS header bytes) to the output file
	mmal_buffer_header_mem_lock(buffer);
	fwrite(buffer->data, 1, buffer->length, pFile2);
	mmal_buffer_header_mem_unlock(buffer);

	// release buffer back to the pool
	mmal_buffer_header_release(buffer);

	// and send one back to the port (if still open)
	if (port->is_enabled) {
		MMAL_STATUS_T status;
		MMAL_BUFFER_HEADER_T* new_buffer = mmal_queue_get(enc_pool_out->queue);

		if (new_buffer)
			status = mmal_port_send_buffer(port, new_buffer);

		if (!new_buffer || status != MMAL_SUCCESS)
			printf("Unable to return a buffer to the encoder port!\n");
	}
}

/*
 * Converter/encoder initialization function
 */
void InitConverter() {
	MMAL_STATUS_T status;

	// ###################################### ISP converter ##################################################

	status = mmal_component_create("vc.ril.isp", &pISP);
	if (status != MMAL_SUCCESS) {
		printf("Failed to create ISP!\n");
		return;
	}

	isp_input = pISP->input[0];
	isp_output = pISP->output[0];

	isp_input->format->type = MMAL_ES_TYPE_VIDEO;
	isp_input->format->encoding = MMAL_ENCODING_BGR24;
	isp_input->format->es->video.width = VCOS_ALIGN_UP(VIDEO_WIDTH, 32);
	isp_input->format->es->video.height = VCOS_ALIGN_UP(VIDEO_HEIGHT, 16);
	isp_input->format->es->video.crop.x = 0;
	isp_input->format->es->video.crop.y = 0;
	isp_input->format->es->video.crop.width = VIDEO_WIDTH;
	isp_input->format->es->video.crop.height = VIDEO_HEIGHT;
	isp_input->format->es->video.frame_rate.num = 0;
	isp_input->format->es->video.frame_rate.den = 1;
	status = mmal_port_format_commit(isp_input);
	if (status != MMAL_SUCCESS) {
		printf("Failed to commit converter input format!\n");
		return;
	}

	isp_input->buffer_size = isp_input->buffer_size_recommended;
	isp_input->buffer_num = isp_input->buffer_num_recommended;
	//printf("ISP input buffer size %i bytes\n", isp_input->buffer_size);

	// create pool for input data
	isp_pool_in = mmal_port_pool_create(isp_input, isp_input->buffer_num, isp_input->buffer_size);
	if (isp_pool_in == NULL) {
		printf("Failed to create ISP input pool!\n");
	}

	// Setup ISP output (copy of input format, changing only the encoding)
	mmal_format_copy(isp_output->format, isp_input->format);
	isp_output->format->encoding = MMAL_ENCODING_I420;
	status = mmal_port_format_commit(isp_output);
	if (status != MMAL_SUCCESS) {
		printf("Failed to commit converter output format!\n");
		return;
	}

	isp_output->buffer_size = isp_output->buffer_size_recommended;
	isp_output->buffer_num = isp_output->buffer_num_recommended;
	//printf("ISP output buffer size %i bytes\n", isp_output->buffer_size);

	printf("ISP buffers In=%i | Out=%i\n", isp_input->buffer_num, isp_output->buffer_num);

	// Enable ports and ISP component
	status = mmal_port_enable(isp_input, isp_input_cb);
	if (status != MMAL_SUCCESS) {
		printf("Error enabling ISP input port!\n");
		return;
	}

	// ################################## deinterlacer ######################################################

	status = mmal_component_create("vc.ril.image_fx", &pImageFx);
	if (status != MMAL_SUCCESS) {
		printf("Failed to create image_fx!\n");
		return;
	}

	deint_input = pImageFx->input[0];
	deint_output = pImageFx->output[0];

	MMAL_PARAMETER_IMAGEFX_PARAMETERS_T img_fx_param;
	memset(&img_fx_param, 0, sizeof(MMAL_PARAMETER_IMAGEFX_PARAMETERS_T));
	img_fx_param.hdr.id = MMAL_PARAMETER_IMAGE_EFFECT_PARAMETERS;
	img_fx_param.hdr.size = sizeof(MMAL_PARAMETER_IMAGEFX_PARAMETERS_T);
	img_fx_param.effect = MMAL_PARAM_IMAGEFX_DEINTERLACE_FAST;
	img_fx_param.num_effect_params = 3;
	img_fx_param.effect_parameter[0] = 3; // interlaced input frame with both fields / top field first
	img_fx_param.effect_parameter[1] = 0; // frame period (1000000 * 1 / 25);
	img_fx_param.effect_parameter[2] = 1; // half framerate ?
	if (mmal_port_parameter_set(deint_output, &img_fx_param.hdr) != MMAL_SUCCESS) {
		printf("Failed to configure deinterlacer output port (mode)\n");
		return;
	}

	// Setup image_fx input format (equal to ISP output)
	mmal_format_copy(deint_input->format, isp_output->format);

	status = mmal_port_format_commit(deint_input);
	if (status != MMAL_SUCCESS) {
		printf("Failed to commit image_fx input format!\n");
		return;
	}

	deint_input->buffer_size = deint_input->buffer_size_recommended;
	deint_input->buffer_num = deint_input->buffer_num_recommended;

	// Setup image_fx output format (equal to input format)
	mmal_format_copy(deint_output->format, deint_input->format);

	status = mmal_port_format_commit(deint_output);
	if (status != MMAL_SUCCESS) {
		printf("Failed to commit image_fx output format!\n");
		return;
	}

	deint_output->buffer_size = deint_output->buffer_size_recommended;
	deint_output->buffer_num = deint_output->buffer_num_recommended;

	printf("Create connection ISP output to image_fx input...\n");
	status = mmal_connection_create(&conn_isp_deint, isp_output, deint_input,
	MMAL_CONNECTION_FLAG_TUNNELLING);
	if (status != MMAL_SUCCESS) {
		printf("Failed to create connection status %d: ISP->image_fx\n", status);
		return;
	}

	status = mmal_connection_enable(conn_isp_deint);
	if (status != MMAL_SUCCESS) {
		printf("Failed to enable connection ISP->image_fx\n");
		return;
	}

	// ################################## encoder ######################################################

	// create H264 video encoder component
	status = mmal_component_create(MMAL_COMPONENT_DEFAULT_VIDEO_ENCODER, &pEncoder);
	if (status != MMAL_SUCCESS) {
		printf("Unable to create video encoder component!\n");
		return;
	}

	enc_input = pEncoder->input[0];
	enc_output = pEncoder->output[0];

	enc_input->format->es->video.frame_rate.num = 0;
	enc_input->format->es->video.frame_rate.den = 1;
	enc_input->buffer_size = enc_input->buffer_size_recommended;
	enc_input->buffer_num = enc_input->buffer_num_recommended;

	printf("Create connection ISP output to image_fx input...\n");
	status = mmal_connection_create(&conn_deint_enc, deint_output, enc_input, MMAL_CONNECTION_FLAG_TUNNELLING);
	if (status != MMAL_SUCCESS) {
		printf("Failed to create connection status %d: Deint->encoder\n", status);
		return;
	}

	mmal_format_copy(enc_output->format, enc_input->format);
	enc_output->format->encoding = MMAL_ENCODING_H264;
	enc_output->format->bitrate = 1000 * 1000 * 4; // 4 Mbps output video
	enc_output->format->es->video.frame_rate.num = 0;
	enc_output->format->es->video.frame_rate.den = 1;
	status = mmal_port_format_commit(enc_output);
	if (status != MMAL_SUCCESS) {
		printf("Video encoder output format couldn't be set!\n");
		return;
	}

	enc_output->buffer_size = enc_output->buffer_size_recommended;
	enc_output->buffer_num = enc_output->buffer_num_recommended;

	printf("H264 encoder buffers In=%i | Out=%i\n", enc_input->buffer_num, enc_output->buffer_num);

	// set H264 profile and level
	MMAL_PARAMETER_VIDEO_PROFILE_T param2;
	param2.hdr.id = MMAL_PARAMETER_PROFILE;
	param2.hdr.size = sizeof(param2);
	param2.profile[0].profile = MMAL_VIDEO_PROFILE_H264_HIGH;
	param2.profile[0].level = MMAL_VIDEO_LEVEL_H264_4;
	status = mmal_port_parameter_set(enc_output, &param2.hdr);
	if (status != MMAL_SUCCESS) {
		printf("Unable to set H264 profile!\n");
		return;
	}

	// Set keyframe interval
	MMAL_PARAMETER_UINT32_T param = { { MMAL_PARAMETER_INTRAPERIOD, sizeof(param) }, 500 };
	status = mmal_port_parameter_set(enc_output, &param.hdr);
	if (status != MMAL_SUCCESS) {
		printf("Unable to set intraperiod!\n");
		return;
	}

	enc_pool_out = mmal_port_pool_create(enc_output, enc_output->buffer_num, enc_output->buffer_size);
	if (enc_pool_out == NULL) {
		printf("Failed to create encoder output pool!\n");
	}

	// Enable output port
	status = mmal_port_enable(enc_output, enc_output_callback);
	if (status != MMAL_SUCCESS) {
		printf("Error enabling deinterlacer output port!\n");
		return;
	}

	status = mmal_connection_enable(conn_deint_enc);
	if (status != MMAL_SUCCESS) {
		printf("Failed to enable connection Deint->encoder\n");
		return;
	}

	// ###################################### generic initialization ##################################################

	status = mmal_component_enable(pISP);
	if (status != MMAL_SUCCESS) {
		printf("Error enabling ISP!\n");
		return;
	}

	status = mmal_component_enable(pImageFx);
	if (status != MMAL_SUCCESS) {
		printf("Error enabling deinterlacer!\n");
		return;
	}

	status = mmal_component_enable(pEncoder);
	if (status != MMAL_SUCCESS) {
		printf("Error enabling encoder!\n");
		return;
	}

	// Send buffers for output pool
	for (uint8_t i = 0; i < enc_output->buffer_num; i++) {
		MMAL_BUFFER_HEADER_T *buffer = mmal_queue_get(enc_pool_out->queue);

		if (!buffer) {
			printf("Buffer is NULL!\n");
			exit(1);
		}
		status = mmal_port_send_buffer(enc_output, buffer);
		if (status != MMAL_SUCCESS) {
			printf("mmal_port_send_buffer failed deint on buffer %p, status %d\n", buffer, status);
			exit(1);
		}
	}

	printf("Image converter init OK!\n");
}

/*
 * Converter/encoder deinitialization function
 */
void CloseConverter() {
	MMAL_STATUS_T status;

	printf("Closing converter...\n");

	if (pFile2 != NULL) {
		fclose(pFile2);
	}

	// disable and destroy connection between ISP and deinterlacer
	mmal_connection_disable(conn_isp_deint);
	mmal_connection_destroy(conn_isp_deint);

	mmal_connection_disable(conn_deint_enc);
	mmal_connection_destroy(conn_deint_enc);

	// disable ports
	mmal_port_disable(isp_input);
	mmal_port_disable(enc_output);

	// disable ISP
	status = mmal_component_disable(pISP);
	if (status != MMAL_SUCCESS) {
		printf("Failed to disable ISP component!\n");
	}

	// disable deinterlacer
	status = mmal_component_disable(pImageFx);
	if (status != MMAL_SUCCESS) {
		printf("Failed to disable ISP component!\n");
	}

	// disable encoder
	status = mmal_component_disable(pEncoder);
	if (status != MMAL_SUCCESS) {
		printf("Failed to disable Encoder component!\n");
	}

	// destroy components
	mmal_component_destroy(pISP);
	mmal_component_destroy(pImageFx);
	mmal_component_destroy(pEncoder);
}
Maybe I'm missing some detail or not setting some parameters properly in the processing chain...
I still haven't understood how I can solve this issue in order to achieve smooth 50fps...
Do you see anything wrong in my code, 6by9?
Thanks again for your help.

luiscgalo
Posts: 25
Joined: Fri Jun 22, 2018 9:09 am

Re: Raw sensor access / CSI-2 receiver peripheral

Wed Jul 25, 2018 9:29 am

One idea that comes to mind (I don't know if it works, and it may or may not solve the problem) is changing the way I use the image_fx deinterlacer component.

Currently I'm setting "img_fx_param.effect_parameter[0] = 3", which means that image_fx expects to receive a Full HD interlaced input frame containing both top and bottom fields.

However, according to the info from a previous post, there are more options available for this parameter:
6by9 wrote: All algorithms support a first int that is an override for the buffer flags.
0 - The data is not interlaced, it is progressive scan
1 - The data is interlaced, fields sent separately in temporal order, with upper field first
2 - The data is interlaced, fields sent separately in temporal order, with lower field first
3 - The data is interlaced, two fields sent together line interleaved, with the upper field temporally earlier
4 - The data is interlaced, two fields sent together line interleaved, with the lower field temporally earlier
5 - The stream may contain a mixture of progressive and interlaced frames (all bets are off).
Since I'm already receiving separated top and bottom fields from the rawcam component, what if I set "img_fx_param.effect_parameter[0] = 1" in order to provide individual fields to the deinterlacer?
Using this approach I will be directly sending received fields to the ISP/image_fx blocks instead of having to merge each top/bottom field into a FullHD interlaced frame.
Another difference will be the fact that I will be feeding the deinterlacer at the final rate of 50fps (field rate) instead of sending 25fps interlaced frames that are later converted to 50fps.
Do you think that this could work?

The problem is that I cannot see a single example about how to use the image_fx deinterlacer with the "img_fx_param.effect_parameter[0]" parameter set to anything different from 3 (from my research, all available source code examples on the Internet are sending the two fields together).
In this scenario, what will be the input and output format of the image_fx block?
Will the input resolution be set to the field resolution (1920x540)?
Will the output resolution be set to the target FullHD size (1920x1080)?
