Ticket Name: TDA2SX: Please help in determining capture-to-display latency in TDA2 for automotive applications

Query Text:
Part Number: TDA2SX

Hi, I have a question about how the latency between camera capture and display can be measured. I came across an article on the TI wiki that describes what I need: http://processors.wiki.ti.com/index.php/Latency_Measurement_on_Capture_Encode_Decode_Display_Demo. I would like to know if the same solution exists for TDA2x processors, and if yes, can you please share it. Thank you in advance. Mahima
Responses:
Hi Mahima, Are you using Vision SDK? If yes, it prints the latency from capture to display when you print the statistics. Regards, Brijesh
Just to add: you would see the local link-level latency and the source-to-link latency (which is equivalent to the latency from capture to that particular link) in the statistics printed by Vision SDK. An example is below:

[IPU1-0] 176.749191 s: Local Link Latency : Avg = 30 us, Min = 30 us, Max = 30 us,
[IPU1-0] 176.749313 s: Source to Link Latency : Avg = 32971 us, Min = 32971 us, Max = 32971 us,

Regards, Piyali
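[Editor's note] The "Source to Link Latency" figure above is derived from a capture timestamp that travels with each buffer: the capture link stamps the frame, and a downstream link subtracts that stamp from its own current time when the buffer arrives. The sketch below illustrates the idea only; the struct, field, and function names are illustrative assumptions, not the exact Vision SDK API.

/*
 * Minimal sketch (not the actual Vision SDK code) of how a
 * source-to-link latency statistic can be computed.
 */
#include <stdint.h>
#include <stdio.h>
#include <time.h>

typedef struct {
    uint64_t srcTimestamp;   /* capture time in microseconds (assumed field name) */
} MyBuffer;

static uint64_t getCurTimeUs(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (uint64_t)ts.tv_sec * 1000000u + (uint64_t)ts.tv_nsec / 1000u;
}

int main(void)
{
    MyBuffer buf;

    /* Capture link: stamp the buffer when the frame arrives. */
    buf.srcTimestamp = getCurTimeUs();

    /* ... frame travels through the processing chain ... */

    /* Downstream link (e.g. display): source-to-link latency is the
     * difference between "now" and the capture timestamp. */
    uint64_t srcToLinkUs = getCurTimeUs() - buf.srcTimestamp;
    printf("Source to Link Latency : %llu us\n",
           (unsigned long long)srcToLinkUs);
    return 0;
}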
Thanks Brijesh. Yes, Vision SDK is used, but in a development environment. Please let me explain my setup with more clarity:
1. I have to measure glass-to-glass latency on a bench setup using 4 cameras. The camera output goes to the ECU, which has a TDA2x processor. The output images from the ECU go over LVDS to a monitor.
2. I only have access to the M4-0 and A15 core logs through a serial board. The rest are not accessible while testing the product.
3. I went through the Vision SDK user guide, and it outlines a separate hardware setup to execute the example use cases. This, I do not have.
Is it possible, barring the above limitations, to capture the glass-to-glass latency? I'm pretty new to this. Sorry if I've not asked the right question. Thank you once again. -Mahima
Thanks, Piyali, for taking the time to answer. I've outlined, with more clarity, the limitations associated with capturing the latency in my setup. Kindly go through them and let me know your suggestions. Thanks again. -Mahima
Mahima, If you are using Vision SDK, just press 'p' while the use case is running; it will print all statistics, including the latency from capture to display. This is the latency from the capture link to the display link. There will be an additional 2 to 3 frames of latency in capture and display, which is not counted in these statistics. The other way to measure the latency is to keep a counting clock in front of the camera and take a picture of the clock and the display output in a single shot. The latency is the difference between the time shown on the clock and the time shown on the display. Regards, Brijesh
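[Editor's note] Because the printed statistic excludes the extra capture/display buffering, a rough glass-to-glass estimate adds 2 to 3 frame periods on top of it. The small calculation below is only a worked example; the 30 fps frame rate and the 33 ms statistic value are assumed numbers.

/* Rough glass-to-glass estimate from the printed statistic plus the
 * extra capture/display frames Brijesh mentions (assumed example values). */
#include <stdio.h>

int main(void)
{
    const double framePeriodMs = 1000.0 / 30.0;  /* assumed 30 fps sensor/display */
    const double statLatencyMs = 33.0;           /* assumed capture->display stat */
    const double extraFrames   = 2.5;            /* middle of the 2-3 frame range */

    double glassToGlassMs = statLatencyMs + extraFrames * framePeriodMs;
    printf("Estimated glass-to-glass latency: %.1f ms\n", glassToGlassMs);
    return 0;
}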
Thanks Brijesh. I think the system we use is a little different: we have a renderer between capture and display, and I think Vision SDK loses track of the frame once it goes inside the renderer. The second method would give us a fair estimate, though. So, thanks again for the help.
Hi Mahima, It is possible even if you have a renderer in between. You just need to copy srcTimestamp from the source frame to the target frame in your renderer link. Regards, Brijesh
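[Editor's note] The sketch below shows the timestamp propagation Brijesh suggests: when the renderer produces a new output buffer, it copies the capture timestamp from the input buffer so that the display link's latency statistic still measures from the original capture. The buffer type and helper function here are illustrative; the exact Vision SDK structures may differ.

/* Sketch of propagating the capture timestamp through a custom renderer link. */
#include <stdint.h>

typedef struct {
    uint64_t srcTimestamp;        /* original capture time (us)        */
    uint64_t linkLocalTimestamp;  /* time this link received the frame */
    void    *payload;             /* frame data                        */
} MyBuffer;

/* Hypothetical renderer processing step: input frame -> rendered frame */
static void myRenderer_process(const MyBuffer *in, MyBuffer *out)
{
    /* ... render in->payload into out->payload ... */

    /* Propagate the capture timestamp so the downstream
     * "Source to Link Latency" still reflects capture-to-display time. */
    out->srcTimestamp = in->srcTimestamp;
}

int main(void)
{
    MyBuffer in  = { .srcTimestamp = 123456u, .linkLocalTimestamp = 0, .payload = 0 };
    MyBuffer out = { 0 };

    myRenderer_process(&in, &out);
    return (out.srcTimestamp == in.srcTimestamp) ? 0 : 1;
}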
Thanks Brijesh. I will try this on my system and let you know whether I can capture the latency this way.