I have two images containing the same element. I want to detect the contours of the element in both images and compute the contour distances.
The first screenshot shows a fragment of my code. The second screenshot shows which pair of points is taken to compute the distance. The points appear on the template and target images with a delay, so I can see which two points are used to calculate each distance.
Unfortunately, it seems that almost the same point is taken on the template image as on the target image. I thought the control should compute the distance between corresponding points belonging to the contour, so if the contour is rotated, the distances would be large, because the two corresponding contour points would be far apart.
The documentation does not explain the idea behind contour distances, and I haven't found a useful example either.
My questions are: how are the points chosen to compute the distances, and what is wrong with my code? How can I obtain the result I want?
I have no experience in LabVIEW development, but I have some in computer vision with OpenCV. For me it is a big disadvantage that there are controls whose documentation does not explain the ideas behind them.
The most consistent reference literature is the NI Vision Concepts Manual. As for calculating the distance between the points of the contours, the most reliable solution is to do the calculation directly, without any VIs. I have attached an example.
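For readers without the attachment, the direct calculation amounts to something like the following NumPy sketch. The index-wise pairing of points (point i of the template with point i of the target) is an assumption about what the attached example does, based on the discussion below:

```python
import numpy as np

def contour_distances(template_pts, target_pts):
    """Index-wise Euclidean distance between two contours.

    Both inputs are (N, 2) arrays of (x, y) points; point i of the
    template contour is paired with point i of the target contour.
    """
    template_pts = np.asarray(template_pts, dtype=float)
    target_pts = np.asarray(target_pts, dtype=float)
    return np.linalg.norm(template_pts - target_pts, axis=1)

# Tiny demo: a square contour and the same square shifted by (3, 4).
square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]])
shifted = square + [3, 4]
print(contour_distances(square, shifted))  # every distance is 5.0
```

Note that this pairing is purely positional: if one contour is rotated relative to the other, index i no longer points at the same physical spot on both contours, which is exactly the issue raised in this thread.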
Thanks for the great example, but it still does not answer my question.
I see how you compute the contour distances manually, but how do you know you are computing distances between corresponding points? In the attached image I've drawn what I mean by corresponding points. I thought that LabVIEW finds corresponding points by contour matching, so I wired Match Contour Setup Data to the Compute Distances control in my VI. I see that you compute distances between points with the same indexes in both contours.
What information can be taken from this? What if the contour orientation changes? Could you also provide the images you worked on? Maybe they will help me understand what you achieved by computing distances.
So another question: what can be learned from contour distances? I found an example in the examples library that detects a defect of a circular object by computing distances between the circle (model) and the detected contour. With a circle it's clear, but what if the shape is not symmetric and is rotated, as in my attachment? What more information can be taken from contour distances?
Compute Contour Distance locates the template contour on the target image using a contour matching algorithm (based on Geometric Pattern Matching). The matching algorithm takes care of shift, rotation, scale, and occlusion. Once the match is found, a refinement algorithm generates an accurate correspondence between template contour points and target contour points. After completing the one-to-one correspondence, the distances are calculated.
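The match-then-correspond-then-measure idea described above can be illustrated with a small NumPy sketch. This is not NI's actual algorithm: the "matching" step here simply undoes a known rotation, and the correspondence step is a plain nearest-neighbour search, but it shows why residual distances stay near zero for a rotated-but-intact contour and would spike locally at a defect:

```python
import numpy as np

def rotate(pts, angle_rad):
    """Rotate (N, 2) points about the origin."""
    c, s = np.cos(angle_rad), np.sin(angle_rad)
    return pts @ np.array([[c, -s], [s, c]]).T

def corresponding_distances(template_pts, target_pts, angle_rad):
    """Align the target contour (here: undo a known rotation), then pair
    each template point with its nearest aligned target point and
    return the residual distances."""
    aligned = rotate(np.asarray(target_pts, dtype=float), -angle_rad)
    template_pts = np.asarray(template_pts, dtype=float)
    dists = []
    for p in template_pts:
        # nearest-neighbour correspondence on the aligned contour
        dists.append(np.linalg.norm(aligned - p, axis=1).min())
    return np.array(dists)

# Demo: rotate a square by 30 degrees; after alignment the residual
# distances are ~0, so any large residual would flag a local defect.
square = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)
rotated = rotate(square, np.deg2rad(30))
print(corresponding_distances(square, rotated, np.deg2rad(30)))
```

In the real VI the pose (shift, rotation, scale) comes from the geometric matcher rather than being known in advance, and the refinement step produces a proper one-to-one correspondence instead of a nearest-neighbour lookup, but the role of the distances is the same: they measure local deviation of the target contour from the template after the global pose has been removed.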