I have to work on a vision-based project to measure the length of an object. In this project the camera position is not stable, and the zoom also varies. In this case, how can I make accurate distance measurements?
I have developed an edge-detection method to measure the distance between two points, but if the zoom is changed it won't give the proper value...
So please tell me what steps I need to take.
Expecting your reply.
I think your question might receive a quicker answer by asking it in the NI Forums. I would suggest the Machine Vision section:
Philippe is absolutely right. However, I cannot resist...so here is my response.
Ideally, you need to calibrate the image in order to get robust results. This will also help with the unstable camera position. Calibration involves placing a feature of known dimensions in the image. The information about this calibrated feature, together with the measured pixel values, lets you compute the measurements of interest more precisely.
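To give a feel for what calibration recovers, here is a minimal Python sketch (LabVIEW diagrams can't be shown as text) of the pinhole-camera relation: a length in the image scales with depth and the focal length. All names and numbers here are hypothetical, and this assumes the measured length lies parallel to the image plane.

```python
def physical_length(length_px, depth_mm, focal_px):
    """Pinhole model: X = x * Z / f.

    length_px : measured length in the image, in pixels
    depth_mm  : distance from camera to object plane, in mm
    focal_px  : focal length expressed in pixels (from calibration)
    """
    return length_px * depth_mm / focal_px

# Hypothetical example: an object spanning 100 px, 500 mm from a
# camera whose calibrated focal length is 1000 px, is 50 mm long.
size_mm = physical_length(100, 500, 1000)  # 50.0 mm
```

Note how zooming changes `focal_px`, which is exactly why an uncalibrated pixel measurement drifts when the zoom varies.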
Here are some helpful links:
Calibration features: http://digital.ni.com/public.nsf/allkb/815CFD0DACE4AFB08625770F004EE84E
If you have a feature of a known length, let's say on the platform where you put your test object, you can calculate the offset/factor by which the measured length has changed and apply the same offset/factor to the feature you are trying to measure. You could create markings at known distances on the platform, at about the same depth as the feature you are trying to measure.
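The reference-feature idea above can be sketched in a few lines of Python (function names and numbers are hypothetical). Because the reference marking is re-measured in every frame, the computed scale factor automatically absorbs zoom changes:

```python
def pixels_per_unit(ref_length_px, ref_length_mm):
    """Scale factor derived from a reference feature of known length."""
    return ref_length_px / ref_length_mm

def measure_length(object_length_px, ref_length_px, ref_length_mm):
    """Convert a pixel measurement to physical units via the reference.

    Re-measuring the reference in each frame keeps the result valid
    when the zoom (and hence the pixel scale) changes."""
    scale = pixels_per_unit(ref_length_px, ref_length_mm)
    return object_length_px / scale

# Hypothetical frame: a 50 mm marking spans 200 px, the object 340 px.
length_mm = measure_length(340, 200, 50)  # 85.0 mm
```

This only holds if the reference marking sits at roughly the same depth as the object, as suggested above; otherwise perspective introduces a depth-dependent error.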
Hello, I see you have not found a solution yet. A Kinect sensor is a good alternative; you can consult the following link.
I would like to know how to measure the distance between the camera and the object for a tracking process. Do you know any method for this measurement? Please answer me.
Distance is difficult to measure with a single standard camera. Stereo vision would be better.
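For reference, here is the standard depth-from-disparity relation used in stereo vision, as a small Python sketch (the parameter values are hypothetical; it assumes a rectified stereo pair with a known baseline and calibrated focal length):

```python
def stereo_depth(focal_px, baseline_mm, disparity_px):
    """Depth from a rectified stereo pair: Z = f * B / d.

    focal_px     : focal length in pixels (from calibration)
    baseline_mm  : distance between the two camera centers, in mm
    disparity_px : horizontal shift of the feature between images
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_mm / disparity_px

# Hypothetical rig: f = 700 px, baseline = 120 mm, disparity = 35 px.
depth_mm = stereo_depth(700, 120, 35)  # 2400.0 mm
```

The inverse relationship means depth resolution degrades for distant objects, where the disparity shrinks toward zero.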
You might want to try a marker, an AR marker for instance, but that probably doesn't have any LabVIEW support.