Machine Vision


object tracking and recognition

I am new to LabVIEW. What are the steps I have to follow to track a moving object, and is there any example code for doing this with the machine vision development tools?

Message 1 of 11

 

First you need to create a filter that can find the object(s), in at least one frame, with a high degree of probability.

 

Typically the object has a color, or a static shape, that you can use to identify it.

If not, edge detection can help you isolate candidate object outlines, but it will often also pick up a lot of edges that are not the outline of your object.

There are multiple built-in methods for pattern matching, color channel extraction, edge enhancement, or even isolating a very specific color in an image.

Which method to choose depends on what you are trying to track.

If you want more advanced solutions, like tracking humans, you will not get them out of the box.

 

When the object has been isolated, enhanced, or found, you typically have a resulting grayscale image, where white means high probability of object and black means low probability.

LabVIEW then provides multiple methods for finding closed shapes, such as particle analysis, to get the coordinates and size of a coherent blob of white.

This gives you the object's location in the image.

With some built-in methods, like pattern matching, you get the coordinates directly, provided the pattern is present in the image.
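The threshold-then-find-the-blob step above can be sketched outside LabVIEW too. This is a minimal NumPy illustration of the idea, not the NI Vision API; the function name and the threshold value are my own, and real particle analysis would label connected components rather than assume a single blob:

```python
import numpy as np

def locate_blob(gray, threshold=128):
    """Threshold a grayscale 'probability' image (white = likely object)
    and return the centroid and bounding box of the bright pixels.
    Assumes a single coherent blob; real particle analysis would label
    connected components first."""
    ys, xs = np.nonzero(gray >= threshold)
    if xs.size == 0:
        return None                      # object not found in this frame
    centroid = (xs.mean(), ys.mean())
    bbox = (xs.min(), ys.min(), xs.max(), ys.max())
    return centroid, bbox

# Synthetic 10x10 frame with a bright 3x3 "object" at rows/cols 4..6
frame = np.zeros((10, 10), dtype=np.uint8)
frame[4:7, 4:7] = 255
result = locate_blob(frame)
```

The centroid gives the object location; the bounding box gives its size, which is what the particle-analysis VIs report as well.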

 

Now you will typically know approximately where the object should be in the next frame, so you can restrict the search to that area. If the object is not found there, expand the area, or simply assume that your model is good and the frame is a glitch, and hope that the next frame lets you recognize the object again.

Even perfect filters need to handle this, as objects can be partially or fully occluded in some frames.
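The search-window idea can be sketched like this (a Python/NumPy illustration, not LabVIEW code; `brightest` is a toy stand-in for whatever filter found the object in the first place, and the window sizes are arbitrary):

```python
import numpy as np

def brightest(roi, min_val=200):
    """Toy detector: brightest pixel in the ROI, or None if too dark.
    Stands in for a real pattern-matching or color filter."""
    if roi.size == 0 or roi.max() < min_val:
        return None
    y, x = np.unravel_index(roi.argmax(), roi.shape)
    return (x, y)

def track(frames, find, start, win=5, grow=2.0):
    """Search a window around the last known position; when the object
    is missed (a glitch or an occlusion), widen the window for the
    next frame instead of giving up."""
    positions = []
    x, y = start
    w = win
    for frame in frames:
        h, fw = frame.shape
        x0, y0 = max(0, int(x - w)), max(0, int(y - w))
        x1, y1 = min(fw, int(x + w) + 1), min(h, int(y + w) + 1)
        hit = find(frame[y0:y1, x0:x1])
        if hit is None:
            w *= grow                    # lost: expand the search area
            positions.append(None)
        else:
            x, y = x0 + hit[0], y0 + hit[1]
            w = win                      # found: back to the tight window
            positions.append((int(x), int(y)))
    return positions

# Three 20x20 frames: object visible, fully occluded, visible again
frames = [np.zeros((20, 20), dtype=np.uint8) for _ in range(3)]
frames[0][5, 5] = 255
frames[2][7, 6] = 255     # frame 1 is the "glitch" with no object
path = track(frames, brightest, start=(5, 5))
```

The occluded middle frame yields `None`, the window grows, and the object is re-acquired in the third frame, which is exactly the recover-after-a-glitch behaviour described above.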

 

Engineer, M.Sc. Autonomous Systems, Automation and Control of non-linear systems
Project Engineer @ R&D A/S
www.rdas.dk
Message 2 of 11

I too am interested in object tracking, but I would like to be a bit more specific.

 

I am not very familiar with LabVIEW (yet), but I have been asked to track the movements of a joystick and would like to use LabVIEW to do it. I am hoping that the NI Smart Camera combined with the LabVIEW software can do this.

 

The tracking must be accurate to within 1mm.

The distance from camera to target must be about 1m.

I must be able to record and display the measured data.

The movements must be separated into two axes (recorded separately).

 

Please reply with any suggestions for specific software/hardware to use, and with the estimated difficulty level (since I'm new).

 

Thanks!

Message 3 of 11

You have to select the viewing angle carefully.

The achievable accuracy at a given camera distance is a matter of the camera's resolution, lens choice, and lighting conditions.

However, the user tends to cover parts of the joystick with the hand, making feature tracking very difficult.

If the camera is placed above the joystick, the user often bends over it, blocking the view completely.
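As a rough sanity check on the "1 mm at 1 m" requirement, you can relate field of view to sensor resolution. A back-of-envelope Python sketch, with an assumed 300 mm horizontal field of view (the actual field of view depends on the lens chosen):

```python
# Back-of-envelope check of "accurate to within 1 mm at 1 m distance".
# Assumed: the joystick's travel fits in a 300 mm wide field of view.
def mm_per_pixel(fov_mm, pixels_across):
    """Spatial resolution: millimetres covered by one pixel."""
    return fov_mm / pixels_across

fov_mm = 300.0
for pixels in (640, 1280):
    print(pixels, "px ->", round(mm_per_pixel(fov_mm, pixels), 3), "mm/px")
```

So even a modest sensor resolves roughly half a millimetre per pixel over a 300 mm view, and sub-pixel techniques (centroid or edge fitting) typically do better, but lens distortion and poor lighting eat into that margin.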

 

Why do you want to use a camera for this? Typically the joystick has potentiometers or optical sensors that are more accurate, and faster.

 

Depending on the shape of the joystick, the user's behaviour, the lighting quality, and the budget available for resolution and lens, the task is below medium difficulty to solve in LabVIEW if you have the vision packages, and way above average if not.

Message 4 of 11

I will actually be tracking the work-equipment control lever on an excavator. I don't have the machine available, but I believe the lever is about 2 ft long, and its endpoint will sweep an arc of about 0.5 ft radius. For this particular machine there is no electrical output from the control lever, so I can't just read the signal from it as I normally would. Ideally, there would be a real-time output from the software that I could record along with the actual hydraulic output; I could then record the relationship between lever input and hydraulic output as we tune the machine. If real-time output is not possible or too difficult, I could post-process the data, but it would be a hassle.

Message 5 of 11

I see.

 

I am not sure that you want a vision solution.

By "real time" we are talking hydraulic time constants, and how long are those?

Be aware that with vision you will probably get a time lag of (1 / FPS) + image analysis computation time, which may be a tenth of a second, since you need to track the lever and compensate for perspective.

To get a faster response and a much simpler analysis, use two cameras, and analyse the lever angle in only one plane per camera.

(The other solution, without perspective problems, requires you to mount a camera somewhere the lever can be seen from either the top or the bottom, and that does not sound feasible.)

Remember that if the excavator is used in varying lighting conditions, you will face weather- and time-dependent problems tracking the lever.
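If you do go with two cameras and one plane per camera, each measurement reduces to a single angle in the image. A small Python sketch (the pixel coordinates of the pivot and lever tip are assumed to come from whatever tracker finds them):

```python
import math

def lever_angle_deg(pivot, tip):
    """Lever angle in one image plane, from the pixel coordinates of
    the pivot (base of the lever) and the lever's endpoint, as seen by
    one of the two cameras. 0 degrees = lever pointing straight down
    the image's +y axis; positive = tilted toward +x."""
    dx = tip[0] - pivot[0]
    dy = tip[1] - pivot[1]
    return math.degrees(math.atan2(dx, dy))
```

With the two cameras mounted at right angles, each camera yields one such angle, which gives the two separately recorded axes directly.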

 

You should build a small gimbal that copies the movement of each axis to a secondary joint, and read it with a digital encoder of some sort.

The difficulty of building this mechanically and robustly is small compared to the difficulty of getting a tracker to work robustly and accurately enough under varying light conditions.

 

 

 

Message 6 of 11

Hello venky88an,

 

I just saw that the new VDM 2013 features a new object tracking library based on the mean shift algorithm (basic, and extended with a Kalman filter).
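For readers curious what mean shift does under the hood, here is a minimal NumPy sketch of the basic algorithm. It illustrates the idea only and is not the VDM 2013 API; the window size and stopping rule are arbitrary choices of mine:

```python
import numpy as np

def mean_shift(prob, start, win=5, iters=20, eps=0.5):
    """Basic mean shift on a 'probability' image: repeatedly move a
    window to the intensity-weighted centroid of the pixels it covers,
    until the shift is smaller than eps."""
    x, y = float(start[0]), float(start[1])
    h, w = prob.shape
    for _ in range(iters):
        x0, x1 = max(0, int(x - win)), min(w, int(x + win) + 1)
        y0, y1 = max(0, int(y - win)), min(h, int(y + win) + 1)
        patch = prob[y0:y1, x0:x1].astype(float)
        total = patch.sum()
        if total == 0:
            break                        # nothing to lock onto here
        ys, xs = np.mgrid[y0:y1, x0:x1]  # pixel coordinates of the patch
        nx = (xs * patch).sum() / total
        ny = (ys * patch).sum() / total
        converged = abs(nx - x) < eps and abs(ny - y) < eps
        x, y = nx, ny
        if converged:
            break
    return x, y

# A 3x3 bright blob centred at (8, 8); start the window off-centre
img = np.zeros((20, 20))
img[7:10, 7:10] = 1.0
x, y = mean_shift(img, start=(5, 5))
```

The "extended" variant in VDM pairs this with a Kalman filter, which predicts where the window should start in the next frame, which is exactly the search-window idea discussed earlier in the thread.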

 

Maybe this could help you in your project.

 

Best regards,

K


https://decibel.ni.com/content/blogs/kl3m3n



Message 7 of 11

If the camera were fixed to the roof of the machine directly above the lever, I don't believe there would ever be any obstruction of the view, so it could be viewed from the "top", unless I am misunderstanding the problem. Also, could I not solve the lighting issues by shining a bright light on the target from the camera's point of view?

 

By "real time" output I just meant that I would like some type of measurable XY output with as little time lag as possible. I will record this signal along with the actual hydraulic output using a separate data logger.

 

I don't think the time lag will be that big of an issue as long as it is consistent. I can adjust for it afterwards.

 

Overall, though, this is sounding more complicated than I want to make it.

Message 8 of 11

Hello,
I'm new to LabVIEW and have been working with it for about four months. I'm stuck at a certain point and need help/suggestions.
I'm trying to track a magnet float in a tank filled with water (I will upload an image below).

Issues:

1. I am only able to track for a short duration, because the template I selected (the magnet float) keeps changing its appearance as the water level rises or falls (the camera is fixed at one point, so the viewing angle may matter here).
2. There are markings on the tank that I need to read, so I can't place the camera too far from the tank or the markings become hard to see.
3. I am not able to track the template while water is filling the tank, because the template is not clearly visible due to the disturbance caused by the water.

Any suggestions??

Thanks in advance:)

Message 9 of 11

You added to an old post.  Starting a new post would be more effective.

 

I would turn the tube so you get a clear view without the scale obstructing it. Put a solid-color background behind the column. It should then be fairly easy to create a binary image that separates the float from the background: find the widest section and you are done. It might not work for every case, but it should work pretty well for most.
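Bruce's "find the widest section" step is easy to express once you have the binary image. A NumPy sketch with a synthetic column image (the array contents are made up for illustration):

```python
import numpy as np

def float_row(binary):
    """Given a binary image of the column (float = 1, background = 0),
    return the row index where the white region is widest, which is a
    simple way to locate the float against a uniform background."""
    widths = binary.sum(axis=1)          # white pixels per row
    return int(np.argmax(widths))

# Synthetic column: a thin vertical line everywhere (e.g. the tube's
# centreline) plus a wide float spanning 5 pixels at row 6
img = np.zeros((12, 9), dtype=np.uint8)
img[:, 4] = 1
img[6, 2:7] = 1
```

Because this only looks for the widest white run per row, it keeps working even as the float's appearance changes with the water level, which is what broke the template-matching approach.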

 

Bruce

Bruce Ammons
Ammons Engineering
Message 10 of 11