
Labview 8.6 Webcam to control a puppet

Hi guys.

I'm really new to LabVIEW 8.6 and always seem to get confused by the many modules LabVIEW has.

For my assignment, I'm required to control a puppet with the movement of my body.

The idea I have in my head would be:

 

The webcam uses edge detection and filters out everything except my body's silhouette.

When I move my right hand 45 degrees upwards, the puppet follows. The puppet's hand is attached by a string to a servo motor, which spins to either let out more string (lowering the arm) or reel it in (raising it).

LabVIEW is required to send binary data to the PIC, which controls the servo motors.
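Just to make that mapping concrete, here is a rough sketch in Python of what I mean (since I can't paste a LabVIEW diagram as text): take a tracked shoulder point and hand point, work out the arm angle, and turn it into a servo command for the PIC. The ±90 degree range and the one-byte encoding are only assumptions for illustration, nothing we've decided yet.

import math

def arm_angle_deg(shoulder_xy, hand_xy):
    # Angle of the arm above horizontal, in degrees (image y grows downward).
    dx = hand_xy[0] - shoulder_xy[0]
    dy = shoulder_xy[1] - hand_xy[1]  # flip sign so "up" is positive
    return math.degrees(math.atan2(dy, dx))

def angle_to_servo_byte(angle_deg, lo=-90.0, hi=90.0):
    # Clamp the angle and map it onto one 0-255 byte for the PIC (assumed encoding).
    angle_deg = max(lo, min(hi, angle_deg))
    return int(round((angle_deg - lo) / (hi - lo) * 255))

# Hand 45 degrees above the shoulder -> roughly three quarters of the byte range.
print(angle_to_servo_byte(arm_angle_deg((100, 200), (170, 130))))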

 

 

I'm in a dilemma right now because I'm not sure whether I should be using a LabVIEW VI for this, NI Vision Assistant, or NI Vision. I'm really new to LabVIEW and I'm not getting much help from seniors and lecturers: the lecturers at my college have never used LabVIEW, and the seniors all used Matlab for image processing.

 

It would be nice if somebody could point me in the right direction and/or give me tutorials relating to my assignment. Suggestions are most welcome. 🙂

 

Thank you.

-Wolf

Message 1 of 4

Using LabVIEW would be best since you are not only doing image processing but also communicating with the PIC. 

 

In order to use the webcam, you will need to use the NI-IMAQ for USB Cameras driver. This software will allow you to configure any DirectShow imaging device and acquire images into LabVIEW.
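Since LabVIEW block diagrams can't be pasted as text here, the snippet below is only a rough text-language sketch (Python with OpenCV, device index 0 assumed) of the same acquire-and-process loop you would build on top of the driver: open the camera, grab a frame each iteration, process it, and display it.

import cv2

cap = cv2.VideoCapture(0)            # open the first webcam (index 0 is an assumption)

while True:
    ok, frame = cap.read()           # grab one frame per loop iteration
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)   # placeholder processing step
    cv2.imshow("preview", gray)                      # stand-in for an image display
    if cv2.waitKey(1) & 0xFF == ord("q"):            # press q to stop
        break

cap.release()
cv2.destroyAllWindows()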

 

If you want basic image manipulation functions, use the NI Vision Acquisition Software. If you want advanced image processing functions, use the Vision Development Module. I am not sure exactly what you want to do, so I would suggest you download and try both packages and figure out which one is appropriate for you. From what I understand, you will probably want the Vision Development Module.

 

Next, we need to know how you are going to send the binary data to the PIC. What communication protocol are you using to talk to it?

 

Note: All the software suggestions are for version LabVIEW 8.6.

Message Edited by Adnan Z on 11-12-2009 10:40 AM
Adnan Zafar
Certified LabVIEW Architect
Coleman Technologies
Message 2 of 4

Hi Wolf,

 

LabVIEW is much better suited for this application than Matlab. If it were simply motion detection or motion history imaging, I would have recommended Matlab, not because LabVIEW performs any worse, but because, as you mentioned, Matlab support is available at your college. But since you need to talk to the PIC that makes the puppet move, LabVIEW is much better suited.

 

After you've made up your mind on the software platform, you need to set the algorithm straight. I do not know the complete details of your project, but it is important to know whether you will be performing the movements in a controlled environment or outdoors. This matters because in a controlled environment the background contrast makes edge detection easier and much more reliable; out in the open there are far too many variables for comfort.
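To make the contrast point concrete, here is a minimal sketch (in Python/OpenCV rather than a LabVIEW diagram, purely for readability; the file name and the threshold value 60 are assumptions): with a dark backdrop, one global threshold pulls out the silhouette, and edge detection on that mask is then straightforward.

import cv2

frame = cv2.imread("frame.png")                 # one saved webcam frame (placeholder name)
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Against a black backdrop the subject is much brighter than the background,
# so a single global threshold is enough to separate the silhouette.
_, silhouette = cv2.threshold(gray, 60, 255, cv2.THRESH_BINARY)

edges = cv2.Canny(silhouette, 100, 200)         # edge detection on the clean binary mask
cv2.imwrite("silhouette.png", silhouette)
cv2.imwrite("edges.png", edges)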

 

Perhaps you will need to focus more on morphological operations. My understanding is that you want the puppet to move three parts in response to your movement: the head and the two arms. Some time ago I came across an algorithm that did skin detection. Since the face and hands (sometimes up to the elbows) are usually exposed, the skin is visible and can be detected by the algorithm. The skin regions are isolated and their motion can be tracked, and the tracked motion can then be sent to the PIC to make the puppet move.
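A minimal sketch of that skin-detection idea, again in Python/OpenCV for readability (the HSV range below is a rough assumption and would need tuning to your lighting): threshold to a skin mask, keep the largest blobs (face and two hands), and track their centroids from frame to frame.

import cv2
import numpy as np

def skin_centroids(frame_bgr, max_regions=3):
    # Return centroids of the largest skin-coloured regions (face + two hands).
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Very rough skin range in HSV -- an assumption, tune it for your conditions.
    mask = cv2.inRange(hsv, np.array([0, 40, 60]), np.array([25, 180, 255]))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contours = sorted(contours, key=cv2.contourArea, reverse=True)[:max_regions]

    centroids = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] > 0:
            centroids.append((int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])))
    return centroids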

 

I used this algorithm in Matlab some time back and the results were pretty good.

Regards
Asad Tirmizi
Design Engineer
Institute of Avionics and Aeronautics

" Its never too late to be, what u want to be"
Using LabVIEW 8.2
Message 3 of 4

To Adnan Z,

Yes, I downloaded and installed the NI-IMAQ for USB Cameras driver a while back, but when I try to run some of the provided examples I come across a "No Interface" error when using IMAQ for USB; other IMAQ USB examples run fine.

 

I don't intend to manipulate the image, but rather to track multiple locations in real time, so based on that I guess the Vision Development Module would be my pick. If I had the Vision Development Module installed, it would appear in my Functions palette under the "Vision and Motion" drop-down menu as Machine Vision and Vision Utilities, yes?

 

Well, my assignment group consists of 3 members. I'm supposed to handle the image processing, which in general is supposed to send an output to my second partner, whose task is to convert my outputs (be it binary, hex, etc.) into something the PIC can understand. The PIC section belongs to the third partner. If I'm not mistaken, my second partner is going to communicate with the PIC via RS-232, which sends ASCII.
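Just to pin down what my side would hand over: something like one ASCII line per frame over the serial port. A rough Python/pyserial sketch of the idea (the COM port, baud rate and the "L:...;R:..." message format are made up, we'd still have to agree on them; in LabVIEW itself I guess this would be the VISA serial write functions):

import serial

# Port name and baud rate are assumptions -- they must match the PIC side.
port = serial.Serial("COM3", 9600, timeout=1)

def send_limb_angles(left_deg, right_deg):
    # One ASCII line per frame, e.g. "L:045;R:120" -- a made-up format.
    msg = "L:%03d;R:%03d\n" % (int(left_deg), int(right_deg))
    port.write(msg.encode("ascii"))

send_limb_angles(45, 120)
port.close()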

 

I've thought of a few ideas which may or may not be applicable in LabVIEW on my part. Please advise.

1) Have each of my hands and legs wear a completely different colour, so LabVIEW would accurately know which body part I'm moving and thus control the output.

2) Have a grid system? Say when a certain grid cell is filled by an obstacle (e.g. my hand), it would output the way I want it to?

3) Combine both ideas and have a quadruple grid layer? Meaning one grid just for my left hand, which is signified by red and has an area tolerance only on my left-hand side, so that if I moved my left hand towards my right-hand side, it wouldn't send the output as if my right hand had moved into that area. (A rough sketch of what I mean by ideas 1 and 3 follows below.)
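Roughly what I picture ideas 1 and 3 amounting to, sketched in Python/OpenCV only because I can't draw a LabVIEW diagram here (the colour ranges and the left/right split are made-up examples): each limb gets its own colour, and each colour is only searched for inside the region that limb is allowed to occupy.

import cv2
import numpy as np

# Made-up per-limb HSV colour ranges and allowed horizontal regions (fractions of width).
LIMBS = {
    "left_hand":  {"lo": (0, 80, 80),  "hi": (10, 255, 255),  "x_range": (0.0, 0.5)},  # red marker, left half (ignores the red hue wrap-around)
    "right_hand": {"lo": (40, 80, 80), "hi": (80, 255, 255),  "x_range": (0.5, 1.0)},  # green marker, right half
}

def locate_markers(frame_bgr):
    # Return {limb_name: (x, y)} for each coloured marker found inside its own region.
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    h, w = hsv.shape[:2]
    found = {}
    for name, cfg in LIMBS.items():
        x0, x1 = int(cfg["x_range"][0] * w), int(cfg["x_range"][1] * w)
        mask = cv2.inRange(hsv[:, x0:x1], np.array(cfg["lo"]), np.array(cfg["hi"]))
        ys, xs = np.nonzero(mask)
        if len(xs):
            found[name] = (int(xs.mean()) + x0, int(ys.mean()))
    return found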

 

 

To Asad_Tirmizi,

Well, my college has little knowledge of LabVIEW, so I'm pretty much only able to learn by myself and from this forum. Some of the other students are doing image processing as well, but their task is to use Matlab, and theirs looks so much easier. To be honest though, I wouldn't really know, since I've studied a little Matlab before but have no knowledge of LabVIEW at all.

 

Yes, it will be in a controlled environment. I'll probably have a black cloth hanging in the background to make the detection more accurate thanks to the contrast. How does edge detection fare in LabVIEW? I've been looking for tutorials or even complete projects by other students on Youtube.com, but I've yet to find any team using edge detection. Most of the complete projects are related to single-colour tracking, whereas mine requires me to track 4 points simultaneously. Is that possible?

 

My intention is to move 4 parts of my puppet in 2D: both hands and both legs. An algorithm for skin detection? Do you mean only the silhouette of the body is picked up, while other edges, for example the collar bone's edge, are ignored? Please elaborate on this in terms of LabVIEW. I'm intrigued; at least now I know it is possible to track multiple points using edge detection in LabVIEW.

 

Message Edited by Wolfie10k on 11-16-2009 12:46 PM
Message 4 of 4