Machine Vision


Pick and place using LabVIEW and NI Vision Acquisition


Hi everyone,


I am doing a student project on vision-guided pick and place with an industrial robot (ABB). I would like to know the steps involved in creating the block diagram.

I have to locate the object and get its coordinates through a webcam, then do pattern matching and send the coordinates to a microcontroller, and from the microcontroller to the robot controller. The industrial robot should then pick the object and place it in a predefined place.

I would be extremely grateful if you guys could help me, since I am new to LabVIEW.






Message 1 of 5
Accepted by ImPradeepm

What you are describing is fairly involved, but here are some tips. The key is to correlate the robot's coordinate system with the camera's coordinate system. I assume the camera is statically mounted above the pick-up area? I would jog the robot to each corner of the camera frame at its pick height, and note the robot's position at those locations. These 4 points in space will be correlated to the X,Y pixel coordinates of the camera frame. You basically need to write a sub-VI whose inputs are the pixel X and Y coordinates and whose output is the corresponding robot coordinates.
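The corner-teaching idea above can be sketched in text form (Python here purely for illustration; the same math goes inside the LabVIEW sub-VI). This minimal version assumes the camera axes are aligned with the robot axes, so two opposite taught corners are enough to derive a scale and offset per axis; all coordinate values are made-up examples.

```python
def make_pixel_to_robot(corner1_px, corner1_rob, corner2_px, corner2_rob):
    """Build a pixel->robot converter from two taught opposite corners.

    corner1_px/corner2_px: (x, y) pixel coordinates of two opposite frame corners.
    corner1_rob/corner2_rob: robot (X, Y) recorded at those corners, at pick height.
    Assumes camera and robot axes are parallel (no rotation between frames).
    """
    sx = (corner2_rob[0] - corner1_rob[0]) / (corner2_px[0] - corner1_px[0])
    sy = (corner2_rob[1] - corner1_rob[1]) / (corner2_px[1] - corner1_px[1])

    def convert(px, py):
        # Linear map: offset of the first corner plus scaled pixel displacement.
        return (corner1_rob[0] + (px - corner1_px[0]) * sx,
                corner1_rob[1] + (py - corner1_px[1]) * sy)

    return convert
```

If the camera is rotated relative to the robot, you would instead fit a full affine transform from all four taught corners (least squares), but the two-corner version is the easiest thing to get working first.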


Write a test application telling the robot to go to any X,Y pixel location in the frame to test your sub-VI.  If that is working, then you need to set up a pattern match.  You will likely want to do a geometric pattern match.  Have a look at this example:


You will need your pattern-match algorithm to return both the coordinates for your robot and the orientation of the tool required to properly pick up the object (if the pick-and-place tool must be in a specific orientation). So it's basically up to you to convert the object's X, Y, and rotation angle in the frame, as returned by the pattern match, to whatever coordinate system the robot uses.
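That conversion step can be sketched as follows (Python for illustration; in LabVIEW this would be a small sub-VI downstream of the pattern-match VI). The `pixel_to_robot` calibration function and the `camera_angle_offset_deg` parameter are assumptions standing in for whatever calibration you built in the corner-teaching step.

```python
def match_to_robot_pose(px, py, angle_deg, pixel_to_robot,
                        camera_angle_offset_deg=0.0):
    """Convert a pattern-match result (pixel x, y, rotation) to a robot pose.

    pixel_to_robot: calibration function mapping pixel (x, y) -> robot (X, Y).
    camera_angle_offset_deg: fixed rotation between camera and robot frames,
    if any (assumed 0 when the axes are aligned).
    """
    rx, ry = pixel_to_robot(px, py)
    # Tool rotation = object angle in the image plus the fixed frame offset.
    tool_angle = (angle_deg + camera_angle_offset_deg) % 360.0
    return rx, ry, tool_angle
```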


The placing algorithm could just be an orientation adjustment to put the object into placement orientation, and the placement positions could be an array of robot coordinates that you iterate through after each pick.


Make sure to implement some safety mechanisms in your algorithms so that the robot can never move somewhere outside a safe range of coordinates.
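A minimal version of such a guard, in Python for illustration (the workspace limits are made-up values; in LabVIEW this would be an In Range check before every move command is sent):

```python
# Assumed robot workspace limits in mm -- replace with your cell's real envelope.
SAFE_X = (100.0, 420.0)
SAFE_Y = (200.0, 440.0)

def is_safe(x, y):
    """True if the target lies inside the allowed rectangular envelope."""
    return SAFE_X[0] <= x <= SAFE_X[1] and SAFE_Y[0] <= y <= SAFE_Y[1]

def guarded_move(x, y, move_fn):
    """Refuse to issue the move if the target is outside the safe envelope."""
    if not is_safe(x, y):
        raise ValueError(f"target ({x}, {y}) outside safe envelope; move refused")
    move_fn(x, y)
```

Note that a software check like this is a convenience, not a substitute for the robot controller's own configured work-zone limits.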

Message 2 of 5

What have you got so far?

From looking at your workflow you have 4 key sections

1) Object detection

2) Object recognition

3) Serial? Communication

4) Microcontroller stuff


1 and 2 are linked together.

First you need to use something like IMAQ (not sure whether it comes with the Vision Development Module or Vision Acquisition Software) to detect objects within the webcam's field of view, then perform some kind of filtering to keep only the one(s) you want (for instance, you could remove all objects that are too large or too small). The location of the object is then fairly easy to get (in pixel coordinates).
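The size-filtering step is essentially this (Python for illustration; in LabVIEW it corresponds to a particle filter after IMAQ particle analysis). The blob dictionaries here are a hypothetical stand-in for whatever your detection step reports per object:

```python
def filter_by_area(blobs, min_area, max_area):
    """Keep only detected blobs whose pixel area is within [min_area, max_area].

    blobs: list of dicts with at least 'x', 'y', 'area' keys, as produced by
    some particle-analysis step (structure assumed for this sketch).
    """
    return [b for b in blobs if min_area <= b["area"] <= max_area]
```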


Before you do 3, you need to work out what it is you want to tell the microcontroller. Just the current location of the object? Current and new location? Depending on how the microcontroller is programmed, you may have to convert the object location(s) into something useful for it. For example, does the industrial robot operate on raw X-Y-Z coordinates, or does it use angles for its joints?
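Whatever you decide to send, you need an agreed byte layout on both ends. Here is one possible framing in Python (the header byte, field order, and XOR checksum are all assumptions for illustration, not an existing protocol; the microcontroller side would parse the same layout):

```python
import struct

def encode_pick_command(x_mm, y_mm, angle_deg):
    """Pack a pick target into a simple serial frame.

    Hypothetical layout: 0xAA header byte, three little-endian 32-bit floats
    (X mm, Y mm, tool angle deg), then a one-byte XOR checksum of the payload.
    """
    payload = struct.pack("<fff", x_mm, y_mm, angle_deg)
    checksum = 0
    for b in payload:
        checksum ^= b
    return bytes([0xAA]) + payload + bytes([checksum])
```

On the LabVIEW side you would build the same bytes with Flatten To String (or Type Cast) and write them with VISA Write; the point is just that both ends agree on the exact byte order and framing.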


Microcontroller stuff - You need to work out how it communicates with the robot and program it appropriately. Keep in mind you'll need to program it to receive communication from LabVIEW.



Detect objects -> Filter objects -> Determine location of object to "pick" -> Generate commands to send to microcontroller -> Send to microcontroller



Receive from LabVIEW -> generate "pick" command -> send "pick" command to robot -> wait (for it to pick) -> generate "place" command -> send "place" command -> wait (for it to place)


then repeat as necessary.
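The microcontroller sequence above is a small state machine. A sketch in Python (the state and event names are invented for illustration; on a real microcontroller this would be the same logic in C inside its main loop):

```python
from enum import Enum, auto

class State(Enum):
    WAIT_COMMAND = auto()  # idle, waiting for coordinates from LabVIEW
    PICKING = auto()       # "pick" command sent, waiting for completion
    PLACING = auto()       # "place" command sent, waiting for completion

def step(state, event):
    """Advance the pick-and-place cycle on an event; ignore invalid events.

    Events (assumed names): "coords_received", "pick_done", "place_done".
    """
    transitions = {
        (State.WAIT_COMMAND, "coords_received"): State.PICKING,
        (State.PICKING, "pick_done"): State.PLACING,
        (State.PLACING, "place_done"): State.WAIT_COMMAND,
    }
    return transitions.get((state, event), state)
```

Making invalid events no-ops (rather than errors) keeps the loop robust if a serial message arrives out of order.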

Message 3 of 5

Next, feed the pattern extracted in the previous step into the pattern-search function block (parameters such as rotation angle and minimum score go into the SETTINGS control), then wire the search function's output to the position indicators that will be displayed on the front panel.



Message 4 of 5

Hi, I know it was a long time ago, but I'm also doing a student project on a robotic arm pick and place. If you could help me out, that would be great. Thank you.

Message 5 of 5