
3D position using camera system - real time

Greetings!

 

I am trying to use LabVIEW to track an object in 3D space. I am using three high-speed cameras which are connected via a frame grabber to my PC. The idea is that I want to be able to throw an object, let's say a ball, and have LabVIEW detect the motion and return real-time stats on the object (location in x-y-z and the velocity vector). The object should be about the size of a bird, as this will be the final object tracked.
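For the velocity stat I am assuming simple finite differences between successive tracked positions will be good enough; here is a rough Python sketch of what I mean (the function name and the numbers are just placeholders, the real thing would be done in LabVIEW):

import numpy as np

def velocity_from_positions(times, positions):
    # times:     (N,) timestamps in seconds
    # positions: (N, 3) x-y-z fixes in meters
    # returns (N-1, 3) velocity vectors in m/s between consecutive fixes
    times = np.asarray(times, dtype=float)
    positions = np.asarray(positions, dtype=float)
    dt = np.diff(times)[:, None]        # time step between fixes
    dp = np.diff(positions, axis=0)     # displacement between fixes
    return dp / dt

# example: three fixes 10 ms apart
t = [0.00, 0.01, 0.02]
p = [[0.00, 0.0, 2.00], [0.05, 0.0, 1.99], [0.10, 0.0, 1.97]]
print(velocity_from_positions(t, p))    # roughly [5, 0, -1] and [5, 0, -2] m/s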

 

I am currently constrained as to where the cameras can be. They have to be positioned in a horizontal line beside one another, but they can be turned to any angle I see fit.

 

I have some ideas of how to approach this, but since this is my first visual project I am a little unsure what the best method will be.

 

One idea I had was to have one camera centered, facing straight forward, and one camera on either side of it, turned 45 degrees toward the center line of vision. The idea would then be to calculate the actual x-y-z position based on the geometry of the cameras. However, this method is less than flexible.
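To make that concrete, here is a rough Python sketch of the geometry I have in mind (2D, ground plane only, with made-up camera positions, angles and focal length; the real implementation would be in LabVIEW). Each camera converts the ball's pixel position into a bearing ray, and the position estimate is the point closest to all the rays in a least-squares sense.

import numpy as np

def ray_from_camera(cam_pos, yaw_deg, focal_px, cx, u):
    # Bearing ray (origin, unit direction) in the ground (x-z) plane from a
    # horizontal pixel coordinate u, for a camera turned yaw_deg off straight ahead.
    angle_off_axis = np.arctan2(u - cx, focal_px)     # angle from the optical axis
    heading = np.radians(yaw_deg) + angle_off_axis    # absolute bearing
    direction = np.array([np.sin(heading), np.cos(heading)])
    return np.asarray(cam_pos, dtype=float), direction

def triangulate(rays):
    # Least-squares point closest to all rays (origin o, unit direction d).
    A = np.zeros((2, 2))
    b = np.zeros(2)
    for o, d in rays:
        P = np.eye(2) - np.outer(d, d)   # projector onto the plane normal to d
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# made-up example: center camera looking straight ahead, a second camera
# 1 m to its right turned 45 degrees back toward the center line of vision
r1 = ray_from_camera(cam_pos=[0.0, 0.0], yaw_deg=0.0,   focal_px=800, cx=640, u=700)
r2 = ray_from_camera(cam_pos=[1.0, 0.0], yaw_deg=-45.0, focal_px=800, cx=640, u=500)
print(triangulate([r1, r2]))   # estimated (x, z) of the ball in meters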

 

Another idea was to eliminate restrictions on the camera positions and have some sort of target system set up which would then "calibrate" the cameras for each particular setup. I am thinking of x and y rulers printed out and hung from the ceiling. The distances between the x ruler markings, compared to the y ruler markings, could then be used to determine the camera angle and what the z component should be. Maybe use a basketball to calibrate instead?
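For the calibration step, what I think I need is the standard "solve for camera pose from a target of known dimensions" problem. Outside LabVIEW this is often done with OpenCV's solvePnP; here is a rough Python sketch just to show the idea (all the target coordinates, pixel measurements and intrinsics below are made-up placeholder numbers, and I would want the LabVIEW equivalent in the end):

import numpy as np
import cv2

# known 3D positions (meters) of marks on the hanging ruler target, in the world frame
object_points = np.array([[0.0, 0.0, 0.0],
                          [0.5, 0.0, 0.0],
                          [1.0, 0.0, 0.0],
                          [0.0, 0.5, 0.0],
                          [0.0, 1.0, 0.0],
                          [0.5, 0.5, 0.0]], dtype=np.float32)

# where those marks appear (pixels) in one camera's image -- measured by hand
# or with a corner/blob detector
image_points = np.array([[320.0, 400.0],
                         [420.0, 398.0],
                         [521.0, 395.0],
                         [318.0, 300.0],
                         [317.0, 200.0],
                         [419.0, 299.0]], dtype=np.float32)

# rough intrinsics: focal length and principal point in pixels
camera_matrix = np.array([[800.0,   0.0, 320.0],
                          [  0.0, 800.0, 240.0],
                          [  0.0,   0.0,   1.0]])
dist_coeffs = np.zeros(5)   # assume negligible lens distortion for this sketch

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
R, _ = cv2.Rodrigues(rvec)
print("camera position in the world frame:", (-R.T @ tvec).ravel())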

 

Solutions? Working code that takes JPEGs from three saved camera files? Any help at all in 3D positioning would be much appreciated.

 

Thanks in advance!!

~Chad

0 Kudos
Message 1 of 6
(3,490 Views)

Chad,

 

Having all three cameras in a straight line is a poor choice for 3D work.  Try putting one at (1,0,0), one at (0,1,0) and one at (0,0,1) or some similar distribution.  Then the angle from the camera axis gives you a starting point for your 3D geometry.

 

Can you arrange that the object is always in the field of view of all three cameras?  Do the cameras/lenses distort off-axis images?

 

If the object is a known size and has a simple geometric shape (sphere, cube, tetrahedron), you may also be able to use the size of its image to get an estimate of distance. Rotation of more complex shapes makes for a difficult calculation.
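Roughly, for a sphere of known diameter the pinhole model gives distance ≈ focal length (in pixels) × true diameter / diameter in the image (in pixels). A quick Python sketch with made-up numbers:

def range_from_apparent_size(focal_px, true_diameter_m, apparent_diameter_px):
    # pinhole-model range estimate for an object of known physical size
    return focal_px * true_diameter_m / apparent_diameter_px

# e.g. an 800-pixel focal length and a 0.24 m ball that appears 40 pixels across
print(range_from_apparent_size(800, 0.24, 40))   # -> 4.8 m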

 

The rulers might be helpful if the object size is known.  If the object size is unknown or variable, the challenge is much greater.

 

Lynn 

Message 2 of 6
(3,481 Views)

Sorry, I don't know what happened to my first post.

 

In response to the question of camera positioning, the reason they are in a line is that this system will eventually be sitting on a ledge, watching the birds come up to the window.

0 Kudos
Message 3 of 6
(3,476 Views)

I would put two on one side of the window and one on the other side.  That gives at least two dimensions. It also makes it a little less likely that the birds will perch (and do messy things) on the cameras.  The third dimension - away from the wall - will be a tricky one.

 

Lynn 

0 Kudos
Message 4 of 6
(3,472 Views)
I should be receiving an engineering article today where two cameras were used at 45-degree angles to accurately measure/position an object in 3D space.  Has anyone done this in LabVIEW?
0 Kudos
Message 5 of 6
(3,444 Views)
Any advice from the article?  I haven't been able to find anybody who has done this before.  I would first think about how to accomplish this theoretically, and try to implement it in LabVIEW after that.  Right now I am not quite sure how you are going to do this with your cameras all in a line.
Nick Keel
Product Manager - NI VeriStand and Model Interface Toolkit
National Instruments
0 Kudos
Message 6 of 6
(3,421 Views)