LabVIEW Robotics Documents

Kinect Drivers for LabVIEW

OK, I've been researching this for a while, and I would like to create drivers for the Kinect so it can be used from LabVIEW. The task is not easy at all; it's a challenge. But having drivers that let us communicate with the Kinect directly from LabVIEW is the first step. The second step is to write a simple program that lets us move the tilt motor or switch the LED between its different colors.

If anybody is interested in helping with this task, the links below might be helpful for the challenge.

So far, OpenKinect is the website with the most information related to this task. The page linked below explains how to send data to the device.

http://openkinect.org/wiki/USB_Devices
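For anyone curious what "sending data" looks like at the USB level, here is a minimal sketch using Python and pyusb rather than LabVIEW. The vendor/product IDs and request codes are the ones listed on the OpenKinect wiki (double-check them against the page above), and the helper function names are just placeholders, not part of any official API.

# Sketch: talking to the Kinect's motor/LED device over USB control transfers.
# Requires pyusb (pip install pyusb) and a working libusb backend.
import usb.core

KINECT_MOTOR_VID = 0x045E   # Microsoft
KINECT_MOTOR_PID = 0x02B0   # "Xbox NUI Motor" (motor, LED, accelerometer)

dev = usb.core.find(idVendor=KINECT_MOTOR_VID, idProduct=KINECT_MOTOR_PID)
if dev is None:
    raise RuntimeError("Kinect motor device not found - is the driver installed?")
dev.set_configuration()

def set_led(option):
    # LED options per the OpenKinect notes: 0 = off, 1 = green, 2 = red,
    # 3 = yellow, 4 = blinking green, ...
    dev.ctrl_transfer(0x40, 0x06, option, 0x0, [])

def set_tilt(angle_degrees):
    # Tilt command; wValue is twice the desired angle in degrees.
    dev.ctrl_transfer(0x40, 0x31, 2 * angle_degrees, 0x0, [])

set_led(2)     # turn the LED red
set_tilt(10)   # tilt the sensor head up by 10 degrees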

The drivers come as .inf files, so the correct way to install them is to right-click an .inf file and then choose the Install option. Connect your Kinect, but don't let the computer install the drivers automatically; install them yourself by pointing Windows to the folder where you saved them. Once the driver is installed correctly, the Kinect's LED will start blinking, which means the installation was successful.

[Image: USB Kinect.png]

Here is a video I made using the drivers from OpenKinect. The work is not finished yet, but it will be ready soon.

I'm using an occupancy algorithm that I implemented from the theory.

The second approach is to build a map using the Kinect sensor with feedback from an IMU.
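As a rough illustration of the occupancy idea (sketched in Python rather than LabVIEW, since a block diagram can't be pasted here), this is a minimal 2D grid that accumulates Kinect range points. The cell size, grid dimensions, and function names are placeholder choices for the sketch, not the actual implementation.

# Minimal 2D occupancy-grid sketch (Python/NumPy stand-in for the LabVIEW code).
# Assumes depth pixels have already been converted to (x, y) points in metres
# in the world frame, e.g. using the robot pose from the IMU/odometry.
import numpy as np

CELL_SIZE = 0.05          # metres per grid cell
GRID_SHAPE = (400, 400)   # 20 m x 20 m map
ORIGIN = (200, 200)       # world (0, 0) sits at the centre of the grid

grid = np.zeros(GRID_SHAPE, dtype=np.int16)   # hit counts per cell

def world_to_cell(x, y):
    return (int(round(y / CELL_SIZE)) + ORIGIN[0],
            int(round(x / CELL_SIZE)) + ORIGIN[1])

def add_scan(points_xy):
    # Accumulate one Kinect scan (iterable of (x, y) world points) into the grid.
    for x, y in points_xy:
        r, c = world_to_cell(x, y)
        if 0 <= r < GRID_SHAPE[0] and 0 <= c < GRID_SHAPE[1]:
            grid[r, c] += 1   # more hits -> more confident the cell is occupied

# Example: a small wall segment one metre in front of the robot
add_scan([(1.0, y) for y in np.linspace(-0.5, 0.5, 50)])
occupied = grid > 3          # simple threshold to get a binary occupancy map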

"Everything that has a beginning has an end"
Comments
Member eek
Member

Thank you for sharing, anfredes! I featured your code on the LabVIEW Robotics blog as well. Take a look:

http://labviewrobotics.wordpress.com/2011/04/07/more-labview-development-for-the-xbox-kinect/

Member JasonC1017
Member

Thanks for sharing. I'm very curious about how you did the depth map and whether you have any code to share on making a good depth map (I'm trying to make one from stereo pairs).

Thanks,

Jason

Member JasonC1017
Member

Another quick question: does anyone know the depth resolution of the Kinect system (i.e., a color intensity change of x corresponds to how many mm of distance)? In other words, how fine a depth difference can it detect?

Thanks,

Jason

Member RoboticsME
Member

So far I've found a non-linear relationship. It seems to be pretty fine up close (e.g. ~8,000 divisions across 15 inches). However, further away it seems to be much less precise, more like a tenth of that resolution. These numbers are approximate, but they are in the right ballpark for order of magnitude.

Member RoboticsME
Member

I made a mistake.  Take my last post and divide by 32 (I missed that in looking over the code).  Therefore, roughly 250 divisions over 15 inches, etc.

I'm currently using an 8th-order polynomial fit for the data, and it seems to be working well. I'll post a video soon.
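For anyone who wants to try the same thing, here is a rough Python/NumPy sketch of the kind of fit I mean: map raw Kinect depth readings to measured distances with a high-order polynomial. The (raw, distance) pairs below are made-up placeholders; you would collect real pairs with a tape measure before trusting the fit.

# Sketch of the calibration fit: raw Kinect depth readings -> distance.
import numpy as np

raw_values  = np.array([500, 560, 620, 680, 740, 800, 860, 920, 980, 1020])
distance_in = np.array([ 30,  35,  41,  48,  57,  68,  83, 103, 130,  155], dtype=float)

# Polynomial.fit rescales the x data internally, which keeps an 8th-order
# fit numerically well behaved.
fit = np.polynomial.Polynomial.fit(raw_values, distance_in, deg=8)

print(fit(820))   # estimated distance (inches) for a raw reading of 820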

http://decibel.ni.com/content/blogs/MechRobotics

Member Freexerohmatic
Member

This is amazing!

I'm trying to do the same thing right now, but instead of having the Kinect at 90 degrees (like it's meant to be), the plan is to have the Kinect face downwards to map a field.

Any chance you can help?

I'm kind of new to LabVIEW and used to C++, so it's kind of difficult getting my head around this in LabVIEW.

Member RoboticsME
Member

That sounds like a cool application! Do you know if you'll have good localization or reliable odometry?  If so, it shouldn't be too hard to use that information to build a map. 

Member Freexerohmatic
Member

Yes,

we are planning to use some shaft encoders and gyros.

I just noticed how bad my description was.

The plan is to mount the Kinect high so that it can see over objects and map the area before moving (hence facing it downwards, but not straight down).

Cheers

Member RoboticsME
Member

If you can rely on integrating the data from the shaft encoders and gyros to give you your robot's position, then this should be pretty easy.  However, because of wheel slippage and gyro drift, this might not be feasible.  If you're mapping over a short distance, the drift and slippage may be small enough that you won't notice any accumulated error.  Therefore, if you assume you always know where your robot is, you can populate a map with the measurements you take from the Kinect.  We did something similar (in 2D) for NIWeek: https://decibel.ni.com/content/docs/DOC-13089 and https://decibel.ni.com/content/docs/DOC-13031
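To make the dead-reckoning part concrete, here is a minimal sketch (in Python, purely for illustration; the class and method names are placeholders) of integrating encoder distance and gyro heading into a pose estimate.

# Dead-reckoning sketch: each update, take the distance travelled from the
# wheel encoders and the heading change from the gyro, and accumulate an
# (x, y, theta) estimate. Drift and wheel slip accumulate over time, which is
# why this only works well over short runs.
import math

class DeadReckoning:
    def __init__(self):
        self.x = 0.0       # metres
        self.y = 0.0       # metres
        self.theta = 0.0   # radians, e.g. integrated gyro yaw

    def update(self, d_distance, d_theta):
        # d_distance: distance from encoders since last update (m)
        # d_theta: heading change from the gyro since last update (rad)
        self.theta += d_theta
        self.x += d_distance * math.cos(self.theta)
        self.y += d_distance * math.sin(self.theta)
        return self.x, self.y, self.theta

pose = DeadReckoning()
pose.update(0.10, 0.0)                # drive 10 cm straight ahead
print(pose.update(0.10, math.pi / 2)) # then 10 cm more after a 90-degree turn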

Otherwise, I think you're looking for a SLAM (Simultaneous Localization and Mapping) solution. There are various approaches to implementing it and lots of papers you can easily find. There doesn't seem to be one standard, reliable way of implementing SLAM, but EKF-SLAM and GraphSLAM are both pretty popular. Here's a PDF someone put together that explains some approaches: http://ocw.mit.edu/courses/aeronautics-and-astronautics/16-412j-cognitive-robotics-spring-2005/proje...

Wikipedia also has a decent overview.

Member Freexerohmatic
Member

Thanks for the help!

Member Ree_A
Member

Hello Everyone,

I am trying to connect the Kinect to my PC, but the Kinect I bought does not have a cable that I can use. Is there a particular cable that I need, and where can I buy it? I live in Australia.

Cheers,

Thanks

Member Ree_A
Member

Thank you Superjing.

I will try checking a few stores here and see if they have it. Which port does the middle part of the cable connect to (which Kinect port)?

I will keep you updated once I get the cable.

Cheers

Member eek
Member

Hey everyone,

During the 2012 NIWeek keynote, students at the University of Leeds unveiled their own LabVIEW interface for the Kinect.

They've made it available for download via the LabVIEW Tools Network.

You can find more info and download the eval here: http://sine.ni.com/nips/cds/view/p/lang/en/nid/210938
