OK, I've been researching for a while, and I would like to create drivers for the Kinect so we can use it in LabVIEW. The task is not easy at all; it's a challenge. But having the drivers lets us communicate with the Kinect from LabVIEW directly, and that's the first step. The second step is to make a simple program that lets us move the motor or turn the LED on in different colors (there's a C sketch of those calls further down).
If anybody is interested in helping with this task, here are a couple of links that might be helpful for the challenge.
So far, OpenKinect is the website with the most information related to this task; below is the link where they explain how to send the data.
When you install the driver, the Kinect's LED will start blinking; that means the driver was successfully installed. The drivers come as .inf files, so the correct way to install them is to right-click the .inf file and then click the Install option. Connect your Kinect, but don't let the computer install the drivers automatically; you need to install them yourself, so point the installer to the folder where you saved them.
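About the second step (moving the motor and changing the LED color): here is a minimal C sketch of what those calls look like against the OpenKinect library (libfreenect). Treat it as a sketch under my assumptions about the API, not finished code; the idea would be to get it working standalone first and then wrap the same calls in LabVIEW with a Call Library Function Node.

```c
/* Minimal sketch assuming libfreenect (the OpenKinect C library):
   open the first Kinect, set the LED color, tilt the motor, and exit. */
#include <stdio.h>
#include "libfreenect.h"   /* header location depends on your install */

int main(void)
{
    freenect_context *ctx;
    freenect_device *dev;

    if (freenect_init(&ctx, NULL) < 0) {
        fprintf(stderr, "freenect_init failed\n");
        return 1;
    }
    if (freenect_open_device(ctx, &dev, 0) < 0) {  /* 0 = first Kinect found */
        fprintf(stderr, "could not open device 0\n");
        freenect_shutdown(ctx);
        return 1;
    }

    freenect_set_led(dev, LED_RED);      /* also LED_GREEN, LED_YELLOW, ... */
    freenect_set_tilt_degs(dev, 15.0);   /* tilt the head up 15 degrees */

    freenect_close_device(dev);
    freenect_shutdown(ctx);
    return 0;
}
```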
Here is a video I made using the OpenKinect drivers. The work isn't finished yet, but it will be soon.
I'm using the occupancy algorithm that I implemented from the theory.
The second approach is building a map using the Kinect sensor with feedback from an IMU.
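For anyone wondering what the core of an occupancy update looks like, here's a toy C sketch: take the robot's pose (from the IMU/odometry) and one Kinect range reading, and mark the cell the beam ends in as occupied. Every name and size here is invented for illustration; it is not the code from the video.

```c
#include <math.h>
#include <stdint.h>

#define GRID_W 200
#define GRID_H 200
#define CELL_M 0.05f                       /* 5 cm per cell */

static int8_t grid[GRID_H][GRID_W];        /* 0 = unknown, 1 = occupied */

/* Project one range/bearing measurement from the robot's pose into the
   world frame and mark the endpoint cell as occupied. */
void mark_hit(float rx, float ry, float rtheta,   /* pose from IMU/odometry */
              float range, float bearing)         /* one Kinect measurement */
{
    float wx = rx + range * cosf(rtheta + bearing);
    float wy = ry + range * sinf(rtheta + bearing);

    int cx = (int)(wx / CELL_M) + GRID_W / 2;     /* grid centered on origin */
    int cy = (int)(wy / CELL_M) + GRID_H / 2;

    if (cx >= 0 && cx < GRID_W && cy >= 0 && cy < GRID_H)
        grid[cy][cx] = 1;
}
```

A real implementation would also trace the cells along the beam and mark them as free (e.g. with Bresenham's line algorithm), and use log-odds instead of a hard 0/1.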
Thanks for sharing. Very curious about how you did the depth map, and whether you have any code to share on making a good depth map (I'm trying to make one from stereo pairs).
Another quick question... does anyone know the depth resolution of the Kinect system (i.e., an intensity change of x = so many mm of distance)? In other words, how fine a depth difference can it detect?
So far I've found a non-linear relationship. It seems to be pretty fine up close (e.g. ~8,000 divisions across 15 inches). However, farther away it seems to be much less precise, more like a tenth of that resolution. These numbers are approximate, but they're in the right ballpark for order of magnitude.
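That matches the conversion formulas that have been passed around the OpenKinect community. One commonly cited first-order fit (not an official calibration; the constants vary by unit, so calibrate your own) models depth as linear in the inverse of the 11-bit raw value, which is exactly why equal raw steps cover ever-larger depth steps as you get farther away:

```c
/* Rough raw-disparity-to-meters conversion for the Kinect's 11-bit depth
   values, using community-fitted constants from the OpenKinect wiki.
   Depth is linear in 1/raw, so resolution degrades with distance. */
float raw_to_meters(int raw)          /* raw is 0..2047; 2047 = no reading */
{
    if (raw >= 2047)
        return -1.0f;                 /* invalid / out of range */
    return 1.0f / (raw * -0.0030711016f + 3.3309495161f);
}
```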
I'm trying to do the same thing right now, but instead of having the Kinect at 90 degrees (like it's meant to be), the plan is to have the Kinect facing downward to map a field.
Any chance you can help?
I'm kinda new to LabVIEW and used to C++, so it's kinda difficult getting my head around this in LabVIEW.
That sounds like a cool application! Do you know if you'll have good localization or reliable odometry? If so, it shouldn't be too hard to use that information to build a map.
If you can rely on integrating the data from the shaft encoders and gyros to give you your robot's position, then this should be pretty easy. However, because of wheel slippage and gyro drift, this might not be feasible. If you're mapping over a short distance, the drift and slippage may be small enough that you won't notice any accumulated error. In that case, if you assume you always know where your robot is, you can populate a map with the measurements you take from the Kinect. We did something similar (in 2D) for NIWeek: https://decibel.ni.com/content/docs/DOC-13089 and https://decibel.ni.com/content/docs/DOC-13031
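To make "integrating the data from the shaft encoders and gyros" concrete, here's a minimal dead-reckoning sketch for a differential-drive robot (all names are illustrative, not from our NIWeek code). Each pose it produces is what you'd feed into an occupancy update like the one sketched earlier in the thread.

```c
#include <math.h>

typedef struct { float x, y, theta; } Pose;

/* Integrate one timestep: d_left/d_right are wheel travel in meters
   (converted from encoder ticks), gyro_dtheta is the heading change in
   radians reported by the gyro over the same interval. */
void dead_reckon(Pose *p, float d_left, float d_right, float gyro_dtheta)
{
    float d = 0.5f * (d_left + d_right);   /* forward travel of the center */

    /* Trusting the gyro for heading (instead of the wheel baseline)
       helps with wheel slip, but gyro bias still accumulates. */
    p->theta += gyro_dtheta;
    p->x += d * cosf(p->theta);
    p->y += d * sinf(p->theta);
}
```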
Otherwise, I think you're looking for a SLAM solution (Simultaneous Localization and Mapping). There are various approaches to implementing it and lots of papers you can easily find. There doesn't seem to be one standard, reliable way of implementing SLAM, but EKF and GraphSLAM are both pretty popular. Here's a PDF someone put together that explains some approaches: http://ocw.mit.edu/courses/aeronautics-and-astronautics/16-412j-cognitive-robotics-spring-2005/proje...
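If the EKF option sounds abstract, here's what just the prediction step looks like for the robot pose (three states: x, y, theta). This is a toy sketch to show the structure P = F P F^T + Q, nothing close to a full EKF-SLAM filter, which would also carry landmark states and a measurement-update step.

```c
#include <math.h>

static float s[3];      /* state: x, y, theta */
static float P[3][3];   /* state covariance */

/* Predict: move forward d meters, then rotate by dtheta radians.
   qd and qt are diagonal process-noise terms for position and heading. */
void ekf_predict(float d, float dtheta, float qd, float qt)
{
    float c = cosf(s[2]), sn = sinf(s[2]);

    /* Motion model applied to the mean */
    s[0] += d * c;
    s[1] += d * sn;
    s[2] += dtheta;

    /* Jacobian F of the motion model w.r.t. the previous state */
    float F[3][3] = {
        { 1, 0, -d * sn },
        { 0, 1,  d * c  },
        { 0, 0,  1      },
    };

    /* Covariance propagation: P = F * P * F^T + Q */
    float FP[3][3] = {{0}}, newP[3][3] = {{0}};
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            for (int k = 0; k < 3; k++)
                FP[i][j] += F[i][k] * P[k][j];
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            for (int k = 0; k < 3; k++)
                newP[i][j] += FP[i][k] * F[j][k];  /* multiply by F^T */
    newP[0][0] += qd;
    newP[1][1] += qd;
    newP[2][2] += qt;
    for (int i = 0; i < 3; i++)
        for (int j = 0; j < 3; j++)
            P[i][j] = newP[i][j];
}
```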
I am trying to connect the Kinect to my PC, but the Kinect I bought does not have a cable that I can use. Is there a particular cable that I need, and where can I buy it? I live in Australia.