Student Projects


Using LabVIEW and CompactRIO to Put Blind Drivers Behind the Wheel

Contact Information:


University and Department: Virginia Tech Mechanical Engineering

Team Members: Kimberly Wenger (Team Leader), Matthew Becker, Zach Berube, Rett Boehling, Adam Broda, Nina Camoriano, Matthew Dowden, Laura Degitz, Drew Fausnacht, Peter LaVigne, Julie McConaughy, Danny Raynes, Kenny Raynes

Faculty Advisor: Dr. Dennis Hong

Primary Email Address:

Primary Telephone Number: (540) 231-7195


Project Information:


Project Title: Using LabVIEW and CompactRIO to Put Blind Drivers Behind the Wheel

List all parts (hardware, software, etc.) you used to design and complete your project:

NI CompactRIO 9072, NI 9485, NI 9221, NI 9401, Hokuyo UTM-30LX, Celesco String Potentiometer, Hall Effect Sensor, Vicor DC-DC Power System, NI LabVIEW with FPGA and Real-Time Modules, SolidWorks


Describe the challenge your project is trying to solve:




     The National Federation of the Blind (NFB) works to increase the independence of the blind, promote their often underestimated capabilities, break down society's stereotypes, and inspire innovation in the development of blind access technologies. As a catalyst for this initiative, the NFB proposed a challenge to design a system capable of providing the blind with an experience never thought possible: the ability to drive. The Robotics and Mechanisms Laboratory (RoMeLa) at Virginia Tech has been the only organization to accept the challenge. Reestablished in 2008 as a senior design team and undergraduate research project within the Department of Mechanical Engineering, the Virginia Tech Blind Driver Challenge (BDC) defined the initial goals for the world’s first working prototype of a blind driver vehicle.


Describe how you addressed the challenge through your project:



     Over two semesters, a team of thirteen undergraduate students raised $13,000 and delivered a vehicle that a blind driver could operate independently. The blind driver was expected to safely perform three fundamental driving tasks: navigate a curved driving course defined by a single lane of traffic cones, regulate speed within a predefined limit, and stop quickly enough to avoid colliding with an obstacle.


Our Prototyping Platform


     NI products have served as the sole hardware and software interface for the blind driver system since the project’s inception. We chose NI products because we needed a cost-effective prototyping platform; high-speed data acquisition and processing to minimize lag in time-critical driving environments; compatibility with numerous sensors and devices; power and reliability in demanding testing conditions; an intuitive programming interface; modularity across vehicle platforms; minimal size and weight; and available capacity for hardware expansion during future development.


Environmental Perception


     The current blind driver system consists of various sensors and novel nonvisual driver interfaces attached as a modular system to a modified dune buggy or golf cart. For environmental perception, we use a Hokuyo UTM-30LX single-plane laser rangefinder (LRF) to scan the driving environment for cones and other obstacles. Thanks to NI engineers, we were able to download a LabVIEW driver for the Hokuyo LRF before the UTM-30LX was even available for purchase by the general public. A laptop running LabVIEW acquires the LRF data and forwards it to the onboard CompactRIO over Ethernet. The laptop also allows a sighted passenger to passively monitor the operation of all hardware and software and easily modify any heuristic-based parameters for quick calibration during field testing. Additional sensors gather important information about the state of the vehicle, such as speed from a Hall effect sensor and steering angle from a string potentiometer. We acquire and process data from these sensors directly on the cRIO's high-speed FPGA.
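To illustrate the kind of signal processing the FPGA performs, here is a minimal Python sketch of turning raw sensor readings into vehicle state. The team's actual implementation is in LabVIEW; the wheel geometry, pulse count, and voltage range below are illustrative assumptions, not the real calibration values.

```python
# Illustrative sketch (not the team's LabVIEW code): converting raw
# Hall effect and string-potentiometer readings into vehicle state.

WHEEL_CIRCUMFERENCE_M = 1.57   # assumed wheel size
PULSES_PER_REV = 4             # assumed magnets per wheel revolution

def speed_from_hall(pulse_interval_s: float) -> float:
    """Vehicle speed (m/s) from the time between Hall effect pulses."""
    if pulse_interval_s <= 0:
        return 0.0
    return (WHEEL_CIRCUMFERENCE_M / PULSES_PER_REV) / pulse_interval_s

def steering_from_pot(voltage: float, v_min: float = 0.5,
                      v_max: float = 4.5,
                      angle_range_deg: float = 70.0) -> float:
    """Steering angle (degrees, 0 = straight) from string-pot voltage,
    assuming a linear mapping over the pot's travel."""
    frac = (voltage - v_min) / (v_max - v_min)
    return (frac - 0.5) * angle_range_deg
```

On the real system these computations run on the FPGA so the latency stays deterministic regardless of what the host processor is doing.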


Non-visual Driver Interfaces


     After collecting an image of the driving environment using the various sensors, we process the information and transmit it to the driver through non-visual cues. The ultimate goal when developing a non-visual driver interface (NVDI) is to effectively and efficiently provide information to a driver to maximize situational awareness and allow the driver to make quick and precise driving decisions. The array of NVDIs on the second iteration of the vehicle is a combination of informational and instructional cues for safety and redundancy.

     For speed regulation, the driver receives cues to accelerate or decelerate. This design increases the intuitiveness of the interface and more closely mimics a true driving environment. If the driver is operating below the speed limit, a vibrotactile shoe and right leg strap tell the driver what degree of acceleration is necessary to reach the safe operating speed. If the driver exceeds the speed limit by a small margin, a vibration in the heel of the shoe cues the driver to lift their foot off the pedal. If the driver exceeds that margin, the vibrotactile shoe and left leg strap tell the driver what degree of braking is necessary to return to a safe operating speed.
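The cue logic described above can be sketched as a simple zone mapping. This is a hypothetical Python illustration; the actual zone widths and vibration intensities are the team's tuning parameters and are not published here.

```python
def speed_cue(speed: float, limit: float, tolerance: float = 0.5):
    """Map vehicle speed to a vibrotactile cue, per the scheme described
    above. Zone widths and intensities are illustrative assumptions."""
    error = speed - limit
    if error < -tolerance:
        # Below the limit: right-leg strap signals how hard to accelerate.
        return ("accelerate", min(1.0, -error / limit))
    if error <= tolerance:
        # Within tolerance: no cue needed.
        return ("hold", 0.0)
    if error <= 2 * tolerance:
        # Slightly over: heel vibration cues the driver to lift off the pedal.
        return ("lift", 0.5)
    # Well over the limit: left-leg strap signals braking intensity.
    return ("brake", min(1.0, error / limit))
```

Expressing the interface as informational zones rather than raw speed keeps the driver's cognitive load low: each cue maps directly to one pedal action.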

     For steering guidance, a potential field algorithm generates a path from the LRF data. After calculating the path, the system instructs the driver where to steer to maintain the lane and avoid obstacles. The driver is informed of the intensity of the upcoming turn via a pair of vibrating gloves called DriveGrip. The DriveGrip system consists of a pair of four-fingered tactile gloves with a motor attached to each finger. The motors can be actuated individually to vibrate different parts of the driver’s hand, and the system communicates the need to turn right or left by activating the motors on the respective hand. The driver’s objective is to continually keep the vibration centered between his or her index fingers; the severity of the turn dictates how far from “center” the vibrations originate.
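A potential field planner treats obstacles as repulsive forces and the desired direction of travel as an attractive force, then steers along the resultant. The following Python sketch shows the general technique and a hypothetical mapping onto DriveGrip's fingers; gains, the sign convention, and the finger mapping are illustrative assumptions, not the team's implementation.

```python
import math

def potential_field_heading(obstacles, goal_bearing_rad=0.0):
    """Toy potential-field steering: each obstacle (bearing_rad, range_m)
    in the vehicle frame repels, the goal direction attracts; returns the
    bearing of the resultant force. Gains are illustrative."""
    fx = math.cos(goal_bearing_rad)      # unit attractive force toward goal
    fy = math.sin(goal_bearing_rad)
    for bearing, rng in obstacles:
        k = 1.0 / max(rng, 0.1) ** 2     # repulsion falls off with distance
        fx -= k * math.cos(bearing)      # push away from the obstacle
        fy -= k * math.sin(bearing)
    return math.atan2(fy, fx)            # steer toward the resultant

def drivegrip_finger(heading_rad, fingers=4, max_turn_rad=math.pi / 4):
    """Hypothetical mapping of turn severity to a finger index on the
    left or right glove (0 = index finger, fingers-1 = outermost)."""
    frac = max(-1.0, min(1.0, heading_rad / max_turn_rad))
    hand = "left" if frac > 0 else "right"   # assume +rad = left turn
    return hand, min(fingers - 1, int(abs(frac) * fingers))
```

The inverse-square repulsion means nearby cones dominate the steering command while distant ones barely perturb it, which is what makes potential fields attractive for reactive lane-keeping.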

     Additionally, we developed a prototype for a tactile map, conceptually similar to a high-resolution grid of refreshable braille. The map places an image of the surrounding environment literally in the hands of the driver. Similar to the tiny holes on an air hockey table, a physical map is generated by passing compressed air through small pixels to depict the obstacles detected by the LRF. This device, appropriately named AirPix, allows the driver to “see” the surroundings and safely navigate through them. The vibrotactile NVDIs are still necessary for redundancy, but using the driver’s high-bandwidth sense of touch through this tactile map frees data pathways for other driving uses, such as interacting with a GPS through voice-recognition software for higher-level path planning. Because the AirPix system is an informational interface with which the driver must interact quickly and efficiently in the fast-paced driving environment, we leveraged the high-speed capabilities of the cRIO's FPGA to process data and transmit the signals that trigger the output grid.
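Driving the AirPix grid amounts to rasterizing LRF returns into a small occupancy grid of air-jet pixels. Here is a minimal Python sketch of that idea; the grid dimensions, range, and frame convention (vehicle at the bottom-center, facing up the grid) are illustrative assumptions rather than the real AirPix layout.

```python
import math

def airpix_grid(scan, rows=8, cols=8, max_range=10.0):
    """Rasterize LRF returns (bearing_rad, range_m) into a tactile pixel
    grid: True = air jet on. Vehicle sits at the bottom-center of the
    grid, facing up. Dimensions are illustrative."""
    grid = [[False] * cols for _ in range(rows)]
    for bearing, rng in scan:
        if rng <= 0 or rng > max_range:
            continue                         # out of range: no pixel
        x = rng * math.sin(bearing)          # lateral offset
        y = rng * math.cos(bearing)          # forward distance
        col = int((x / max_range + 1) / 2 * (cols - 1) + 0.5)
        row = rows - 1 - int(y / max_range * (rows - 1) + 0.5)
        if 0 <= row < rows and 0 <= col < cols:
            grid[row][col] = True
    return grid
```

Because every scan refresh has to reach the driver's fingertips with minimal lag, this per-pixel mapping is exactly the kind of fixed-latency loop that suits the FPGA.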


Spin-Off Technologies and Future Plans


     In the months following the 2009-2010 academic year, the Virginia Tech Blind Driver Challenge team looks forward to working with the National Federation of the Blind to create a vehicle that can be driven from Baltimore to Orlando in the summer of 2011. The team will outfit a Ford Escape Hybrid with multi-plane and single-plane laser rangefinders, cameras, and our non-visual driver interfaces. Numerous blind people will be able to test our interfaces at the NFB National Convention this summer as well as during the vehicle's high-profile journey next summer. Whether it is their first time behind the wheel or a long-awaited reunion with an automobile, we hope their reactions will be overwhelmingly positive and filled with hope. The resulting national and international media coverage is raising tremendous awareness of the capabilities of the blind and generating interest in collaborative research and development of novel blind access technologies.

     Potential spin-off technologies were a major emphasis throughout the design process. If these devices prove sufficient to let a blind person drive a vehicle, we can only imagine the benefits for drivers who have low vision, are talking or texting on a cell phone, are drowsy, or are otherwise distracted on the road. We could create early warning and collision mitigation systems for all driving environments, especially in bad weather or low-visibility conditions.

     Beyond automotive applications, there is also potential for advances in haptic human interface devices, especially for blind pedestrians. The non-visual interfaces could be deployed in aircraft cockpits, where current technology relies heavily on the pilot's visual capabilities. Distributing high-bandwidth information from the saturated visual channel across the other senses would greatly increase a pilot's situational awareness, a critical aspect of operating any vehicle. The DriveGrip system could also be used to teach a blind person to play a musical instrument, and the AirPix interface could display tactile images for students in a classroom. Though we may not see blind drivers on the road for many years, the potential spin-off technologies are suitable for immediate use in countless applications.



VT BDC Photo:


628356-good-low qual.jpg


VT BDC Videos:



Video 1:  2009-2010 Virginia Tech Blind Driver Challenge




Video 2:  Virginia Tech Blind Driver Challenge on the CBS Early Show



Could you please share the VI code?


NI Employee (retired)

Hi tweet.  I don't believe the code is publicly available yet; however, the Hokuyo driver is now available on NI's Instrument Driver Network if you're interested in how the vehicle was able to 'see'.  Thanks for your interest!  Check out the latest with the Blind Driver Challenge here.

- Greg