

Lightweight Mobile Eye-Tracking System


Contact Information

University: Lomonosov Moscow State University, Physics Department

Team Member(s): Andrey Somov

Faculty Advisor(s): Associate Professor Pavel M. Mikheev

Email Address: somov@automationlabs.ru

Project Information

The goal of this project was to create a mobile eye-tracking system that precisely determines the point of gaze ("where we are looking") in real time. The initial purpose of the system was to measure and analyze a driver's attention distribution, but it can also be applied to product design, psychology, and even lie detection.

Products:

Software:

  • NI LabVIEW 2010 Professional
  • NI Vision 10.0

Hardware:

  • Glasses
  • Two analog micro cameras (9×9 mm, 450 TVL)
  • Two AverMedia USB TV Tuners

OR

  • NI PXI-1031DC 4-slot PXI Chassis
  • Two NI PXI-1411 1-channel image acquisition boards

The Challenge:

Most existing eye-tracking devices are either costly or bulky. Some of them require the user to wear a heavy helmet; others even require the head to be fixed in place. Obviously, such devices are inconvenient and cannot be used while driving a car. Therefore, our challenge was to design a lightweight eye-tracking system and develop an effective gaze-point calculation algorithm.

The Solution:

The general idea of this project is to acquire a high-quality video image of the user's eye along with a video image of the field of view. The eye image is analyzed and the relative coordinates of the pupil are calculated. The point of gaze in the second image is then evaluated using calibration data.

The image of the eye needs to have high contrast regardless of lighting conditions, which is why we used infrared illumination and embedded an IR light filter in the "eye" camera. To mount the two micro video cameras on the user's head we used ordinary glasses: we removed the lenses, attached the two analog cameras along with an infrared LED, and hid all the wiring in the spectacle frame (Figs. 1, 2).

Fig. 1.

Fig. 2. The eye tracker and Andrey Somov.

The video signal can be transmitted over a radio channel or wired directly to an image acquisition board. The images are then processed in real time on a laptop.

Eye-detection algorithm

When designing the eye-tracking algorithm, we took advantage of the NI Vision Development Module.

The first step of the image processing is the IMAQ Local Threshold function, which applies an adaptive threshold to a grayscale image. This function takes a lot of processor time to execute, and the larger the image, the more time it requires. That is why the image resolution is reduced by a factor of four before the threshold is applied.
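Since a LabVIEW block diagram cannot be shown as text, here is a rough Python/OpenCV sketch of this step. It is a sketch only: cv2.adaptiveThreshold stands in for IMAQ Local Threshold, and the block size and offset are illustrative assumptions rather than values from the original project.

    import cv2

    def threshold_downscaled(gray, factor=4):
        # Downscale first: the local threshold is the slow step, so a
        # four-times-smaller image cuts its cost considerably.
        small = cv2.resize(gray,
                           (gray.shape[1] // factor, gray.shape[0] // factor),
                           interpolation=cv2.INTER_AREA)
        # Adaptive threshold: each pixel is compared with the mean of its
        # neighborhood, which tolerates uneven IR illumination.
        return cv2.adaptiveThreshold(small, 255,
                                     cv2.ADAPTIVE_THRESH_MEAN_C,
                                     cv2.THRESH_BINARY_INV, 31, 10)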

The second step is to apply a particle filter to distinguish the pupil from other dark particles and estimate its center of mass. This measurement is imprecise due to the reduced resolution. Moreover, the center of mass of the detected particle does not always match the real center of the pupil because of the glare of the IR illumination and other lighting effects (Fig. 3).

Fig. 3. The glare greatly affects the estimated center of the pupil.
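A rough Python/OpenCV sketch of the particle-filtering step above, assuming connected-components analysis as a stand-in for the NI Vision particle filter; the area and aspect-ratio limits are illustrative assumptions:

    import cv2

    def pupil_centroid(binary, min_area=50, max_area=5000):
        # Label the connected dark blobs in the thresholded image.
        n, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
        best, best_area = None, 0
        for i in range(1, n):                 # label 0 is the background
            area = int(stats[i, cv2.CC_STAT_AREA])
            w = int(stats[i, cv2.CC_STAT_WIDTH])
            h = int(stats[i, cv2.CC_STAT_HEIGHT])
            # Crude shape test: a pupil blob is roughly as wide as tall.
            if min_area <= area <= max_area and 0.5 <= w / h <= 2.0 \
                    and area > best_area:
                best, best_area = i, area
        # The centroid is in downscaled coordinates; scale it back up
        # before the high-resolution refinement step.
        return tuple(centroids[best]) if best is not None else None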


The third processing operation determines the edges of the pupil and fits them with an ellipse. To do so, we apply the IMAQ Spoke 3 function to the original high-resolution image. This function finds contrast edges along radial lines that diverge from the estimated center of the pupil. We then use 50 to 70 edge points to find the ellipse that best represents the set of points. The center of this ellipse is taken to be the actual center of the pupil (Fig. 4). As a bonus, we can calculate the ellipse axes and the pupil area, which are useful data for lie detection.

Fig. 4. The final image. The edges found by IMAQ Spoke are the small green dots; the red dot is the center of the fitted ellipse. The glare does not affect its location.
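For illustration, here is a rough Python/OpenCV sketch of a Spoke-style radial edge search followed by an ellipse fit. The ray count, search radius, and edge threshold are assumptions, and cv2.fitEllipse stands in for the NI Vision ellipse fit:

    import numpy as np
    import cv2

    def refine_pupil(gray, cx, cy, n_rays=64, max_r=80, edge_jump=25):
        # Walk outward along radial rays from the rough center and keep
        # the first strong dark-to-bright transition on each ray.
        pts = []
        for theta in np.linspace(0.0, 2 * np.pi, n_rays, endpoint=False):
            dx, dy = np.cos(theta), np.sin(theta)
            prev = int(gray[int(cy), int(cx)])
            for r in range(2, max_r):
                x, y = int(cx + r * dx), int(cy + r * dy)
                if not (0 <= x < gray.shape[1] and 0 <= y < gray.shape[0]):
                    break
                val = int(gray[y, x])
                if val - prev > edge_jump:    # pupil boundary found
                    pts.append((x, y))
                    break
                prev = val
        if len(pts) < 5:                      # fitEllipse needs >= 5 points
            return None
        (ex, ey), (major, minor), _ = cv2.fitEllipse(np.array(pts, np.float32))
        # Return the ellipse center and axes; the axes also give the
        # pupil area (pi * major * minor / 4) used for lie detection.
        return (ex, ey), (major, minor)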


Calibration method

In order to know precisely what a subject is looking at, a calibration procedure is required in which the subject looks at a series of points while the eye tracker records the relative pupil coordinates that correspond to each gaze position. Obviously, the more points we record, the more accurate the calibration we can obtain. To make the calibration process easy, fast, and automated, a special calibration algorithm has been developed.

All you need to do to perform the calibration procedure is stare at a special template and turn your head from side to side for about a minute. The template is a black circle with a small white dot in the center. While you look at the dot and turn your head to different angles, the eye tracker automatically detects the template and matches its center to the current pupil position. This algorithm obtains several hundred calibration points per minute (Video 1).

Video 1. The calibration procedure.
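The original template detector is not described in detail; the sketch below is one plausible reading, using OpenCV's Hough circle transform to find the black disc and a brightness check for the white center dot. All thresholds and radii are illustrative guesses:

    import cv2
    import numpy as np

    def find_template(scene_gray):
        # Look for circular candidates in the scene-camera frame.
        circles = cv2.HoughCircles(scene_gray, cv2.HOUGH_GRADIENT,
                                   dp=1.5, minDist=100, param1=100,
                                   param2=40, minRadius=10, maxRadius=120)
        if circles is None:
            return None
        h, w = scene_gray.shape
        for x, y, r in np.round(circles[0]).astype(int):
            if not (0 <= y < h and 0 <= x < w):
                continue
            roi = scene_gray[max(y - r, 0):y + r, max(x - r, 0):x + r]
            # Accept a circle that is dark overall (the black disc) but
            # bright at its very center (the small white dot).
            if roi.size and roi.mean() < 100 and scene_gray[y, x] > 180:
                return (x, y)
        return None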


Polynomial interpolation is then used to calculate the point of gaze for every possible pupil position. If the calibration procedure is performed carefully, a precision of 0.5 degrees can be achieved. Below you can find a video we recorded for our colleagues from the Moscow State Automobile and Road Technical University (Video 2). They intend to use our project to examine drivers' attention distribution.

Video 2. Watch it on YouTube for high resolution.
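The exact polynomial form is not specified in the text; the sketch below assumes a common choice, a second-order polynomial in the pupil coordinates fitted by least squares, with NumPy standing in for LabVIEW's built-in linear algebra tools:

    import numpy as np

    def fit_gaze_map(pupil_xy, gaze_xy):
        # Design matrix with second-order terms: 1, x, y, xy, x^2, y^2.
        px, py = pupil_xy[:, 0], pupil_xy[:, 1]
        A = np.column_stack([np.ones_like(px), px, py,
                             px * py, px ** 2, py ** 2])
        # One least-squares fit per gaze axis (gaze_xy has two columns).
        coeff, *_ = np.linalg.lstsq(A, gaze_xy, rcond=None)
        return coeff                          # shape (6, 2)

    def map_gaze(coeff, px, py):
        # Apply the fitted polynomial to a new pupil position.
        feats = np.array([1.0, px, py, px * py, px ** 2, py ** 2])
        return feats @ coeff                  # predicted (gaze_x, gaze_y)

With several hundred calibration points collected per minute, the fit is heavily overdetermined, which makes the least-squares solution robust to individual noisy samples.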


Why LabVIEW

The easiest decision we made while working on this project was which programming environment to use. NI LabVIEW allows extremely fast programming and easy modification of complicated code, and integration with vision acquisition hardware is not a problem. We took advantage of the advanced built-in tools for linear algebra calculations, vision processing, and more. Having all those complex algorithms on my "Favorites" palette really makes me feel the power at my fingertips. LabVIEW also gives outstanding opportunities to quickly create user interfaces.

Comments
LPS
NI Employee (retired)

Hello there,

 

Thank you so much for your project submission to the NI LabVIEW Student Design Competition. It's great to see your enthusiasm for NI LabVIEW! Make sure you share your project URL with your peers and faculty so you can collect votes for your project and win. Collecting the most "likes" gives you the opportunity to win cash prizes for your project submission. If you or your friends have any questions about how to go about "voting" for your project, tell them to read this brief document (https://forums.ni.com/t5/Student-Projects/How-to-Vote-for-LabVIEW-Student-Design-Projects-doc/ta-p/3...). You have until July 15, 2011 to collect votes!

 

I'm curious to know, what's your favorite part about using LabVIEW and how did you hear about the competition? Great work!!

 

Good Luck, Liz in Austin, TX.
