

A Webcam-based Point-of-gaze Tracking System for Hand-free Board Gaming - Student Design Competition 2013

Contact Information

University: Tsinghua University, Beijing, P.R. China

Team Members (with year of graduation):

Taoyuanmin Zhu (2015);

Qinyi Fu (2015);

Haotian Cui (2015);

Yiyao Sheng (2014);

Faculty Adviser: Shuangfu Suo

Email Address: ztymyws@gmail.com

Submission Language: English

Project Information

Title: A Webcam-based Point-of-gaze Tracking System for Hand-free Board Gaming

Description:

In the present project, we developed a simplified and economical method of achieving eye-controlled board gaming that is entirely hands-free and "eyes-only". Using a webcam to locate the player's eyes relative to the gaming platform (i.e. the game board), the system tracks the player's point of gaze and moves the physical on-board pieces in accordance with the movement of the user's gaze.


Products

LabVIEW 2011

LabVIEW NI Vision Development Module 2011

LabVIEW NI Vision Acquisition 2011

Other Hardware:

Logitech C270 USB webcam

STC89C51 microcontroller

Stepper motor controller

OSRAM infrared LED

We strongly recommend watching our demonstration video for a better understanding of our work.

- Demonstration video: http://www.youtube.com/watch?v=FQ-CUcy_1TY (a Youku mirror is available for viewers in mainland China)

- Concept video: http://www.youtube.com/watch?v=tfX_2p-JUSo (a Youku mirror is available for viewers in mainland China). Please note that not all the functions shown in the concept video have been implemented.

I. Introduction

The user interface (UI) has undergone tremendous changes over the past half-century. From the non-interactive batch interfaces that emerged in the middle of the 20th century, to the command-line interface built around the keyboard, to the graphical user interface (GUI) that, with the invention of the mouse, finally allowed users to interact with electronic devices through images rather than text commands, early types of user interfaces paved the way for simpler, more efficient and more user-friendly methods of human-machine interaction. More recently, the touch user interface, a type of GUI that accepts input from fingers or a stylus, has been increasingly used in mobile devices and many other kinds of machines. Other emerging types of user interfaces, such as the gesture interface and the kinetic user interface, have proved successful in their preliminary stages in freeing users from external instruments and improving the effectiveness and naturalness of human-machine interaction, with a promising future.

An exceedingly exciting form of interaction between humans and machines is, however, achieved through eye tracking. With such interface technologies, users can control a system by merely looking, and the computer reacts according to their eye position or eye movement without explicit commands. The idea is exciting in the sense that no equipment need be in physical contact with the user any longer, and that the user's hands are, for the first time, free from the clicking, pressing, touching and waving traditionally required in human-machine interaction.

The primary interest of the present project is to achieve board games controlled entirely by the user's eye movement. More specifically, by tracking the player's point of gaze, the computer reacts with the physical movement of material objects in the board game (e.g. the chess pieces). As a result, the board game arrives at what we call hands-free, eyes-only gaming.

II. System framework

The project consists of two steps:

    1) Calculating the point-of-gaze from images captured by the webcam

We used a fixed webcam to detect the coordinates of the eyes. To obtain depth information, we added two fixed infrared LEDs. By detecting the glints of the LEDs on the cornea, the 3D coordinates of the pupil were obtained. We also built a model for solving the point-of-gaze on the board, which is discussed in the next section.
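The role the two glints play can be sketched in Python (the actual implementation is LabVIEW graphical code). The feature below is the common pupil-center/corneal-reflection (PCCR) vector; normalizing by the glint separation compensates for head translation and distance, which is what the fixed LEDs buy over a bare webcam. Whether this matches the exact formulation in the attached documents is an assumption.

```python
# Illustrative sketch: the normalized pupil-glint vector used by many
# single-camera gaze trackers. All inputs are 2-D image coordinates (pixels).

def gaze_feature(pupil, glint_left, glint_right):
    """Return the pupil offset from the glint midpoint, scaled by the
    glint separation so the feature is largely invariant to head
    translation and viewing distance."""
    mx = (glint_left[0] + glint_right[0]) / 2.0
    my = (glint_left[1] + glint_right[1]) / 2.0
    sep = ((glint_right[0] - glint_left[0]) ** 2 +
           (glint_right[1] - glint_left[1]) ** 2) ** 0.5
    return ((pupil[0] - mx) / sep, (pupil[1] - my) / sep)

# A pupil exactly between the two glints yields the zero feature:
# gaze_feature((110, 100), (100, 100), (120, 100)) -> (0.0, 0.0)
```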

    2) Moving the pieces using a physical platform

We installed a two-axis moving platform underneath the board. To move the pieces, magnets were mounted on both the platform and the pieces. Apart from dragging the pieces with magnets, we also used a magnetic levitation system for better appearance.

III. Modeling

To minimize the cost, we used a single webcam with two additional LEDs to determine the point of gaze. The simplified model is as follows:

Figure 3.1 The simplified model

For the detailed modeling process, please refer to the documents attached to this post.

To fix the parameters in the expressions, we collected different sets of variables while the user looked at given points. Using regression methods, the 6 parameters were fitted. Residuals when looking at different points can be seen in figure 3.2.


Figure 3.2 Residuals when looking at different points


As can be seen from the figure, the deviation from the real data can be up to 150 mm, which is too large for point-of-gaze recognition. We tried different regression methods and finally obtained better parameters, limiting the maximum deviation to 50 mm.


Figure 3.3 Residuals of first attempt                   Figure 3.4 Residuals of second attempt


Figure 3.5 Residuals of final attempt

Though the deviations are still noticeably large (36.9 mm), the accuracy is sufficient for applications such as board games.

The model was built assuming that the eyeballs are ideal spheres and that the LED light sources are perfectly aligned with the webcam. These assumptions may also introduce modeling errors.
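The calibration regression above can be illustrated with a least-squares fit. One mapping with exactly six parameters is an affine map from the eye feature to board coordinates; whether the expressions in the attached documents are affine is an assumption, so treat this as a sketch of the fitting procedure rather than the actual model.

```python
import numpy as np

def fit_gaze_map(features, targets):
    """Least-squares fit of [X Y] = [fx fy 1] @ P, where P is a 3x2 matrix
    (6 parameters total). `features` is N x 2 eye features, `targets` is
    N x 2 board coordinates of the calibration points."""
    A = np.hstack([features, np.ones((len(features), 1))])
    P, *_ = np.linalg.lstsq(A, targets, rcond=None)
    return P  # shape (3, 2)

def predict_gaze(P, feature):
    """Map one eye feature to a predicted point of gaze on the board."""
    fx, fy = feature
    return np.array([fx, fy, 1.0]) @ P

# Residuals like those in figures 3.2-3.5 are simply the distances
# |predict_gaze(P, f) - target| over the calibration points.
```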

IV. Eye-tracking algorithm

Eye tracking was achieved by the following methods:

1) Images with a resolution of 1280×960 were grabbed through the USB port at 15 fps. The webcam was placed close to the participant's face for better resolution.

Figure 4.1 Step 1

2) The color of the skin is quite different from the background (partly due to the IR light emitted by the LEDs). Thresholds were set on both hue and lightness to distinguish the face from the background.

Figure 4.2 Step 2

3) After locating the face, shape matching was applied to find the coordinates of the pupils. In earlier versions of our project, pupils were detected by a simple threshold on darkness. We found that, in some cases, it was hard to distinguish pupils from nostrils. By applying shape matching, this defect was eliminated and only the round-shaped pupils were detected.

Figure 4.3 Step 3

4) The last step is to collect the coordinates of each pupil and of the glints of the IR LEDs. These points can be easily tracked by their brightness. Finally, we obtained 3 coordinates (6 variables) for further point-of-gaze calculations.

Figure 4.4 Step 4
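The shape-matching idea in step 3, keeping only round blobs so nostrils are rejected, can be sketched with a circularity measure (Python for illustration; the actual implementation uses the NI Vision shape-matching tools, and the threshold below is illustrative, not our tuned value).

```python
import math

def is_pupil_candidate(area, perimeter, min_circularity=0.8):
    """Accept a blob as a pupil candidate if it is sufficiently round.
    circularity = 4*pi*area / perimeter^2 equals 1.0 for a perfect circle
    and drops toward 0 for elongated shapes such as nostrils."""
    circularity = 4.0 * math.pi * area / perimeter ** 2
    return circularity >= min_circularity

# A circle of radius 10 (area ~314.2, perimeter ~62.8) has circularity 1.0
# and passes; an elongated blob (area 200, perimeter 90) scores ~0.31 and
# is rejected.
```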

V. Implementation of the platform for moving pieces

The platform hidden under the board is capable of planar motion with two degrees of freedom. The two-axis moving platform is driven by two stepper motors controlled by an MCS-51 single-chip microcontroller (SCM). The SCM is connected to the computer via a serial port. The SCM program flowchart is as follows.

Figure 5.1 SCM program flowchart (for the x axis)

Figure 5.2 The moving platform
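The PC sends motion commands to the SCM over the serial port. The exact byte protocol is not described in this post; the frame below (header byte, axis, signed 16-bit step count, 8-bit checksum) is a hypothetical example of how such a command could be encoded on the PC side before being written to the port (e.g. with pyserial's Serial.write).

```python
import struct

HEADER = 0xAA  # hypothetical start-of-frame marker

def encode_move(axis, steps):
    """Encode a move command for the SCM: axis 0 (x) or 1 (y), and a
    signed 16-bit step count (negative = reverse direction).
    Frame layout: [header][axis][steps_lo][steps_hi][checksum]."""
    body = struct.pack('<Bh', axis, steps)   # axis byte + little-endian int16
    checksum = (HEADER + sum(body)) & 0xFF   # simple additive 8-bit checksum
    return bytes([HEADER]) + body + bytes([checksum])

# encode_move(0, 100) -> b'\xaa\x00\x64\x00\x0e' (move x axis 100 steps)
```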

VI. Game process

  1. Huarong Dao

Huarong Dao (also called “klocki”) is a sliding block puzzle. It is based on a fictitious story in the historical novel Romance of the Three Kingdoms about the warlord Cao Cao retreating through Huarong Trail after his defeat at the Battle of Red Cliffs in the winter of 208/209 CE during the late Eastern Han Dynasty. He encountered an enemy general, Guan Yu, who was guarding the path and waiting for him. Guan Yu spared Cao Cao and allowed the latter to pass through Huarong Trail on account of the generous treatment he received from Cao in the past. The largest block in the game is named "Cao Cao". The player is not allowed to remove blocks, and may only slide blocks horizontally and vertically.

Figure 6.1 The traditional wooden game Huarong Dao

Step 1. The player steps into the sight of the webcam and the point of gaze is captured.

Step 2. After the player gazes at a block for more than 2 seconds, the block is captured by the magnet mounted on the moving platform.

Step 3. After the player gazes at a vacant position for more than 2 seconds, the block is dragged to that position by the magnet.

Step 4. Repeat steps 2 and 3 until "Cao Cao" is moved to the exit.
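The 2-second dwell selection in steps 2 and 3 can be sketched as follows (Python for illustration; timestamps in seconds, coordinates and radius in board units, values illustrative).

```python
DWELL_TIME = 2.0  # seconds the gaze must stay on a target to select it

def dwell_select(samples, target, radius):
    """samples: list of (t, x, y) gaze points in time order.
    Returns True once the gaze has stayed within `radius` of `target`
    for at least DWELL_TIME seconds without leaving."""
    start = None
    for t, x, y in samples:
        inside = (x - target[0]) ** 2 + (y - target[1]) ** 2 <= radius ** 2
        if inside:
            if start is None:
                start = t      # gaze just arrived on the target
            if t - start >= DWELL_TIME:
                return True    # dwell long enough: select the target
        else:
            start = None       # gaze left the target: reset the timer
    return False
```

In the running system this check would be applied to each block and each vacant position in turn, alternating between the "capture" and "drop" states of steps 2 and 3.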

VII. Features

i. Originality


The originality of the present project is embodied in:

  1. The use of two additional IR LEDs, which provide corneal glints as reference points and greatly reduce the difficulty of gaze tracking, with few requirements on either hardware or software.
  2. Real-time interaction between the user’s intention, the virtual digital world and the material objects.
  3. Experimental trial on “immersive” interfaces where humans’ involvement is completely natural, intuitive and free of equipment physically attached to them.

ii. Advantages


Compared to existing technologies and products, the present project enjoys several advantages:

  1. Simplicity: the whole system requires only one webcam and two LEDs, making it a much simpler way of tracking eye movement.
  2. Low cost: unlike the eye-tracking equipment currently on the market, which generally costs tens of thousands of RMB, the device in the present project drastically reduces the cost to around 200 RMB by using an ordinary webcam with two LEDs.
  3. Universality: the gaze-tracking device involves simple measurement of the point of gaze with no specific requirements on the users.
  4. Easy popularization: the project is highly adaptable in the sense that the techniques involved place no particular requirements on either hardware or software. Similar methods can be readily applied to mobile devices, video games and many other types of machines.

VIII. Future outlook

The idea and technology presented in the project can be utilized in a variety of fields beyond the entertainment purpose the device was originally designed for.

  1. Apart from all sorts of board games the present project can be applied to, such gaze-tracking technologies can help computer and video game players, especially in shooting games, experience better virtual reality.
  2. In a wider range of fields, gaze-tracking technologies allow hands-free participation in human-machine interaction and can well replace the mouse, touchscreen, touchpad and other input methods. The potential applications thus go beyond gaming to include many other gaze-controlled machines, such as eye-ordering vending machines and intelligent bookshelves. The technology can also assist people with disabilities who have lost the use of their hands.
  3. Porting the point-of-gaze tracking method to mobile devices such as smartphones also has a promising future: most smartphones nowadays are already equipped with cameras suitable for eye tracking.
  4. Moreover, gaze-tracking technologies can be used in a wide variety of disciplines, notably cognitive psychology, human-computer interaction, marketing research and medical research (neurological diagnosis). Specific applications include tracking eye movement in reading, commercial eye tracking, communication systems for the disabled and so on.

IX. Pictures

Figure 9.1 Moving platform and pieces for the Chinese board game "Huarong Dao"

Figure 9.2 USB webcam and infrared LEDs

Figure 9.3 Computer running the LabVIEW program

Figure 9.4 System running with voice prompt 1

Figure 9.5 System running with voice prompt 2

Figure 9.6 Real-time point-of-gaze tracking (notice that the piece is levitated)
