
Eye Tracking Applications

David Scully
University of Limerick, Castletroy, Limerick, Ireland
0748323@studentmail.ul.ie

James Conway
University of Limerick, Castletroy, Limerick, Ireland
0863513@studentmail.ul.ie

ABSTRACT
Over the years, there have been many innovations in the field of human-computer interaction that may have seemed like magic at some point in time, but we as a technologically savvy populace survey the constant stream of new commercial applications with a practical and skeptical eye. However, the advent of usable eye tracking software stands out to some as a level of scientific achievement that borders on the supernatural. This paper aims to quantify the direct effect of its availability to developers, consumers and commercial enterprises.

Keywords
Eye tracking, gaze detection, gesture tracking

1. INTRODUCTION

This paper explores modern eye tracking technology and seeks to evaluate the scope of its potential applications. It outlines the historical chain of events that has led to the recent prevalence of research in this area. This is followed by an examination of pioneers in eye tracking research and the technologies and techniques they employed. A series of case studies illustrates the range of purposes to which this burgeoning technology can be applied. Finally, the implications of imminent applications of this technology for the future of human-computer interaction, as well as the benefits offered to differing demographics, are discussed.

2. BACKGROUND

This section provides an overview of the developments in the study of eye tracking from its earliest manifestations to its current state, incorporating the methodologies and processes used.

2.1 Brief History

The history of eye tracking and the study of eye movements predates the use of personal computers by almost 100 years. However, the initial methods used were quite invasive. In 1879, the French ophthalmologist Louis Emile Javal and fellow researchers used direct mechanical contact with the cornea to reflect the eye movement [15]. In the 1930s Miles Tinker, a renowned expert on the legibility of print, and his colleagues applied photographic techniques to study eye movements in reading. They varied the typeface, print size, page layout, etc. and studied the resulting effects on reading speed and eye movement patterns. In 1947 Paul Fitts et al. began using motion picture cameras to study the movements of pilots' eyes whilst they were using the controls and instruments in the cockpit to land a plane [11]. This study can be seen as the earliest application of eye tracking to enhance usability and improve product design. The year 1948 saw the first head-mounted eye tracker, invented by Hartridge and Thompson. Although crude in design, it was the first step toward freeing the user from the tight constraints on head movement that previous methods required [9]. Studies in eye movement and eye tracking saw great advances in the 1970s as research in psychological theory used eye tracking data to identify cognitive processes [15]. With the boom of research in the fields of psychology and physiology during this decade, studies into how the human eye operates were conducted to determine a relationship to perceptual and cognitive processes. The 1970s also saw many technical improvements to increase accuracy and precision and to reduce the obtrusiveness of tracking devices. Senders noted that eye tracking technology appears every decade to help solve new problems [6]. The 1990s brought technological advances such as the Internet, email and video conferencing, and researchers again turned to eye tracking to help understand and develop questions about usability and to explore its potential as an input device.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. NIME'12, May 21-23, 2012, University of Michigan, Ann Arbor. Copyright remains with the author(s).

2.2 Technology and Techniques

Tracking eye movements can be accomplished using several techniques. The most common of these is the video-based, non-contact, optical method: infrared light reflected by the retina is monitored by a video camera and analyzed to assess eye rotation. Another type of eye tracking, known as electrooculography (EOG), requires the use of electrodes around the eyes to monitor the electric potential field that travels between the eyes and the brain. The benefit of this method is that it does not require the eyes to be open, allowing subjects' eyes to be tracked while asleep [10]. Other methods use special contact lenses fitted to the iris and pupil. These lenses can contain magnetic sensors or mirrors and can be tracked by analysing video or magnetic sensor data [13]. Methods of eye tracking that require electrodes or fitted lenses are more invasive for the user and are thus unlikely to become a common technology in personal devices in the near future. To apply eye tracking data to the control of devices or applications, a subject's head must be stationary or the device itself must be head-mounted. Another more affordable, but less accurate, technique that allows a user to control a device with their eyes is gesture tracking [3]. Using a video camera, a device can detect deliberate facial and eye movements and link these movements to actions or tasks in a piece of software. This technique does not necessarily require the subject's head to be stationary or the device to be head-mounted, but it does not provide precision tracking data. For this reason, head-mounted methods are more common in the fields of marketing and advertising research [3].
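The final step in any of these video-based pipelines is turning tracked pupil coordinates into on-screen gaze coordinates, usually via a short calibration in which the user fixates a few known targets. The sketch below is illustrative only: the function names and the simple affine model are our assumptions, not any particular system's method (real trackers typically use higher-order polynomial or homography mappings with glint normalisation). It fits a pupil-to-screen map by least squares, then estimates gaze points with it.

```python
def _solve3(m, v):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with pivoting."""
    a = [row[:] + [v[i]] for i, row in enumerate(m)]  # augmented matrix
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        for r in range(3):
            if r != col:
                f = a[r][col] / a[col][col]
                a[r] = [x - f * y for x, y in zip(a[r], a[col])]
    return [a[i][3] / a[i][i] for i in range(3)]

def fit_affine(pupil_pts, screen_pts):
    """Least-squares fit of screen ~ c0 + c1*px + c2*py, per screen axis."""
    X = [[1.0, px, py] for px, py in pupil_pts]
    # normal equations: (X^T X) c = X^T y
    xtx = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    coeffs = []
    for axis in range(2):
        y = [p[axis] for p in screen_pts]
        xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
        coeffs.append(_solve3(xtx, xty))
    return coeffs

def gaze_point(coeffs, pupil_xy):
    """Map one tracked pupil position to an estimated screen position."""
    px, py = pupil_xy
    return tuple(c[0] + c[1] * px + c[2] * py for c in coeffs)
```

With four corner targets as calibration data, `fit_affine` recovers the mapping exactly when the true relation is affine; in practice more targets are used and the residual error indicates calibration quality.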

3. CASE STUDIES

The following is an examination of a shortlist of eye tracking applications and systems.

3.1 openEyes

openEyes is an open-source, head-mounted eye tracking system [12]. The goal of this project was to develop a system using cheap off-the-shelf products and open-source technologies and algorithms. The developers hoped "to motivate and enable designers to explore the potential of eye movements for improving interface design and that this will lead to an increased role for eye tracking in the next generation human computer interfaces" [12]. The project was completed over four generations, and the final product could be considered a spiritual predecessor to Google's Project Glass. The device is mounted on a pair of glasses and links to a desktop or laptop computer that runs the software necessary to apply the eye tracking data to actions and complete tasks. The openEyes system uses infrared video eye tracking, as infrared imaging "eliminates uncontrolled specular reflection by actively illuminating the eye with a uniform and controlled infrared light not perceivable by the user" [12]. This reduces the negative effects of the uncontrolled nature of ambient light and therefore gives the device greater versatility for use in a variety of real-world situations. This technology is similar to that used in Google's glasses. The openEyes developers admitted that completing tasks with their device proved difficult, as it lacked the ability to track a user's head movements and thus could not interact with external devices without the use of a magnetic or optical tracker. Google's Project Glass has set about solving this issue by integrating magnetic trackers and accelerometers. The openEyes project was truly a success in motivating developers of consumer electronics to integrate eye tracking into the emerging, cutting-edge devices of today that future generations will take for granted.

3.2 The EyeHarp

The EyeHarp is a new musical instrument which uses gaze and eye tracking data as its system of control, with an emphasis on low cost and ease of construction [14]. The eye tracking device used is a low-cost design based on the work of the EyeWriter project [2], whose website offers clear instructions for building such a device. The tracking software accesses the libraries developed by the EyeWriter project, which were released as open source using openFrameworks. The eye tracking device uses infrared LEDs to illuminate the eye and create a dark-pupil effect. The video camera then tracks the position of the pupil and uses a calibration sequence to map the tracked eye/pupil coordinates to positions on the computer screen [14]. This system requires calibration upon first use: the subject is asked to focus on a sequence of points on the screen while the corresponding positions of the pupil are recorded. The user can then interact with the computer using their eyes and begin to create music through the musical interface. The authors' goal was to create "a real musical instrument with the same expressive power as traditional musical instruments" [14] which would be suitable for performing in a band or as a standalone composition tool. To achieve this goal, the following tools and design features were implemented:

- A multi-layer step sequencer. This allows for the creation of a rhythm section on a separate layer to the melodic accompaniment, each with its own controls.
- Real-time control over rhythmic, harmonic and melodic aspects of the composition. The performer should also have control over articulation such as glissando and vibrato.
- Big buttons in the user interface. This is an important factor when designing UIs that use eye tracking as a pointer; in the case of the EyeHarp, large buttons reduce the possibility of accidentally playing neighbouring notes, a likely occurrence given the jittery nature of the eye tracking system [14].

This project allows disabled users such as quadriplegics to create music using their eyes. They can communicate with a melody and step sequencer interface and control different sound settings and musical events using eye movements. Like its predecessor, the EyeWriter project, the EyeHarp gives artists with paralysis the opportunity to express themselves through music. All of the project's code is open source, allowing for further development and personalisation.

3.3 Fixational

Fixational is an Irish software development company that specialises in gesture control technologies for iOS. Their flagship product is Wink Camera, a wink-triggered camera app for iPhone and iPad [3]. The engine that powers this application was developed by a team of two developers at Fixational. It is designed to operate in the background of iOS applications and observe the user through the front-facing camera, looking for deliberate eye gestures such as winks, which can be linked to any app function [3]. This software was developed prior to the recent hype over eye-gesture controlled applications surrounding this year's mobile device market. Ronan O'Malley, CEO of Fixational, originally intended to develop eye tracking software that would utilize a user's web-cam in advertising and marketing research. This software would provide advertisers with useful eye tracking information from a consenting participant, a method much quicker and cheaper than the more complex techniques widely used in advertising and marketing research. In 2011 the purpose of the software changed from research to control, and the developers began working towards an engine for eye-gesture controlled applications. This software differs from our other case studies in several ways. It was designed as a product for consumers rather than a tool for research, and it uses different methods of control: it gleans little control data from eye tracking, instead recognising deliberate eye gestures, detected by tracking the user's gaze and facial movements, and assigning them predetermined functions [3].
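Fixational's engine is closed-source and its algorithm is not described in the paper, but the deliberate-gesture idea can be sketched. Assuming a hypothetical upstream face tracker that reports a per-eye "openness" value in [0, 1] each frame (the thresholds and frame count below are illustrative assumptions, not Fixational's values), a wink can be distinguished from an ordinary two-eye blink by requiring one eye held closed while the other stays open for several consecutive frames:

```python
CLOSED = 0.25   # openness below this counts as a closed eye (assumed)
OPEN = 0.6      # openness above this counts as an open eye (assumed)
MIN_FRAMES = 3  # frames the asymmetry must persist to count as deliberate

def detect_winks(frames):
    """frames: iterable of (left_openness, right_openness) per video frame.
    Returns a list of (frame_index, 'left' | 'right') wink events."""
    events, run_side, run_len = [], None, 0
    for i, (left, right) in enumerate(frames):
        if left < CLOSED and right > OPEN:
            side = "left"
        elif right < CLOSED and left > OPEN:
            side = "right"
        else:
            side = None   # both open, both closed (a blink), or ambiguous
        if side is not None and side == run_side:
            run_len += 1
            if run_len == MIN_FRAMES:   # fire once per sustained wink
                events.append((i, side))
        else:
            run_side, run_len = side, (1 if side is not None else 0)
    return events
```

Because a blink closes both eyes at once, it never satisfies the one-closed/one-open condition, which is the property that makes winks usable as deliberate control gestures.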

3.4 Tobii

Tobii Technology is a company based in Stockholm, Sweden, and a world leader in eye tracking and gaze interaction. Tobii's products are widely used in various fields of research. At the time of writing, they provide market-leading eye tracking technology to industrial partners in areas such as consumer electronics, gaming, vehicle safety and medical diagnostics. With products like the Tobii PCEye, they offer users with disabilities interaction with a computer and enhanced communication with others. The PCEye comes with the award-winning Gaze Selection software, which makes it possible to control the desktop or any other application. Users benefit from more relaxed access to the computer, as the software incorporates a two-step process which makes navigation more intuitive: "Gaze Selection uses a unique zoom functionality that gives pixel-precise control of where the user points, clicks and drags, enabling full computer access" [1]. Tobii Technology has also announced the first video-based eye tracker to be fully integrated into an ordinary glasses frame. The system "consists of a pair of glasses and a small, pocket-worn device for video processing and data collection" [7] (see Figure 1). This form of wearable technology will enable one of the most cost-effective eye tracking systems, aiding researchers in a variety of industries and helping organizations gain a better insight into the preferences, reactions and personal experiences of people in mobile and natural settings. The glasses are designed to resemble common modern eyewear in the hope that wearers will use them on a daily basis. Tobii has made considerable progress in making wearable eye tracking a feasible and unobtrusive means of gathering gaze data, and its glasses could be seen as another predecessor to Google's Project Glass.
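The two-step "zoom to select" idea behind Gaze Selection can be illustrated with a small sketch. The real product's internals and parameters are not public, so everything below is an assumed model: a first, coarse fixation chooses a small region of the screen; that region is then magnified to fill the screen, so the same angular gaze accuracy resolves a proportionally finer target on the second fixation.

```python
ZOOM = 8  # magnification factor for the second step (assumed value)

def zoom_window(coarse_xy, screen_w, screen_h):
    """Region around the first fixation that will be magnified,
    clamped so it stays inside the screen. Returns (x, y, w, h)."""
    w, h = screen_w / ZOOM, screen_h / ZOOM
    x = min(max(coarse_xy[0] - w / 2, 0), screen_w - w)
    y = min(max(coarse_xy[1] - h / 2, 0), screen_h - h)
    return (x, y, w, h)

def refine(window, zoomed_xy, screen_w, screen_h):
    """Map a fixation made in the magnified view back to true screen
    pixels; gaze error is divided by ZOOM in the process."""
    x, y, w, h = window
    return (x + zoomed_xy[0] / screen_w * w,
            y + zoomed_xy[1] / screen_h * h)
```

A tracker with, say, 40-pixel accuracy on the full screen thus achieves roughly 5-pixel effective accuracy after one zoom step, which is the sense in which the two-step process yields "pixel-precise" pointing.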

4. DISCUSSION

This section will classify three distinct areas of application in the field of eye tracking.

4.1 Disabilities

One of the key areas primed to benefit from the development of applications for eye tracking technologies is the improvement of support systems for people with disabilities, specifically the introduction of systems operated through eye tracking rather than touch. Google hopes to launch Project Glass by early 2014. Depending on the accuracy and precision of the eye tracking technologies in the glasses, Steve McHugh, an engineer at the Boston-based e-commerce company Wayfair, hopes to develop an application that will enable quadriplegic and other disabled persons to control their powered wheelchairs (start/stop, speed, turning) while displaying real-time feedback about their surroundings (dangers, obstacles, suggested routes) [4]. However, the existence of motorised wheelchairs controlled through eye tracking software appears to be an inevitable eventuality and is therefore not dependent on the success of Google Glass in particular, as this budding technology meets a pre-existing need. It opens up the possibility of operating a powered wheelchair to those who, with the products currently available, simply cannot, and it will also enhance safety and comfort for existing powered wheelchair users. The concept of utilising eye tracking to aid people with disabilities is not unprecedented: as described in the case studies, eye tracking technologies are already being used by disabled persons to operate computers and express themselves through music. However, those applications have been concerned with communication. Steve McHugh's proposed application would allow severely dependent wheelchair users a level of unaided physical access to a wider range of environments. This is just one example of an application of eye tracking in a real-world scenario, through a personal consumer device, that could greatly improve the quality of life of a particular demographic; comparable applications could serve almost every demographic.

4.2 HCI

The technology for using eye tracking in conventional WIMP interaction is here now, as demonstrated in the case studies. The next few years will see the development of ideas which utilize personal devices and eye tracking for a number of applications. Eye tracking will provide users with "more spare bandwidth" [8]. For example, Fixational's new project, Wink Reader, will allow users to scroll through the page of a document by looking towards the top or bottom of the screen and to turn the page by winking. This leaves the hands free for other tasks; a chef handling food can read through a recipe on his or her phone or tablet without compromising hygiene. Digital art is another area which will benefit from the advancements made in eye tracking technology. An example of this can be seen in the SideWays project, where users "can just walk up to a display and immediately interact using their eyes, without any prior user calibration or training" [16]. Marketing, design and medical research will continue to benefit from these technological advancements and possibly gain a greater insight into how customers, users and patients interact and perceive visually.

Figure 1: Tobii Glasses [5]

4.3 Impact in the Mass Market

With recent developments in input methods for interactive devices, touch screen and voice recognition technologies have crept into a variety of consumer products. Nearly all new mobile phones and tablets are equipped with touch screens as the main input method, and Apple's Siri and Google Voice have introduced voice recognition to tablet and mobile devices for the global market. Televisions and video game consoles are beginning to utilize gesture and voice control. Technologies that consumers see every day are advancing at an ever-increasing rate. However, the constant, incremental change in how we interact with consumer devices continually prepares us for the acceptance of new, novel interactive approaches; we, as consumers, have adjusted to the concept of continual adaptation. The Samsung Galaxy S4 was rumoured to incorporate eye tracking in its Smart Scroll feature. Press release and development models were revealed to use gaze detection and accelerometers rather than true eye tracking, but the rumours nonetheless increased consumer awareness of eye tracking as a potential input method. The competitive market of consumer electronics is based on supply and demand, and now that users are aware of this possibility, the necessary demand for eye tracking has begun to grow. The first product aimed at meeting this rising demand in the mass market will likely be Google's Project Glass. As mentioned earlier, the benefits for disabled users have been established; however, once this technology reaches global consumers, further potential applications will begin to present themselves. Google, Apple and Microsoft have developed mobile devices to serve as frameworks within which independent developers can innovate solutions to specific needs. As many applications today concern the enhancement of lifestyle for finite demographics, the scope for innovation stretches from globally important issues to weekly fashion fads. For eye tracking, the employment of a similar business model will open development of its applications to an equally broad and varied pool of developers. Only then will its full impact be revealed.

In the EyeHarp and openEyes case studies, the developers used open-source software and low-cost materials to build their systems and have released their work as open-source libraries. These resources provide a head start for future developers of eye tracking applications: the foundations laid by these projects are available for independent software designers to refocus to suit their chosen requirements. Tobii focuses on creating precision eye tracking systems. Unfortunately, the exorbitant price of Tobii's glasses has limited their availability for the mass market; the product is being used primarily for research and by wealthy gadget enthusiasts. Conversely, Google's glasses are aimed at the average mobile device user, and this will most likely be reflected in their price. One would expect the eventual release of a range of glasses with different levels of technological complexity and capability across a relative range of prices, which would increase the accessibility of eye tracking technologies for the average consumer. Fixational have developed an eye tracking engine for iOS devices. This engine is a specialised piece of software that works with hardware that was not designed to accommodate eye tracking applications. With their clever combination of gesture tracking and eye tracking, their products can achieve a high level of precision. However, with the coming advent of eye tracking devices in the consumer market, new hardware designed specifically for tracking the human eye will be able to achieve greater levels of precision, efficiency and control. According to Ronan O'Malley, CEO of Fixational, the fundamental pivot upon which any device that showcases eye tracking technology will succeed or fail is its immediate usability to the uninitiated: "It can be magic, but it has to make sense and feel right. Most consumers won't learn complex awkward controls." [3] This point of view is central to the commercial mass-market mind. The burgeoning demand will be met by developers whose approach accommodates a gentle learning curve. However, what constitutes a gentle learning curve is evolving as quickly as the technologies that require it. For commercial success to be assured, it is important that the learning curve of a particular application is not perceived to be steep.

5. CONCLUSION

This paper explored modern eye tracking technologies and sought to evaluate the scope of their potential applications. The concept of eye tracking was not born in this century; the evolution of this technology initially occurred as a by-product of research to psychological and physiological ends. However, a tipping point has been reached whereby it has become its own field of research. Examination of the case studies highlights the approaches, technologies and techniques employed to date. This analysis illustrates that the bounds of the future applications of eye tracking are indeterminate. What is assured is this technology's integration into the modern professional and private everyday lives of consumers to the point of ubiquity. For some this technology will be essential, as in the case of McHugh's eye-operated motorised wheelchair. For others, it will add a layer of luxury to leisure pursuits. Whatever the purpose, the opening up of these new technologies to this generation of developers is imminent and will enable another burst of technological advancement and integration.

6. REFERENCES

[1] Control your PC with your eyes - Tobii PCEye. http://www.tobii.com/assistivetechnology/global/products/hardware/pceye/.
[2] EyeWriter. http://www.eyewriter.org/.
[3] Final year interview with Ronan O'Malley, CEO of Fixational. http://ruaille.com/post/44794252708/ronomal.
[4] Google Glass app to make wheelchairs eye-controlled proposed by engineer | BostInno. http://bostinno.streetwise.co/2013/03/13/google-glass-app-to-make-wheelchairs-eye-controlled-proposed-by-engineer/.
[5] Tobii showcasing Tobii Glasses at Shopper Insights in Action 2010 conference - Softpedia. http://gadgets.softpedia.com/news/Tobii-Showcasing-Tobii-Glasses-at-Shopper-Insights-in-Action-2010-Conference-10746-01.html.
[6] The Mind's Eye: Cognitive and Applied Aspects of Eye Movement Research. North-Holland, Amsterdam; Boston, 2003.
[7] A. Bulling and H. Gellersen. Toward mobile eye-based human-computer interaction. IEEE Pervasive Computing, 9(4):8-12, 2010.
[8] P. Cook. Principles for designing computer music controllers. In Proceedings of the 2001 Conference on New Interfaces for Musical Expression, 2001.
[9] H. Drewes. Eye gaze tracking for human computer interaction. PhD thesis, LMU Munich, 2010.
[10] A. T. Duchowski. Eye tracking techniques. In Eye Tracking Methodology: Theory and Practice, pages 55-65. Springer, 2003.
[11] C. Ehmke and S. Wilson. Identifying web usability problems from eye-tracking data. In Proceedings of the 21st British HCI Group Annual Conference on People and Computers: HCI... but not as we know it - Volume 1, pages 119-128, 2007.
[12] D. Li, J. Babcock, and D. J. Parkhurst. openEyes: a low-cost head-mounted eye-tracking solution. In Proceedings of the 2006 Symposium on Eye Tracking Research & Applications, ETRA '06, pages 95-100, New York, NY, USA, 2006. ACM.
[13] A. Poole and L. J. Ball. Eye tracking in HCI and usability research. 2006.
[14] Z. Vamvakousis and R. Ramirez. The EyeHarp: An eye-tracking-based musical instrument. In 8th Sound and Music Computing Conference, 2011.
[15] N. J. Wade and B. W. Tatler. Did Javal measure eye movements during reading? 2009.
[16] Y. Zhang, A. Bulling, and H. Gellersen. SideWays: a gaze interface for spontaneous interaction with situated displays. 2013.
