Muchun Su1,2, Chinyen Yeh1, Shihchieh Lin1, Pachun Wang3, Shawmin Hou3
1 Department of Computer Science & Information Engineering, National Central University, Taiwan, R.O.C.
2 Graduate Institute of Biomedical Engineering, National Central University, Taiwan, R.O.C.
3 Cathay General Hospital, Taiwan, R.O.C.
E-mail: muchun@csie.ncu.edu.tw
present image frame and used as the first four "templates" (as shown in Fig. 4) to determine the possible position of the eye region in the next image frame. Since the illumination conditions and the distance between the camera and the user may vary during operation, the templates should be updated in accordance with these environmental changes. Otherwise, the templates may become outdated and incorrect eye regions may be located.
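The update-and-recovery behaviour described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the class name `TemplateTracker`, the score threshold, and the miss limit (a stand-in for the θ_md re-initialization rule described later in this section) are all assumed for the example.

```python
MATCH_THRESHOLD = 0.8  # assumed: minimum match score to accept a location
MISS_LIMIT = 5         # assumed: stand-in for the theta_md miss threshold


class TemplateTracker:
    """Keeps the four eye-region templates fresh as conditions change."""

    def __init__(self, templates):
        self.templates = list(templates)  # the four current templates
        self.misses = 0                   # consecutive failed matches

    def step(self, matched_patch, score):
        """Update state after one frame.

        Returns 'ok' when the match is accepted, 'search_wider' when
        the eye region should be searched in a larger area, and
        'reinitialize' when the user should blink to capture four
        fresh templates.
        """
        if score >= MATCH_THRESHOLD:
            self.misses = 0
            # Replace the oldest template with the newly matched patch
            # so the set slowly tracks illumination/distance changes.
            self.templates.pop(0)
            self.templates.append(matched_patch)
            return "ok"
        self.misses += 1
        if self.misses > MISS_LIMIT:
            self.misses = 0
            return "reinitialize"
        return "search_wider"
```

On a confident match the oldest template is replaced, so the template set slowly follows environmental changes; repeated failures fall back first to a wider search and eventually to capturing four fresh templates.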
Fig. 3 The "voice messages" selection and its subsequent selections

In fact, an intuitive and simple approach to generating the template for the eye region is to use a box around the center of the working eye region. However, in many experiments we found that the performance of this simple template was not as high as we expected. One possible reason is that the updated template may gradually lose its representativeness of the pupil and therefore detect wrong regions. That is why we adopted the four templates, from which good results could be expected.
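The four-template idea can be made concrete with a small sketch. This is an illustrative reimplementation, not the paper's code: the function names, the search radius, and the use of plain normalized cross-correlation as the matching score are assumptions.

```python
import numpy as np


def ncc(patch, template):
    """Normalized cross-correlation between two equal-sized patches."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.sqrt((p * p).sum() * (t * t).sum())
    return float((p * t).sum() / denom) if denom > 0 else 0.0


def match_four_templates(frame, templates, prev_pos, search_radius=8):
    """Search a window around the previous eye position and return the
    location whose patch maximizes the *average* NCC over all four
    templates, so a single degraded template cannot pull the match
    away from the pupil."""
    th, tw = templates[0].shape
    best_score, best_pos = -1.0, prev_pos
    y0, x0 = prev_pos
    for dy in range(-search_radius, search_radius + 1):
        for dx in range(-search_radius, search_radius + 1):
            y, x = y0 + dy, x0 + dx
            if y < 0 or x < 0 or y + th > frame.shape[0] or x + tw > frame.shape[1]:
                continue
            patch = frame[y:y + th, x:x + tw]
            score = sum(ncc(patch, t) for t in templates) / len(templates)
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score
```

Averaging the scores over all four templates is one simple way to keep a single outdated template from dominating the match, which is the failure mode described above for the single-template approach.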
move in a similar manner. A pyramidal implementation of a hierarchical optical flow method [34]-[35] is used to automatically track the 25 anchor points uniformly distributed in each rectangle-shaped template. If the average moving length of the 25 anchor points is larger than a threshold, θ_of, then a blink motion is claimed to be detected. One complete eye blink involves two motions: an open-to-closed motion followed by a closed-to-open motion.

An example of the average motion length across time is depicted in Fig. 5, where two consecutive peaks represent an eye blink. Moreover, the number of frames lying between the two peaks may serve as an indication of whether a blink was voluntary: a prolonged blink with more than three frames between the two consecutive peaks indicates a voluntary blink. If the average motion length is less than the threshold θ_of, then no blink is detected, and we need to go back to the previous step to search for the eye region in a larger area. If the frequency of miss-detections of the eye region is higher than a threshold, θ_md, then the system automatically returns to the first step and asks the user to blink voluntarily to generate another four new templates.

Fig. 5 An example of the average motion length across time; two consecutive peaks indicate a complete eye blink (peaks labelled: voluntary blink, voluntary blink, involuntary blink, involuntary blink)

3. Experimental results

3.1 Eye blink detection test

The experiment was conducted to test whether the eye blink detection algorithm can successfully detect eye blinks under different conditions. Four subjects were asked to blink, and each one was recorded under two different lighting conditions. We collected a data set consisting of 8 image sequences taken under varying lighting conditions. The sequences were manually examined offline to determine when and how many eye blinks happened. The total numbers of frames and eye blinks in these testing sequences were 560 and 40, respectively. The experimental results show that the detection success rate could reach 97.5%.

3.2 Typing test

The subjects were asked to use a scanning spelling program to type "ci lab". The program organizes the alphabet into 3 groups, and each group contains 6 rows. In this spelling program, each scan takes 2 seconds. The six keystrokes require 18 selections. Without any error in detecting blinks, the typing task would take 98 seconds in total. The average typing time across the subjects was 114.5 seconds, which indicates that some blinks were miss-detected, so the program took extra time to jump back to previous layers. The experimental results show that the success rate can reach 94.75%.

4. Conclusions

In this paper, an implementation of a low-cost eye-blink-based communication aid for ALS patients is presented. Experimental results show that it enables people to operate a computer by blinking their eyes.

5. Acknowledgements

This paper was partly supported by the 96CGH-NCU-A3 and by the National Science Council, Taiwan, R.O.C., under the NSC-96-2221-E-008-017, the NSC-96-2752-E-008-002-PAE, the NSC-96-2524-S-008-002, and the NSC-96-2422-H-008-001.

6. References

[1] W. J. Perkins and B. F. Stenning, "Control units for operation of computers by severely physically handicapped persons," J. Med. Eng. Technol., vol. 10, no. 1, 1986, pp. 21-23.

[2] O. Takami, N. Irie, C. Kang, T. Ishimatsu, and T. Ochiai, "Computer interface to use head movement for handicapped people," in Proc. IEEE TENCON'96, Digital Signal Processing Applications, vol. 1, 1996, pp. 468-472.

[3] D. G. Evans, R. Drew, and P. Blenkhorn, "Controlling mouse pointer position using an infrared head-operated joystick," IEEE Trans. on Rehabilitation Engineering, vol. 8, no. 1, 2000, pp. 107-117.

[4] Y. L. Chen, F. T. Tang, W. H. Chang, M. K. Wong, Y. Y. Shih, and T. S. Kuo, "The new design of an infrared-controlled human-computer interface for the disabled," IEEE Trans. on Rehabilitation Engineering, vol. 7, Dec. 1999, pp. 474-481.
[5] R. B. Reilly and M. J. O'Malley, "Adaptive noncontact gesture-based system for augmentative communication," IEEE Trans. on Rehabilitation Engineering, vol. 7, no. 2, 1999, pp. 174-182.

[6] M. C. Su, W. C. Cheng, P. Z. Chang, L. Z. Chang, Y. W. Huang, and C. Y. Tew, "A simple and inexpensive telephone dialing aid for the disabled," IEE Computing & Control Engineering Journal, vol. 11, no. 2, April 2000, pp. 73-78.

[7] M. C. Su, C. Y. Chen, S. Y. Su, C. H. Chou, H. F. Hsiu, and Y. C. Wang, "A Portable Communication Aid for Deaf-Blind People," IEE Computing & Control Engineering Journal, vol. 12, no. 1, February 2001, pp. 37-43.

[8] M. C. Su, Y. H. Lee, C. H. Wu, S. Y. Su, and Y. X. Zhao, "Two Low-Cost Human Computer Interfaces for People with Severe Disabilities," Biomedical Engineering - Applications, Basis & Communications, vol. 16, no. 6, Dec. 25, 2004, pp. 344-349.

[9] Eye-Trace System, Permobil Meditech AB, Timra, Sweden, http://www.algonet.se/~eyetrace.

[10] L. Young and D. Sheena, "Survey of eye movement recording methods," Behav. Res. Meth. Instrum., vol. 7, no. 5, 1975, pp. 397-429.

[11] T. Hutchinson, K. P. White Jr., W. N. Martin, K. C. Reichert, and L. A. Frey, "Human-computer interaction using eye-gaze input," IEEE Trans. Systems, Man, Cybernetics, vol. 19, no. 6, 1989, pp. 1527-1533.

[12] G. A. Rinard, R. W. Mateson, R. W. Quine, and R. S. Tegtmeyer, "An infrared system for determining ocular position," ISA Trans., vol. 19, no. 4, 1980, pp. 3-6.

[13] C. H. Morimoto, D. Koons, A. Amit, M. Flickner, and S. Zhai, "Keeping an eye for HCI," in Proc. XII Brazilian Symp. Computer Graphics and Image Processing, 1999, pp. 171-176.

[14] D. Kumar and E. Poole, "Classification of EOG for human computer interface," in Proc. Second Joint EMBS/BMES Conference, vol. 1, Oct. 2002, pp. 23-26.

[15] J. R. LaCourse and F. C. Hludik Jr., "An eye movement communication-control system for the disabled," IEEE Trans. on Biomedical Engineering, vol. 37, no. 12, 1990, pp. 1215-1220.

[16] P. DiMattia, F. X. Curran, and J. Gips, An Eye Control Teaching Device for Students Without Language Expressive Capacity: EagleEyes, Lampeter, U.K.: Edwin Mellen, 2001.

[17] G. Norris and E. Wilson, "The eye mouse: an eye communication device," in Proc. IEEE 23rd Northeast Bioengineering Conference, May 1997, pp. 66-67.

[18] Y. Tomita, Y. Igarashi, S. Honda, and N. Matsuo, "Electro-Oculography Mouse for Amyotrophic Lateral Sclerosis Patients," in Proc. 18th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 1996, pp. 1780-1781.

[19] K. S. Park and K. T. Lee, "Eye-controlled human/computer interface using the line-of-sight and the intentional blink," Computer Engineering, vol. 30, no. 3, 1996, pp. 463-473.

[20] M. Betke, J. Gips, and P. Fleming, "The camera mouse: visual tracking of body features to provide computer access for people with severe disabilities," IEEE Trans. on Neural Systems and Rehabilitation Engineering, vol. 10, no. 1, 2002, pp. 1-10.

[21] K. Grauman, M. Betke, J. Gips, and G. R. Bradski, "Communication via eye blinks - detection and duration analysis in real time," in Proc. CVPR 2001, 2001, pp. I-1010-1017.

[22] T. N. Bhaskar, F. T. Keat, S. Ranganath, and Y. V. Venkatesh, "Blink detection and eye tracking for eye location," in Proc. TENCON 2003, pp. 821-824.

[23] M. C. Su, S.-Y. Su, and G.-D. Chen, "A low cost vision-based human-computer interface for people with severe disabilities," Biomedical Engineering - Applications, Basis & Communications, vol. 17, no. 6, 2005, pp. 284-292.

[24] M. C. Su, S.-Y. Su, and G.-D. Chen, "A low cost vision-based human-computer interface for people with severe disabilities," Biomedical Engineering - Applications, Basis & Communications, vol. 17, no. 6, 2005, pp. 284-292.

[25] T. Brandt, R. Stemmer, and A. Rakotonirainy, "Affordable visual driver monitoring system for fatigue and monotony," in Proc. IEEE International Conference on Systems, Man and Cybernetics, vol. 7, 2004, pp. 6451-6456.

[26] K. Fukuda, J. A. Stern, T. B. Brown, and M. B. Russo, "Cognition, blinks, eye-movements, and pupillary movements during performance of a running memory task," Aviation, Space, and Environmental Medicine, vol. 76, July 2005, pp. C75-C85.

[27] M. F. Funada, S. P. Ninomija, S. Suzuki, K. Idogawa, Y. Yam, and H. Ide, "On an image processing of eye blinking to monitor awakening levels of human beings," in Proc. 18th Annual International Conference of the Engineering in Medicine and Biology Society, vol. 3, 1996, pp. 966-967.

[28] Q. Ji, Z. Zhu, and P. Lan, "Real-time nonintrusive monitoring and prediction of driver fatigue," IEEE Transactions on Vehicular Technology, vol. 53, no. 4, July 2004, pp. 1052-1068.

[29] H. Lim and V. K. Singh, "Design of healthcare system for disable person using eye blinking," in Proc. Fourth Annual ACIS International Conference on Computer and Information Science, 2005, pp. 551-555.

[30] P. Smith, M. Shah, and N. da Vitoria Lobo, "Determining driver visual attention with one camera," IEEE Transactions on Intelligent Transportation Systems, vol. 4, no. 4, December 2003, pp. 205-218.

[31] R. Heishman and Z. Duric, "Using image flow to detect eye blinks in color videos," in Proc. IEEE Workshop on Applications of Computer Vision, 2007.

[32] M. J. Black, D. J. Fleet, and Y. Yacoob, "A framework for modeling appearance change in image sequences," in Proc. International Conference on Computer Vision, 1998.

[33] T. N. Bhaskar, F. T. Keat, S. Ranganath, and Y. V. Venkatesh, "Blink detection and eye tracking for eye location," in Proc. Conference on Convergent Technologies for Asia-Pacific Region, vol. 2, 2003, pp. 821-824.

[34] B. D. Lucas and T. Kanade, "An investigation of smoothness constraints for the estimation of displacement vector fields from image sequences," IEEE Trans. on Pattern Analysis and Machine Intelligence, vol. 8, 1986, pp. 565-593.

[35] J.-Y. Bouguet, "Pyramidal Implementation of the Lucas Kanade Feature Tracker: Description of the Algorithm," Intel Corporation Microprocessor Research Labs.