
BlueEyes Technology and its Applications-Research post

BlueEyes and Its Applications


BlueEyes is the name of a human-recognition project initiated by IBM to allow people to interact with computers in a more natural manner. The technology aims to enable devices to recognize and use natural input, such as facial expressions. The initial developments of this project include scroll mice and other input devices that sense the user's pulse, monitor his or her facial expressions, and track the movement of his or her eyelids.
I have prepared a 4-page handout on BlueEyes and its business applications, drawing on several cited sources.
Animal survival depends on highly developed sensory abilities. Likewise, human cognition depends on highly developed abilities to perceive, integrate, and interpret visual, auditory, and touch information. Without a doubt, computers would be much more powerful if they had even a small fraction of the perceptual ability of animals or humans. Adding such perceptual abilities would enable computers and humans to work together more as partners. Toward this end, the BlueEyes project aims at creating computational devices with the sort of perceptual abilities that people take for granted.
How can we make computers "see" and "feel"?
BlueEyes uses sensing technology to identify a user's actions and to extract key information. This information is then analyzed to determine the user's physical, emotional, or informational state, which in turn can be used to help make the user more productive by performing expected actions or by providing expected information. For example, a BlueEyes-enabled television could become active when the user makes eye contact, at which point the user could tell the television to "turn on".
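The sense/analyze/act loop described above can be sketched in a few lines. All function and signal names here are hypothetical; the article does not describe an actual API, so this is only an illustration of the pipeline, not IBM's implementation:

```python
def classify_state(eye_contact: bool, pulse_bpm: int) -> str:
    """Infer a coarse user state from two sensed signals (illustrative)."""
    if not eye_contact:
        return "inattentive"
    return "agitated" if pulse_bpm > 100 else "attentive"

def respond(state: str) -> str:
    """Map the inferred state to an expected device action."""
    actions = {
        "attentive": "activate display, listen for voice command",
        "agitated": "defer non-critical prompts",
        "inattentive": "stay in standby",
    }
    return actions[state]

# A user looking at the device with a calm pulse triggers activation.
print(respond(classify_state(eye_contact=True, pulse_bpm=72)))
```

The point of the sketch is the separation BlueEyes proposes: sensing produces raw signals, analysis reduces them to a state, and the device acts on the state rather than on raw input.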
Most of us hardly notice the surveillance cameras watching over the grocery store or the bank. But lately those lenses have been looking for far more than shoplifters.
Applications:
1. Engineers at IBM's Almaden Research Center in San Jose, CA, report that a number of large retailers have implemented surveillance systems that record and interpret customer movements, using software from Almaden's BlueEyes research project. BlueEyes is developing ways for computers to anticipate users' wants by gathering video data on eye movement and facial expression. Your gaze might rest on a Web site heading, for example, and that would prompt your computer to find similar links and call them up in a new window. But the first practical use for the research turns out to be snooping on shoppers.
BlueEyes software makes sense of what the cameras see to answer key questions for retailers, including: How many shoppers ignored a promotion? How many stopped? How long did they stay? Did their faces register boredom or delight? How many reached for the item and put it in their shopping carts? BlueEyes works by tracking pupil, eyebrow, and mouth movement. When monitoring pupils, the system uses a camera and two infrared light sources placed inside the product display. One light source is aligned with the camera's focus; the other is slightly off axis. When the eye looks into the camera-aligned light, the pupil appears bright to the sensor, and the software registers the customer's attention. This is how the system captures a shopper's interest and buying preferences. BlueEyes is actively being incorporated into some of the leading retail outlets.
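The on-axis/off-axis arrangement can be sketched as a frame-differencing step: the pupil is bright only in the frame lit by the camera-aligned source, so subtracting the off-axis frame isolates it. This is a minimal sketch with synthetic frames; the function name and threshold are assumptions, not part of the BlueEyes software:

```python
import numpy as np

def pupil_attention(on_axis: np.ndarray, off_axis: np.ndarray,
                    threshold: int = 60) -> bool:
    """Bright-pupil detection: the pupil is bright only under the
    camera-aligned light, so the difference image isolates it."""
    diff = on_axis.astype(int) - off_axis.astype(int)
    return bool((diff > threshold).any())

# Synthetic grayscale frames: a bright pupil appears only on-axis.
off = np.full((8, 8), 20)
on = off.copy()
on[3:5, 3:5] = 200  # bright pupil region
print(pupil_attention(on, off))  # True: attention registered
```

Real systems would additionally locate the pupil's position in the difference image to estimate where the shopper is looking, but the bright-pupil contrast is the core trick the article describes.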
2. Another application would be in the automobile industry. By simply touching a computer input device such as a mouse, the computer system is designed to be able to determine a person's emotional state. In cars, this could help with critical decisions, for example: "I know you want to get into the fast lane, but I'm afraid I can't do that. You're too upset right now," and thereby assist in driving safely.
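The gating logic implied by that example is simple to express. The stress score and threshold below are invented for illustration; the article does not specify how the emotional state would be quantified:

```python
def lane_change_advice(stress_level: float, threshold: float = 0.7) -> str:
    """Gate a driving action on the driver's inferred emotional state.
    stress_level (0..1) is assumed to come from a touch sensor in the
    input device; both the scale and threshold are hypothetical."""
    if stress_level > threshold:
        return "Lane change deferred: driver appears too upset."
    return "Lane change assistance engaged."

print(lane_change_advice(0.9))
```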
3. Current interfaces between computers and humans can present information vividly, but have no sense of whether that information is ever viewed or understood. In contrast, new real-time computer-vision techniques for perceiving people allow us to create "face-responsive displays" and "perceptive environments" that can sense and respond to the users viewing them. Using stereo-vision techniques, we are able to detect, track, and identify users robustly and in real time. This information can make a spoken-language interface more robust by selecting the acoustic information from a visually localized source. Environments can become aware of how many people are present and what activity is occurring, and therefore of which display or messaging modalities are most appropriate in the current situation. The results of our research will allow the interface between computers and human users to become more natural and intuitive.
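The "choose a modality from sensed context" idea can be illustrated with a small heuristic. The categories and rules here are assumptions made up for the example, not taken from the research described:

```python
def choose_modality(people_count: int, activity: str) -> str:
    """Pick an output modality from sensed context (illustrative heuristic)."""
    if people_count == 0:
        return "none"
    if activity == "conversation":
        return "quiet visual display"  # avoid interrupting ongoing speech
    return "spoken output" if people_count == 1 else "shared wall display"

print(choose_modality(1, "working"))
```

However the rules are chosen, the structural point stands: the environment's sensed state, not a user command, selects how information is delivered.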
4. We could also see its use in video games, where it could pose individual challenges to each player, an application typically targeting commercial business.
The integration of children's toys, technologies, and computers is enabling new play experiences that were not commercially feasible until recently. The Intel Play QX3 Computer Microscope, the Me2Cam with Fun Fair, and the Computer Sound Morpher are commercially available smart-toy products developed by the Intel Smart Toy Lab. One theme that is common across these PC-connected toys is that users interact with them using a combination of visual, audible, and tactile input and output modalities. The presentation will provide an overview of the interaction design of these products and pose some unique challenges faced by designers and engineers of such experiences targeted at novice computer users, namely young children.
5. The familiar and useful come from things we recognize. The appearance of many of our favorite things communicates their use; they show the change in their value through patina. As technologists we are now poised to imagine a world where computing objects communicate with us in situ, where we are. We use our looks, feelings, and actions to give the computer the experience it needs to work with us. Keyboards and mice will not continue to dominate computer user interfaces. Keyboard input will be replaced in large measure by systems that know what we want and require less explicit communication. Sensors are gaining the fidelity and ubiquity to record presence and actions; they will notice when we enter a space, sit down, lie down, pump iron, etc., and pervasive infrastructure is recording it. This talk will cover projects from the Context Aware Computing Group at the MIT Media Lab.
A researcher at Stanford has created an alternative to the mouse that allows a person using a computer to click links, highlight text, and scroll simply by looking at the screen and tapping a key on the keyboard. By using standard eye-tracking hardware--a specialized computer screen with a high-definition camera and infrared lights--Manu Kumar, a doctoral student who works with computer-science professor Terry Winograd, has developed a novel user interface that is easy to operate. "Eye-tracking technology was developed for disabled users," Kumar explains, "but the work that we're doing here is trying to get it to a point where it becomes more useful for able-bodied users." He says that nondisabled users tend to have a higher standard for easy-to-use interfaces, and previously, the eye-tracking technology that disabled people use hasn't appealed to them.
At the heart of Kumar's technology is software called EyePoint that works with standard eye-tracking hardware. The software uses an approach that requires a person to look at a Web link, for instance, and hold a "hot key" on the keyboard (usually found on the number pad on the right) while looking. The area of the screen that's being looked at becomes magnified. Then the person pinpoints her focus within the magnified region and releases the hot key, effectively clicking through to the link.
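The look-press-magnify-release sequence can be sketched as a coordinate mapping: magnifying around the initial gaze point lets the noisy second fixation be divided back down, shrinking the effective error. This is a simplified model of the interaction, not EyePoint's actual code; the function name and linear mapping are assumptions:

```python
def eyepoint_click(initial_gaze, refined_gaze, magnification=4):
    """Sketch of the EyePoint flow:
    1. User looks at a target and presses the hot key (initial_gaze).
    2. The region around initial_gaze is magnified on screen.
    3. User refines focus inside the magnified view (refined_gaze).
    4. On key release, map the refined point back to screen coordinates,
       dividing the gaze offset by the magnification factor."""
    mx, my = initial_gaze
    rx, ry = refined_gaze
    click_x = mx + (rx - mx) / magnification
    click_y = my + (ry - my) / magnification
    return (click_x, click_y)

print(eyepoint_click((400, 300), (420, 310)))  # (405.0, 302.5)
```

The usability win is in step 4: a 20-pixel gaze error inside the magnified view collapses to 5 pixels on the real screen, which is why magnification plus a hand-held key beats gaze alone.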
Kumar's approach could take eye-tracking user interfaces in the right direction. Instead of designing a common type of gaze-based interface that is controlled completely by the eyes--for instance, a system in which a user gazes at a given link, then blinks in order to click through--he has involved the hand, which makes the interaction more natural. "He's got the right idea to let the eye augment the hand," says Robert Jacob, professor of computer science at Tufts University, in Medford, MA.
Rudimentary eye-tracking technology dates back to the early 1900s. Using photographic film, researchers captured reflected light from subjects' eyes and used the information to study how people read and look at pictures. But today's technology involves a high-resolution camera and a series of infrared light-emitting diodes. This hardware is embedded into the bezel of expensive monitors; the one Kumar uses cost $25,000. The camera picks up the movement of the pupil and the reflection of the infrared light off the cornea, which is used as a reference point because it doesn't move.
Even the best eye tracker isn't perfect, however. "The eye is not really very stable," says Kumar. Even when a person is fixated on a point, the pupil jitters. So he wrote an algorithm that allows the computer to smooth out the eye jitters in real time. The rest of the research, says Kumar, involves studying how people look at a screen and figuring out a way to build an interface that "does not overload the visual channel." In other words, he wanted to make its use feel natural to the user.
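The article does not describe Kumar's smoothing algorithm, but a common way to damp fixation jitter in real time is an exponential moving average over the gaze samples. This is a generic sketch under that assumption, not his method:

```python
def smooth_gaze(samples, alpha=0.3):
    """Exponentially smooth noisy (x, y) gaze samples in real time.
    alpha near 0 = heavy smoothing (laggy); near 1 = trust raw samples."""
    sx, sy = samples[0]
    out = [(sx, sy)]
    for x, y in samples[1:]:
        # Each new estimate blends the fresh sample with the running one,
        # so isolated jitters are damped without buffering future frames.
        sx = alpha * x + (1 - alpha) * sx
        sy = alpha * y + (1 - alpha) * sy
        out.append((sx, sy))
    return out

jittery = [(100, 100), (104, 97), (98, 103), (101, 99)]
print(smooth_gaze(jittery)[-1])
```

Because each output depends only on the previous estimate and the current sample, the filter runs per-frame with no lookahead, which is what a live cursorless interface needs.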
One of the important features of the interface, says Kumar, is that it works without a person needing to control a cursor. Unlike the mouse-based system in ubiquitous use today, EyePoint provides no feedback on where a person is looking. Previous studies have shown that it is distracting to a person when she is aware of her gaze, because she consciously tries to control its location. In the usability studies that Kumar conducted, he found that people's performance dropped when he implemented a blue dot that followed their eyes.
