
EMOTION RECOGNITION

USING FACIAL EXPRESSIONS

Guided by: Remya Krishna J.S, Asst. Professor, Dept. of ECE
Presented by: Jismy Jelson, LTJE16EC028, S7 ECE
CONTENTS
• Introduction
• Emotions
• Working
• Classifiers
• Advantages
• Disadvantages
• Applications
• Future Scope
• Conclusion
• References
INTRODUCTION

• Emotion recognition is the process of identifying human emotions, most typically from facial expressions, but also from verbal expressions.
• Extracting and understanding emotions is highly important for human-machine interaction.
WHAT IS EMOTION?
HISTORY
• Research in this area began with Silvan Tomkins in 1964.
• In 1978, Paul Ekman introduced the Facial Action Coding System (FACS).
• Recognition is based on Paul Ekman's six basic emotions: happiness, sadness, anger, fear, surprise, and disgust.
COMPONENTS USED

• Camera
• Microsoft Kinect 3D software
• Classifier
WORKING

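The original slide presents the working of the system as a figure. As a rough sketch of how such a pipeline is commonly wired up in Python with OpenCV (face detection feeding the feature-extraction and classification stages covered on the following slides); the input file name and detector parameters are illustrative assumptions, not part of the original system:

```python
import cv2

# Haar cascade face detector that ships with OpenCV.
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("frame.png")  # hypothetical camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

for (x, y, w, h) in cascade.detectMultiScale(gray, scaleFactor=1.1,
                                             minNeighbors=5):
    face = gray[y:y + h, x:x + w]  # cropped face region
    # features = extract_features(face)       # see feature-extraction slides
    # emotion = classifier.predict(features)  # see classifier slides
```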
FEATURE EXTRACTION

• Appearance Features
• Geometric Features
GEOMETRIC FEATURE EXTRACTION

• A three-state lip model describes the lip state: open, closed, or tightly closed. A two-state model (open or closed) is used for each of the eyes. Each brow and cheek has a one-state model.
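As an illustration of these state models, the minimal sketch below derives lip and eye states from facial landmark coordinates. The landmark indices (following the common 68-point layout) and the pixel thresholds are illustrative assumptions, not values from the original work:

```python
import numpy as np

# Indices follow the common 68-point landmark layout; thresholds are
# illustrative assumptions and would be tuned on real data.
UPPER_LIP, LOWER_LIP = 62, 66
LEFT_EYE_TOP, LEFT_EYE_BOTTOM = 37, 41

def lip_state(pts, open_t=8.0, tight_t=2.0):
    """Three-state lip model: open / closed / tightly closed."""
    gap = np.linalg.norm(pts[UPPER_LIP] - pts[LOWER_LIP])
    if gap > open_t:
        return "open"
    return "closed" if gap > tight_t else "tightly closed"

def eye_state(pts, open_t=4.0):
    """Two-state eye model: open / closed."""
    gap = np.linalg.norm(pts[LEFT_EYE_TOP] - pts[LEFT_EYE_BOTTOM])
    return "open" if gap > open_t else "closed"

pts = np.random.rand(68, 2) * 100  # stand-in for real landmark detector output
print(lip_state(pts), eye_state(pts))
```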
APPEARANCE FEATURE EXTRACTION

• Gabor wavelets are widely used to extract facial appearance changes as a set of multi-scale and multi-orientation coefficients. The Gabor filter may be applied to specific locations on the face or to the whole face image.
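A minimal sketch of such a multi-scale, multi-orientation Gabor filter bank using OpenCV; the kernel sizes and filter parameters are assumptions chosen for illustration:

```python
import cv2
import numpy as np

def gabor_features(gray_face, scales=(7, 15), n_orient=4):
    """Apply a small Gabor bank (every scale x orientation) and keep the
    mean absolute response of each filter as one coefficient."""
    feats = []
    for ksize in scales:
        for i in range(n_orient):
            theta = i * np.pi / n_orient
            kern = cv2.getGaborKernel((ksize, ksize), sigma=ksize / 3.0,
                                      theta=theta, lambd=ksize / 2.0,
                                      gamma=0.5, psi=0)
            resp = cv2.filter2D(gray_face, cv2.CV_32F, kern)
            feats.append(np.abs(resp).mean())
    return np.array(feats)

face = np.random.randint(0, 256, (96, 96), dtype=np.uint8)  # stand-in face crop
print(gabor_features(face).shape)  # (len(scales) * n_orient,) = (8,)
```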
ACTION UNITS

• AUs are considered to be the smallest visually discernible facial movements.
• As AUs are independent of any interpretation, they can be used as the basis for recognizing basic emotions.
• However, both the timing and the duration of the various AUs are important for interpreting human facial behavior.
• FACS is an unambiguous means of describing all possible facial movements in terms of 46 action units (a sketch mapping AU combinations to emotions follows below).
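To make the AU-to-emotion idea concrete, the sketch below encodes a few commonly cited AU combinations for the basic emotions; the exact combinations vary across the FACS literature, so treat these sets as illustrative assumptions:

```python
# Commonly cited AU combinations for some of Ekman's basic emotions
# (illustrative; exact sets vary across the FACS literature).
EMOTION_AUS = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "anger":     {4, 5, 7, 23},  # brow lowerer + lid/lip tighteners
}

def match_emotions(detected_aus):
    """Return every emotion whose defining AUs are all present."""
    detected = set(detected_aus)
    return [e for e, aus in EMOTION_AUS.items() if aus <= detected]

print(match_emotions([1, 2, 5, 26]))  # -> ['surprise']
```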
[Figures: examples of appearance features and geometric features]
CLASSIFIERS

K-NN CLASSIFIER
• K-Nearest Neighbors
• Simple algorithm
• Nonlinear classifier
• 96% accuracy

MLP CLASSIFIER
• Multilayer Perceptron
• Consists of 3 layers
• A class of artificial neural network
• 90% accuracy
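For concreteness, a minimal scikit-learn sketch of training both classifiers on pre-extracted feature vectors. The synthetic data is a stand-in, so the accuracies it prints are unrelated to the 96% / 90% figures above, which come from the presented work:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier

# Synthetic stand-in for Gabor/geometric feature vectors, 6 emotion labels.
X, y = make_classification(n_samples=600, n_features=32, n_informative=12,
                           n_classes=6, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
mlp = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000,
                    random_state=0).fit(X_tr, y_tr)

print("k-NN accuracy:", knn.score(X_te, y_te))
print("MLP accuracy:", mlp.score(X_te, y_te))
```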
KNN CLASSIFIER

[Figures: illustration of the k-nearest neighbors algorithm]
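The algorithm the figures illustrate can be written in a few lines; a minimal sketch with toy 2-D feature vectors (the data points and labels are made up for illustration):

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Classic k-NN: find the k closest training points (Euclidean
    distance) and return the majority label among them."""
    dists = np.linalg.norm(X_train - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return Counter(y_train[i] for i in nearest).most_common(1)[0][0]

X_train = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0], [6.0, 5.0]])
y_train = np.array(["sad", "sad", "happy", "happy"])
print(knn_predict(X_train, y_train, np.array([5.0, 6.0])))  # -> happy
```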
KNN CLASSIFIER (cont.)

ADVANTAGES
• Very simple
• Easy to understand
• Effective

DISADVANTAGES
• Choosing k is tricky
• Computationally expensive at prediction time
• No training stage (all work is deferred to prediction)
• Sensitive to noise
ADVANTAGES

• Lower complexity
• Less computationally demanding
• Sensitive
• Real-time monitoring
• More secure
DISADVANTAGES
• Pose and frequent head movements
• Presence of structural components
• Occlusion
• Image orientation
• Imaging conditions
• Subtle facial deformations
• Ambiguity and uncertainty in face motion measurement
APPLICATIONS
• Predictive environments (Ambient Intelligence)
• More human-like human-computer and human-robot interaction (e.g., emotional avatars)
• Emotional mirror (Affective Computing)
• Treatment for people with psycho-affective illnesses (e.g., autism)
• Distance learning
• Robotics
• Emotion recognition in health care
• Automotive industry and emotion recognition
• Emotion recognition in video game testing
• Education
SELF DRIVING
FUTURE SCOPE
• Handle partial occlusions better.
• Make the system more robust (e.g., to lighting conditions).
• Make it more person-independent (fit the face mask automatically).
• Use other classifiers (e.g., ones that model dynamics).
• Apply emotion recognition in applications, for example games.
CONCLUSION
• Facial emotion recognition still faces many challenges; using a K-NN classifier can reduce some of them.
• The technology is still developing.
• Head movements affect emotion recognition accuracy.
REFERENCES
• International Conference on Computational Science (ICCS 2017), 12-14 June 2017, Zurich, Switzerland.
• P. Ekman, Emotions Revealed: Recognizing Faces and Feelings to Improve Communication and Emotional Life, Holt, 2003.
• M. S. Ratliff and E. Patterson, "Emotion recognition using facial expressions with active appearance models," Proceedings of the Third IASTED International Conference on Human Computer Interaction, ACTA Press, Anaheim, CA, USA, 2008, pp. 138-143.
• etc.