
2013 5th International Conference on Information and Communication Technology for the Muslim World.

Classification of EEG Signals Using MLP based on Categorical and Dimensional Perceptions of Emotions

Hamwira Yaacob
Kulliyyah of Information & Communication Technology, International Islamic University Malaysia, P.O. Box 10, 50728 Kuala Lumpur, Malaysia. hyaacob@iium.edu.my

Abdul Wahab
Kulliyyah of Information & Communication Technology, International Islamic University Malaysia, P.O. Box 10, 50728 Kuala Lumpur, Malaysia. abdulwahab@iium.edu.my

Norhaslinda Kamaruddin
Faculty of Computer and Mathematical Sciences, MARA University of Technology (UiTM), 40400 Shah Alam, Selangor, Malaysia. norhaslinda@tmsk.uitm.edu.my

Abstract: Emotions are frequently studied based on two approaches: categorical and dimensional. In this study, a Multi-Layer Perceptron (MLP) was employed to classify four affective states as posited by these approaches. It was observed that emotional states viewed from the dimensional perspective are well discriminated in the memory test. In addition, the dynamics of each of the four emotions are also presented, indicating that an emotional state does not occur abruptly.

Keywords: Emotion recognition; EEG; discrete emotion

I. INTRODUCTION

Emotions have traditionally been researched within the psychology discipline. According to the theories of basic emotions, emotions are categorical. From the dimensional perspective, on the other hand, different emotions are posited to be explained by the same set of features.

Due to the importance of emotions [1][2][3][4][5][6][7], several modalities [8] have been identified that respond to emotional stimuli, including brain discharge. Electroencephalography (EEG) is widely employed to capture brain discharge during neural activities. In many studies, a supervised learning approach is adopted to learn and classify the brain signals using various machine learning algorithms, including back-propagation neural networks [9], fuzzy logic [10] and agent learning [11].

A. Theories of Emotions

In most studies, emotions are commonly viewed as either categorical or dimensional. Many models have been produced to support each hypothesis.

1) Categorical approach: The categorical approach proposes that humans are instilled with a set of basic emotions. In other words, each emotion is governed by a unique system that produces distinct psychological and physiological manifestations. Over the years, different researchers have come up with different lists of basic emotions, identified as the byproduct of various stimulation modalities. For example, anger, disgust, fear, happiness, sadness and surprise were identified through visual stimuli, as described in [12]. In [13], acceptance, anger, anticipation, disgust, joy, fear, sadness and surprise were extracted through facial expressions. Many other works have produced different sets of basic emotions [14][15].

2) Dimensional approach: From a different perspective, emotions are also postulated to be organized along a few fundamental dimensions that compose an affective space model (ASM). Historically, the dimensions of an ASM were derived from evaluations of semantic differential scales, as implemented in [16]. Semantic differential is one of the techniques used to measure subjective variables. In a classical semantic differential, adjective-word pairs were grouped into three major dimensions of word meaning, namely the evaluative, potency and activity dimensions. Each of the subjective variables was evaluated on an arbitrary scale associated with each pair of bipolar adjective words.

Fig. 1. Circumplex model of affect from [5], whereby valence is scaled on the x-axis and arousal on the y-axis


As proposed in [17], a two-dimensional ASM (also referred to as the circumplex model of affect) consists of valence and arousal, denoting the x-axis and the y-axis, respectively, in a Cartesian coordinate system. A particular emotional state is plotted based on the level of valence and the intensity of arousal, as portrayed in Fig. 1.

B. Electroencephalogram (EEG)

The electroencephalogram (EEG) is an imaging tool that captures occurrences of electrical activation. The human brain is composed of a massive number of interconnected neurons that process sensory signals received from other parts of the body and produce appropriate responses.

A neuron transmits electrical impulses containing messages to the neighbouring neurons, from the source of stimulation to the cerebral cortex. Electrical impulses from the neighbouring neuron groups that reach the cerebral cortex can be captured by sensory electrodes placed on the scalp. After amplification of the electrical potentials, EEG signals are measured as the potential differences between pairs of electrodes [18], which are placed in accordance with one of the international standard systems for electrode positioning.

There are many advantages of using EEG as a neuroimaging technique [19][20]. Compared to other tools, such as functional magnetic resonance imaging (fMRI), magnetoencephalography (MEG), magnetic resonance imaging (MRI), magnetic resonance spectroscopy (MRS), near-infrared spectroscopy/optical imaging (NIRS), positron emission tomography (PET) and single photon emission computed tomography (SPECT), EEG produces signals with higher temporal resolution. Moreover, EEG is considered a safe and non-invasive apparatus. Due to its insensitivity to motion, EEG is considered easy to use with developmental populations, such as infants and children.

Brainwaves represented by EEG signals are commonly divided into four bands: delta, theta, alpha and beta. Each frequency band has been observed as the product of different brain tasks, some of which are depicted in Table I.
TABLE I. FOUR BASIC EEG BANDS AND DESCRIPTIONS

Band  | Frequency      | Descriptions
Delta | 0.5 Hz - 4 Hz  | Characteristic of deep sleep phases [21][22]
Theta | 4 Hz - 8 Hz    | Drowsiness and fatigue due to monotonous tasks [23]; control of working memory processes [24][25]; associated with emotion (arousal [26]; approach-avoidance [27])
Alpha | 8 Hz - 13 Hz   | Cognitive control [28]; creative thinking [29]; associated with emotion (disorders [30]; recognition [31])
Beta  | 13 Hz - 30 Hz  | Alertness [32]; phonological tasks [33]

C. Multi Layer Perceptron (MLP)

The multi-layer perceptron (MLP) is an artificial neural network technique inspired by the way neurons work in the brain. Using this technique, a model is constructed by learning from training data.

An artificial neural network adopting MLP consists of three layers. Each layer contains a number of nodes, replicating neurons. The first layer is the input layer, containing as many nodes as there are features in the training data set. The second layer may consist of numerous hidden layers; the number of hidden layers and the units they contain are arbitrary. The third layer is the output layer.

For example, the artificial neural network depicted in Fig. 2 consists of one input layer, one hidden layer and one output layer. The input layer contains four nodes. Each of these nodes is connected to four nodes in the hidden layer, which are in turn connected to the node in the output layer. The connecting lines carry weights, which represent the amount of electrical impulse passed between neurons. The weights are learned and updated during training through back-propagation learning.

Fig. 2. Multi-Layer Perceptron network

At each node in the hidden layer, the sum of the products of each input node and the corresponding weight is calculated as the input to a transfer function (activation function). A transfer function is used to simulate the excitation (or inhibition) of the electrical impulses. Table II lists some common transfer functions used in MLP (the plot of each function is omitted here).

TABLE II. TRANSFER FUNCTIONS FOR MLP

Transfer Functions | Plots
Step function      | (plot omitted)
Linear function    | (plot omitted)
Sigmoid function   | (plot omitted)
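To make the computation at a hidden node concrete, the following minimal sketch (illustrative only; NumPy is assumed, and the input values and weights are arbitrary) evaluates the weighted sum of four inputs under each transfer function listed in Table II.

```python
import numpy as np

def step(z):
    """Step transfer function: 1 if the weighted sum is non-negative, else 0."""
    return np.where(z >= 0, 1.0, 0.0)

def linear(z):
    """Linear (identity) transfer function."""
    return z

def sigmoid(z):
    """Sigmoid transfer function, squashing the weighted sum into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative values only: four input features feeding one hidden node.
x = np.array([0.2, -0.5, 0.1, 0.7])   # input-layer activations
w = np.array([0.4, 0.3, -0.6, 0.8])   # weights on the connecting lines
b = 0.1                               # bias term

z = np.dot(w, x) + b                  # sum of products at the hidden node
print(step(z), linear(z), sigmoid(z)) # node output under each transfer function
```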
II. METHODOLOGY

A. Input data processing

Fig. 3. Flow chart of the methodology

1) Signal acquisition: The subjects were 9 children of both genders, aged between 4 and 6 years old. Informed consent was obtained from the parents prior to the experiment.

During the experiment, each subject was instructed to watch 4 sets of 10 face images extracted from the Radboud Faces Database (RaFD) [34], corresponding to the basic emotions of happy, fear, sad and calm. Each set was displayed and observed for 1 minute.

Fig. 4. Stimuli presentation flow

The International 10-20 system was adopted for positioning 8 channels of EEG electrodes covering the lobes of the brain, including the frontal, temporal, parietal and occipital lobes.

2) Signal pre-processing: In the pre-processing phase, the signals were filtered to eliminate noise and irrelevant artifacts. In addition, only the activity captured after the first 5 seconds and up to the 55th second of the recorded signals was used for training. In this study, an elliptic bandpass filter was employed to extract the theta and alpha bands, which contain emotion-related components.
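As an illustration of this step (a sketch under assumed settings, not the authors' code), an elliptic bandpass filter for the theta and alpha ranges could be built with SciPy as follows; the sampling rate, filter order and ripple settings are assumptions.

```python
import numpy as np
from scipy.signal import ellip, filtfilt

FS = 256          # assumed EEG sampling rate in Hz (not stated in the paper)

def bandpass(signal, low_hz, high_hz, fs=FS, order=4, rp=0.5, rs=40):
    """Elliptic bandpass filter applied forward and backward (zero phase)."""
    nyq = fs / 2.0
    b, a = ellip(order, rp, rs, [low_hz / nyq, high_hz / nyq], btype='bandpass')
    return filtfilt(b, a, signal)

# Example: keep samples from second 5 to second 55, then isolate theta and alpha.
raw = np.random.randn(60 * FS)            # placeholder for one channel of raw EEG
segment = raw[5 * FS:55 * FS]             # discard the first 5 s, keep 50 s
theta = bandpass(segment, 4.0, 8.0)       # theta band (4-8 Hz)
alpha = bandpass(segment, 8.0, 13.0)      # alpha band (8-13 Hz)
```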
3) Feature extraction: To extract features from the filtered signals, Kernel Density Estimation (KDE) was adopted. As a result, 50 features were extracted for each instance.
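A minimal sketch of KDE-based feature extraction is given below. It assumes, since the paper does not specify the details, that the amplitude density of each filtered segment is estimated with a Gaussian kernel and sampled at 50 evenly spaced points to yield the 50 features per instance.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_features(segment, n_features=50):
    """Estimate the amplitude density of a filtered EEG segment and
    sample it at n_features evenly spaced points."""
    kde = gaussian_kde(segment)
    grid = np.linspace(segment.min(), segment.max(), n_features)
    return kde(grid)                      # 50-dimensional feature vector

# Example usage with a band-limited segment from the previous sketch:
# features = kde_features(theta)
```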
B. Training

In this work, MLP was employed to learn and construct the classifiers. The training was done in two phases, based on the two emotional theories:

1) Based on the categorical approach: For the categorical approach, each training instance was labeled with a set of four dependent features representing the four basic emotions. The label for fear is denoted by the set {0, 0, 0, 1}; for happy, {0, 0, 1, 0}; for sad, {0, 1, 0, 0}; and for calm, {1, 0, 0, 0}.

2) Based on the dimensional approach: For the dimensional approach, the classes were labeled based on the generalized values of valence and arousal. The values were assigned based on the quadrant in which each emotion is located in the affective space model, as shown in Fig. 5.

Fig. 5. Generalized affective space model

The happy emotional state is indicated by positive arousal intensity and a positive level of valence. The fear emotion corresponds to negative valence and positive arousal. The sad emotion is located in the quadrant of negative valence and negative arousal. The quadrant of positive valence and negative arousal represents the calm emotional state.
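The two labeling schemes can be written down compactly as follows (a sketch, not the authors' code; the +1/-1 encoding of the quadrants is an assumption, as the paper only states that generalized valence and arousal values were assigned per quadrant).

```python
# One-hot labels for the categorical approach, as listed in the text.
CATEGORICAL_LABELS = {
    'calm':  [1, 0, 0, 0],
    'sad':   [0, 1, 0, 0],
    'happy': [0, 0, 1, 0],
    'fear':  [0, 0, 0, 1],
}

# Generalized (valence, arousal) labels for the dimensional approach,
# one quadrant of the affective space model per emotion (Fig. 5).
DIMENSIONAL_LABELS = {
    'happy': (+1, +1),   # positive valence, positive arousal
    'fear':  (-1, +1),   # negative valence, positive arousal
    'sad':   (-1, -1),   # negative valence, negative arousal
    'calm':  (+1, -1),   # positive valence, negative arousal
}
```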
classifiers was executed to classify each emotional state of the
seconds until the 55th second in the recorded signals were
corresponding subjects.
used for the training.
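The parameters in Table III suggest a MATLAB-style feed-forward network (tan-sigmoid hidden layer, pure linear output), although the paper does not name the toolbox. The sketch below only approximates this configuration with scikit-learn's MLPRegressor, whose output layer is already linear; the optimizer, iteration limit and placeholder data are assumptions, and its stopping tolerance is not an exact substitute for the mean-square error goal of 0.2.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 50))            # placeholder 50-dim KDE features
Y = np.eye(4)[rng.integers(0, 4, size=200)]   # placeholder one-hot labels

model = MLPRegressor(
    hidden_layer_sizes=(35,),   # one hidden layer with 35 nodes (Table III)
    activation='tanh',          # tan-sigmoid hidden activation
    solver='adam',              # assumed optimizer; the paper uses back-propagation
    max_iter=2000,
)
model.fit(X, Y)                 # output layer of MLPRegressor is linear (identity)
```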
C. Testing

As a result, 16 classifiers (8 from the categorical approach and 8 from the dimensional approach), one per EEG electrode, were constructed for each subject. A memory test and a blind test were performed for all of the classifiers.

1) Memory test: For the memory test, each of the 8 classifiers was executed to classify each emotional state of the corresponding subject.

2) Blind test: For the blind test, the classifiers were executed to classify the emotional states of the other subjects.
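In other words, the memory test is a within-subject evaluation and the blind test a cross-subject one. The following sketch outlines that protocol under assumed data structures (the dictionaries, the accuracy helper and the one-hot labels are illustrative, not from the paper).

```python
import numpy as np

def accuracy(clf, X, y):
    """Fraction of instances whose predicted class (argmax over the four
    outputs) matches the labeled class (one-hot labels assumed)."""
    pred = np.argmax(clf.predict(X), axis=1)
    true = np.argmax(y, axis=1)
    return float(np.mean(pred == true))

def evaluate(classifiers, data):
    """classifiers[s][ch]: trained model for subject s and channel ch;
    data[s][ch]: (X, y) features and labels for that subject and channel."""
    memory, blind = [], []
    for s, per_channel in classifiers.items():
        for ch, clf in per_channel.items():
            X, y = data[s][ch]
            memory.append(accuracy(clf, X, y))        # memory test: same subject
            for other in data:                        # blind test: other subjects
                if other != s:
                    Xo, yo = data[other][ch]
                    blind.append(accuracy(clf, Xo, yo))
    return memory, blind
```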
III. RESULTS

The averages of the classification results are summarized in the chart depicted in Fig. 6. The horizontal axis corresponds to the EEG channels, while the vertical axis denotes the classification accuracy, ranging from 0 to 1. Due to the wide gap between accuracy rates, the range from 0.3 to 0.85 was discarded.

A. Categorical approach

The bar graphs correspond to the outputs produced by the classifiers from the categorical approach. Darker colors were used for the memory test bars and lighter colors for the blind test bars. Bar columns of the memory tests and the blind tests are placed side by side in accordance with the corresponding emotional states. For the classifier of channel C3, the average accuracy over 9 subjects on the fear emotional state through the memory test approaches 0.15. However, when performing the blind test, the average accuracy lies between 0.05 and 0.1.

Next, as indicated by the dark blue bar column, the average accuracy of channel classifier C3 on the happy emotion using the memory test is slightly higher than the average accuracy of the blind test on the fear emotion. The performance of the classifier dropped to below 5% accuracy when applied in the blind test. About the same performances are observed when classifying the sad emotional state with the same classifier for the memory test and blind test, respectively. The accuracy of this classifier in categorizing the calm emotion is significantly higher than the others, at almost 20%. The same pattern also applies to the other channel classifiers in the plot.

The highest average accuracy rate through the memory test is 18.67%, recorded by channel classifier C3 on the calm emotion. Channel classifier C3 also produced the highest classification rate, 0.09, on the same emotional state through the blind test. On average, channel classifier C3 categorized all four emotions at a 0.12 accuracy rate. The highest classification average through the memory test was achieved by channel classifier T8, with an accuracy of 14.74%.

On the other hand, using the blind test method, channel classifier C3 produced the highest classification rate of 6.04%. However, this rate is only slightly higher than the classification performance of channel classifier T8, at 6.02%.

In general, the accuracies produced by each channel classifier in classifying the different emotional states through the memory tests outperformed the outcomes derived through the blind tests.

B. Dimensional approach

For the dimensional approach, the outputs are denoted by line plots. The solid lines represent the average classification accuracy rates through the memory tests, while the dotted lines display the accuracies derived from the blind tests.

Through the memory tests, all channel classifiers constructed based on the dimensional approach produced classification accuracies higher than 90%. The performances are consistent across the different emotional states, with an average accuracy of 94.54%.

On the other hand, through the blind tests, the average accuracy of each classifier on different subjects ranges between 20% and 25% across the different emotional states. The average classification accuracy over the different channel classifiers is 21.82%, which is consistent with the performance of the individual classifiers.

Fig. 6. Average classification accuracies of different classifiers from the categorical and dimensional approaches
Fig. 7. Emotional dynamics based on valence and arousal

C. Dynamics of emotions

As indicated, the classifiers constructed with the dimensional approach produced accuracy rates above chance level. The highest accuracy for classifying the fear emotional state was produced by channel classifier P3, at 95.43% discriminative power. In addition, the happy emotion was recognized best by channel classifier T8, at a classification rate of 0.95. At a performance accuracy of 95.3%, channel classifier P4 was the best at classifying the sad emotion. For the calm emotional state, F3 was recorded to produce the highest accuracy, at 95.59%.

Based on the highest classification accuracy rates achieved by the different classifiers, the dynamics of each emotional state were calculated from the valence and arousal values produced by each classifier over the entire time frame of the emotional experience, using the EEG signals extracted from the 5th to the 55th second.

The dynamics of the different emotional states are depicted in Fig. 7. In each scatter plot, the corresponding emotional state is highlighted. For the fear emotion, the targeted emotion covers a long total time period, with positive arousal intensity and a negative level of valence. In the happy emotion dynamics plot, almost the entire period indicates positive arousal and valence. In the sad situation, the experience alternates with the happy and calm emotional states. For the calm emotion, very few occurrences of negative arousal and positive valence were observed.
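As a rough illustration of how such a trajectory could be reproduced, the sketch below assumes that the best channel classifier emits a (valence, arousal) pair for each feature window of the 50-second experience; the windowing and plotting details are assumptions, since the paper does not specify them.

```python
import numpy as np
import matplotlib.pyplot as plt

def emotion_dynamics(model, windows):
    """Predict a (valence, arousal) pair for each feature window of the
    50-second emotional experience and return the trajectory."""
    return np.asarray([model.predict(w.reshape(1, -1))[0] for w in windows])

# windows: iterable of 50-dimensional KDE feature vectors over time (assumed layout)
# trajectory = emotion_dynamics(best_classifier, windows)
# plt.scatter(trajectory[:, 0], trajectory[:, 1])   # valence on x, arousal on y
# plt.axhline(0); plt.axvline(0)                    # quadrant boundaries of the ASM
# plt.xlabel('valence'); plt.ylabel('arousal'); plt.show()
```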
IV. CONCLUSION

It was observed that each of the EEG channel classifiers learned with the dimensional approach performs better than the ones constructed with the categorical approach. Since each channel classifier represents a different EEG electrode at a different location on the scalp, the findings strengthen the hypothesis that all emotions are governed by a set of similar neural systems. Furthermore, the quantification scale adopted from the dimensional approach appears well suited to discriminating different emotional states.

However, the discriminative power of the classifiers only holds for the same subjects from whom the EEG signals were captured; the classifiers do not work well on other subjects. Therefore, for future work, the affective classifiers should be designed based on a profiling model of different subjects.

In addition, it was also observed that when a stimulus was displayed to a subject, the targeted emotional state was not reached abruptly. This indicates that the precursor of an emotional onset may be investigated from EEG signals. The capability of understanding the dynamics of emotions is useful for researchers and practitioners in the psychiatric field, in which the predictive performance can be improved and the understanding of the causes of emotion onset may be enhanced.

ACKNOWLEDGMENT

We would like to thank Bjorn Cruts from Biometrisch Centrum (http://www.biometrischcentrum.nl/) for sponsoring the EEG device and for technical support for the project.
REFERENCES

[1] D. Derks, A. H. Fischer, and A. E. R. Bos, "The role of emotion in computer-mediated communication: A review," Comput. Hum. Behav., vol. 24, no. 3, pp. 766-785, 2008.
[2] N. N. Oosterhof and A. Todorov, "Shared perceptual basis of emotional expressions and trustworthiness impressions from faces," Emotion, vol. 9, no. 1, pp. 128-133, 2009.
[3] M. Zeelenberg, R. M. A. Nelissen, S. M. Breugelmans, and R. Pieters, "On emotion specificity in decision making: Why feeling is for doing," Judgm. Decis. Mak., vol. 3, no. 1, pp. 18-27, 2008.
[4] S. D'Mello and A. Graesser, "Dynamics of affective states during complex learning," Learn. Instr., vol. 22, no. 2, pp. 145-157, Apr. 2012.
[5] E. A. Phelps, "Human emotion and memory: interactions of the amygdala and hippocampal complex," Curr. Opin. Neurobiol., vol. 14, no. 2, pp. 198-202, Apr. 2004.
[6] R. M. Nesse, "Natural selection and the regulation of defenses," Evol. Hum. Behav., vol. 26, no. 1, pp. 88-105, 2005.
[7] S. Vanheule, M. Desmet, R. Meganck, and S. Bogaerts, "Alexithymia and interpersonal problems," J. Clin. Psychol., vol. 63, no. 1, pp. 109-117, Jan. 2007.
[8] I. B. Mauss and M. D. Robinson, "Measures of emotion: A review," Cogn. Emot., vol. 23, no. 2, pp. 209-237, 2009.
[9] C. T. Yuen, W. S. San, T. C. Seong, and M. Rizon, "Classification of human emotions from EEG signals using statistical features and neural network," Int. J. Integr. Eng., vol. 1, no. 3, Jun. 2011.
[10] Q. Zhang and M. Lee, "Analyzing the dynamics of emotional scene sequence using recurrent neuro-fuzzy network," in Neural Information Processing, vol. 7064, B.-L. Lu, L. Zhang, and J. Kwok, Eds. Springer Berlin / Heidelberg, 2011, pp. 340-347.
[11] Heraz, R. Razaki, and C. Frasson, "Using machine learning to predict learner emotional state from brainwaves," in Proc. Seventh IEEE Int. Conf. on Advanced Learning Technologies (ICALT 2007), 2007, pp. 853-857.
[12] P. Ekman, "Are there basic emotions?," 1992.
[13] R. Plutchik, "The nature of emotions," Am. Sci., vol. 89, no. 4, p. 344, 2001.
[14] J. Panksepp, Affective Neuroscience: The Foundations of Human and Animal Emotions. Oxford University Press, 2004.
[15] S. S. Tomkins, Affect Imagery Consciousness: The Complete Edition. Springer Publishing Company, 2008.
[16] A. Mehrabian and J. A. Russell, An Approach to Environmental Psychology. Cambridge, MA, US: The MIT Press, 1974.
[17] J. A. Russell, "A circumplex model of affect," J. Pers. Soc. Psychol., vol. 39, no. 6, pp. 1161-1178, 1980.
[18] S. Baillet, J. C. Mosher, and R. M. Leahy, "Electromagnetic brain mapping," IEEE Signal Process. Mag., vol. 18, no. 6, pp. 14-30, Nov. 2001.
[19] B. J. Casey and M. De Haan, "Introduction: new methods in developmental science," Dev. Sci., vol. 5, no. 3, pp. 265-267, 2002.
[20] S. Sanei and J. A. Chambers, EEG Signal Processing, 1st ed. Wiley-Interscience, 2007.
[21] K. Šušmáková, "Human sleep and sleep EEG," Meas. Sci. Rev., vol. 4, no. 2, pp. 59-74, 2004.
[22] M. H. Silber, S. Ancoli-Israel, M. H. Bonnet, S. Chokroverty, M. M. Grigg-Damberger, M. Hirshkowitz, S. Kapen, S. A. Keenan, M. H. Kryger, T. Penzel, M. R. Pressman, and C. Iber, "The visual scoring of sleep in adults," J. Clin. Sleep Med., vol. 3, no. 2, pp. 121-131, Mar. 2007.
[23] B. T. Jap, S. Lal, P. Fischer, and E. Bekiaris, "Using EEG spectral components to assess algorithms for detecting fatigue," Expert Syst. Appl., vol. 36, no. 2, Part 1, pp. 2352-2359, Mar. 2009.
[24] P. Sauseng, B. Griesmayr, R. Freunberger, and W. Klimesch, "Control mechanisms in working memory: a possible function of EEG theta oscillations," Neurosci. Biobehav. Rev., vol. 34, no. 7, pp. 1015-1022, Jun. 2010.
[25] G. Vecchiato, F. Babiloni, L. Astolfi, J. Toppi, P. Cherubino, J. Dai, W. Kong, and D. Wei, "Enhance of theta EEG spectral activity related to the memorization of commercial advertisings in Chinese and Italian subjects," in Biomedical Engineering and Informatics (BMEI), 2011 4th International Conference on, 2011, vol. 3, pp. 1491-1494.
[26] M. Y. V. Bekkedal, J. Rossi III, and J. Panksepp, "Human brain EEG indices of emotions: Delineating responses to affective vocalizations by measuring frontal theta event-related synchronization," Neurosci. Biobehav. Rev., vol. 35, no. 9, pp. 1959-1970, Oct. 2011.
[27] P. Neo and N. McNaughton, "Frontal theta power linked to neuroticism and avoidance," Cogn. Affect. Behav. Neurosci., vol. 11, no. 3, pp. 396-403, 2011.
[28] B. Zoefel, R. J. Huster, and C. S. Herrmann, "Neurofeedback training of the upper alpha frequency band in EEG improves cognitive performance," NeuroImage, vol. 54, no. 2, pp. 1427-1431, 2011.
[29] A. Fink, B. Graif, and A. C. Neubauer, "Brain correlates underlying creative thinking: EEG alpha activity in professional vs. novice dancers," NeuroImage, vol. 46, no. 3, pp. 854-862, 2009.
[30] A. H. Kemp, K. Griffiths, K. L. Felmingham, S. A. Shankman, W. Drinkenburg, M. Arns, C. R. Clark, and R. A. Bryant, "Disorder specificity despite comorbidity: Resting EEG alpha asymmetry in major depressive disorder and post-traumatic stress disorder," Biol. Psychol., vol. 85, no. 2, pp. 350-354, Oct. 2010.
[31] M. Murugappan, R. Nagarajan, and S. Yaacob, "Appraising human emotions using time frequency analysis based EEG alpha band features," in Innovative Technologies in Intelligent Systems and Industrial Applications (CITISIA 2009), 2009, pp. 70-75.
[32] J. Kamiński, A. Brzezicka, M. Gola, and A. Wróbel, "Beta band oscillations engagement in human alertness process," Int. J. Psychophysiol., vol. 85, no. 1, pp. 125-128, Jul. 2012.
[33] B. Penolazzi, C. Spironelli, C. Vio, and A. Angrilli, "Brain plasticity in developmental dyslexia after phonological treatment: A beta EEG band study," Behav. Brain Res., vol. 209, no. 1, pp. 179-182, May 2010.
[34] O. Langner, R. Dotsch, G. Bijlstra, D. H. J. Wigboldus, S. T. Hawk, and A. van Knippenberg, "Presentation and validation of the Radboud Faces Database," Cogn. Emot., vol. 24, no. 8, pp. 1377-1388, 2010.
