Outline:
- Context
- Justification
- Objectives
- Theoretical Framework
- Methodology
- Expected Result
- Timeline
- Budget
- Bibliography
To determine population characteristics in a sample of undergraduate students
at UAM, by characterizing the spontaneous facial expressions that allow
emotional responses to be identified, in order to establish the profile of the
active participants in this initiative.
Automated analysis of non-verbal behavior (facial behavior) (Pantic, 2009; M. Valstar, 2006): spontaneous facial expressions.
Study of the geometry of the face: reference points (landmarks), skin texture, and strongly marked lines (G. Zhao, 2007).
Geometric features and appearance features are the two main approaches to analyzing facial expressions.
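The appearance-based approach cited above (G. Zhao, 2007) builds on local binary patterns (LBP), which describe skin texture by thresholding each pixel's neighborhood against its center. Below is a minimal sketch of the basic 3x3 LBP operator and its histogram descriptor; function names are ours, and the cited work actually uses the spatiotemporal LBP-TOP variant:

```python
import numpy as np

def lbp_8neighbors(img):
    """Basic 3x3 Local Binary Pattern: each interior pixel is encoded as an
    8-bit code, one bit per neighbor (1 if neighbor >= center)."""
    img = np.asarray(img, dtype=np.int32)
    # Neighbor offsets in clockwise order starting at the top-left pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    center = img[1:-1, 1:-1]
    codes = np.zeros_like(center)
    for bit, (dy, dx) in enumerate(offsets):
        neighbor = img[1 + dy:img.shape[0] - 1 + dy,
                       1 + dx:img.shape[1] - 1 + dx]
        codes |= (neighbor >= center).astype(np.int32) << bit
    return codes

def lbp_histogram(img, bins=256):
    """Normalized histogram of LBP codes: the per-region texture descriptor
    that appearance-based methods feed to a classifier."""
    codes = lbp_8neighbors(img)
    hist, _ = np.histogram(codes, bins=bins, range=(0, bins))
    return hist / hist.sum()
```

In practice the face is split into a grid of regions and the per-region histograms are concatenated into one feature vector, which is what makes the descriptor sensitive to where on the face a texture change occurs.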
Relational database:
- Personal data
- Emotion stimulation protocols
- Identified emotions
Constant monitoring
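The relational store described above links participants, the stimulation protocols applied to them, and the emotions identified per session. A minimal sketch with Python's built-in sqlite3; every table and column name here is an assumption for illustration, not the project's actual schema:

```python
import sqlite3

# Hypothetical schema: personal data, stimulation protocols, and the
# emotions identified when a protocol is applied to a participant.
schema = """
CREATE TABLE participant (
    id      INTEGER PRIMARY KEY,
    name    TEXT NOT NULL,
    program TEXT                -- undergraduate program at UAM
);
CREATE TABLE stimulation_protocol (
    id          INTEGER PRIMARY KEY,
    name        TEXT NOT NULL,
    description TEXT
);
CREATE TABLE identified_emotion (
    id             INTEGER PRIMARY KEY,
    participant_id INTEGER NOT NULL REFERENCES participant(id),
    protocol_id    INTEGER NOT NULL REFERENCES stimulation_protocol(id),
    emotion        TEXT NOT NULL,   -- e.g. 'joy', 'surprise'
    recorded_at    TEXT NOT NULL    -- ISO-8601 timestamp
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(schema)
conn.execute("INSERT INTO participant (name, program) VALUES (?, ?)",
             ("Ana", "Systems Engineering"))
conn.execute("INSERT INTO stimulation_protocol (name) VALUES (?)",
             ("visual evoked stimuli",))
conn.execute(
    "INSERT INTO identified_emotion "
    "(participant_id, protocol_id, emotion, recorded_at) "
    "VALUES (1, 1, 'joy', '2017-04-27T10:00:00')")
row = conn.execute(
    "SELECT p.name, e.emotion FROM identified_emotion e "
    "JOIN participant p ON p.id = e.participant_id").fetchone()
```

Keeping emotions in a separate table keyed by participant and protocol is what makes the "constant monitoring" queries (emotions over time, per protocol) straightforward joins.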
- Use cases
- Collaboration diagrams
- Sequence diagrams
Iteration closure every 2 weeks; periodic results
(continuous delivery)
Intervention with the target population
Expected Result
An intelligent computational system for the emotional characterization of
UAM students, through the automated identification of facial expressions
for the determination of emotions.
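The expected system reads as a three-stage pipeline: detect the face, extract features, classify the emotion. A toy skeleton of that flow, with stand-in functions (all names hypothetical; a real system would plug in an actual face detector, a feature extractor such as an LBP histogram, and a trained classifier):

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Stage:
    """One named step of the processing pipeline."""
    name: str
    fn: Callable

def run_pipeline(frame, stages: List[Stage]):
    """Pass the input through each stage in order."""
    out = frame
    for stage in stages:
        out = stage.fn(out)
    return out

# Toy stand-ins so the skeleton runs end to end:
detect_face   = Stage("detect",   lambda img: img)           # would crop the face
extract_feats = Stage("features", lambda face: [sum(face)])  # would compute a descriptor
classify      = Stage("classify", lambda v: "joy" if v[0] > 10 else "neutral")

label = run_pipeline([3, 4, 5], [detect_face, extract_feats, classify])
```

The stage abstraction is only meant to show where each of the proposal's components (detection, characterization, emotion determination) would slot in.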
Timeline activities:
- Visual evoked stimuli
- Intervention with the target population
- Emotional characterization of the population
- Validation of the computational system
- Writing of the final report
Budget

Category   Item                    Description                                   Value
Hardware   Laptop computer         Computing equipment for the documentation,    $2.700.000
                                   development, and testing of the project.
Hardware   Electroencephalograph   (cost unknown)
Bibliography
Pang, Y., Zhang, L., Li, M., Liu, Z., & Ma, W. (2004). A novel Gabor-LDA based face recognition method. Lecture Notes in Computer Science, vol. 3331, pp. 352-358.
Vukadinovic, D., & Pantic, M. (2005). Fully automatic facial feature point detection using Gabor feature based boosted classifiers. IEEE International Conference on Systems, Man and Cybernetics, Waikoloa, Hawaii, October 10-12, pp. 1692-1698.
Savran, A., et al. (2012). Combining video, audio and lexical indicators of affect in spontaneous conversation via particle filtering. Proceedings of the ACM International Conference on Multimodal Interaction. doi:10.1145/2388676.2388781
Jiang, B., Valstar, M., et al. (2011). Action unit detection using sparse appearance descriptors in space-time video volumes. London: Proceedings of the IEEE International Conference on Automatic Face and Gesture Recognition.
Mase, K. (1991). Recognition of facial expression from optical flow. IEICE Transactions, E74(10), 3474-3483.
Keren, D., Osadchy, M., & Gotsman, C. (2001). Antifaces: A novel, fast method for image detection. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 23, no. 7, pp. 747-761.
Ma, L., & Khorasani, K. (2004). Facial expression recognition using constructive feedforward neural networks. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 34, no. 3.
Al Daoud, E. (2009). Enhancement of the face recognition using a modified Fourier-Gabor filter. Int. J. Advance. Soft Comput. Appl., vol. 1, no. 2. ISSN 2074-8523. www.i-csrs.org
Bouzalmat, A., Belghini, N., Zarghili, A., Kharroubi, J., & Majda, A. (2011). Face recognition using neural network based Fourier Gabor filters & random projection. International Journal of Computer Science and Security (IJCSS), 5(3).
Kumbhar, M., Jadhav, A., & Patil, M. (2012). Facial expression recognition based on image feature. International Journal of Computer and Communication Engineering, 1(2).
Freeman, W. T., & Adelson, E. H. (1991). The design and use of steerable filters. IEEE Transactions on Pattern Analysis and Machine Intelligence, 13(9), 891-906.
Kharat, G. U., & Dudul, S. V. Emotion recognition from facial expression using neural networks. In Z. S. Hippe & J. L. Kulikowski (Eds.), Human-Computer Systems Interaction, AISC 60, pp. 207-219.
Sammler, D., et al. (2007). Music and emotion: electrophysiological correlates of the processing of pleasant and unpleasant music. Psychophysiology, vol. 44. doi:10.1111/j.1469-8986.2007.00497.x
Davidson, R. J., & Fox, N. A. (1983). Asymmetrical brain activity discriminates between positive and negative affective stimuli in human infants.
Zhao, G., & Pietikäinen, M. (2007). Dynamic texture recognition using local binary patterns with an application to facial expressions. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29. Retrieved April 27, 2017.
Bailenson, J. N., et al. (2008). Real-time classification of evoked emotions using facial feature tracking and physiological responses. Vol. 66.
Valstar, M., & Pantic, M. (2006). Spontaneous vs. posed facial behavior: automatic analysis of brow actions. Netherlands: Proceedings of the ACM International Conference on Multimodal Interfaces.
Jones, N. A., et al. (1992). Electroencephalogram asymmetry during emotionally evocative films and its relation to positive and negative affectivity. Vol. 20. Elsevier. Retrieved April 27, 2017, from https://doi.org/10.1016/0278-2626(92)90021-D
Kumar, N., et al. (2016). Bispectral analysis of EEG for emotion recognition. Vol. 84. Tezpur University, Assam, India. Retrieved from https://doi.org/10.1016/j.procs.2016.04.062
Ekman, P., & Scherer, K. (1984). Expression and the nature of emotion. Hillsdale, NJ: Lawrence Erlbaum.
Pantic, M. (2009). Machine analysis of facial behaviour: naturalistic and dynamic behaviour.
Koelstra, S., & Patras, I. (2013). Fusion of facial expressions and EEG for implicit affective tagging. London: Image and Vision Computing, 31(2).
Sebe, N., Cohen, I., Gevers, T., & Huang, T. S. (2005). Multimodal approaches for emotion recognition: a survey. Internet Imaging VI, vol. 5670. doi:10.1117/12.600746
Trainor, L. (2001). Frontal brain electrical activity (EEG) distinguishes valence and intensity of musical emotions. Canada: Cognition and Emotion. doi:10.1080/02699930126048
Zeng, Z., Pantic, M., et al. (2008). A survey of affect recognition methods: audio, visual, and spontaneous expressions. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 31. doi:10.1109/TPAMI.2008.52
Meng, H., et al. (2013). Depression recognition based on dynamic facial and vocal expression features using partial least square regression. Barcelona. doi:10.1145/2512530.2512532