International Journal of Advanced and Applied Sciences

EISSN: 2313-3724, Print ISSN: 2313-626X

Frequency: 12


 Volume 5, Issue 8 (August 2018), Pages: 104-112

----------------------------------------------

 Original Research Paper

 Title: Profound correlation of human and NAO-robot interaction through facial expression controlled by EEG sensor

 Author(s): Ahmad Hoirul Basori 1, *, Mohamed Abdulkareem Ahmed 2, Anton Satria Prabuwono 1, 3, Arda Yunianta 1, 4, Arif Bramantoro 1, Irfan Syamsuddin 1, 5, Khalid Hamed Allehaibi 6

 Affiliation(s):

 1Faculty of Computing and Information Technology Rabigh, King Abdulaziz University, Jeddah, Makkah, Saudi Arabia
 2Tikkurila Oyj, Vantaa, Finland
 3Master in Computer Science Program, Budi Luhur University, Jakarta 12260, Indonesia
 4Faculty of Computer Science and Information Technology, Mulawarman University, Indonesia
 5CAIR - Center for Applied ICT Research, Department of Computer and Networking Engineering, School of Electrical Engineering, Politeknik Negeri Ujung Pandang, Makassar, Indonesia
 6Faculty of Computing and Information Technology, King Abdulaziz University, Jeddah, Makkah, Saudi Arabia


 Abstract:

Emotion recognition from EEG-based brain-computer interfaces has been studied extensively over the past few years. Time-frequency analysis has been widely used in previous research; however, the appropriate brain-signal analysis depends on the case study. In this paper, human emotion is recognized from brain waves in a simple way, by calculating the frequency of signal variation. A total of 35 healthy subjects, all students aged 18-25 years, participated. The students were divided into three groups: the first group consisted of 15 students, and the second and third groups consisted of 10 students each. Each student was tested for 4 seconds to capture his or her internal emotions, and the signal speed was recorded during those 4 seconds. Based on the stimulus time, the number of knocks for Z1 and Z2 was observed during a particular period. The experiment can be reproduced in the future by following this procedure. There are two main elements for measuring signal speed: ΔT and the gap. ΔT corresponds to the time difference between changes in the time-frequency of the alpha signals. For the evaluation of this work, a benchmark database of EEG recordings labeled with emotions is available; it indicates that emotional strength can be used as a factor to differentiate between human emotions. The results of this paper can therefore be compared with previous studies that used the same device to differentiate between happy and sad emotions in terms of emotional strength. There is a strong correlation between emotional strength and frequency: we show that the sad state is faster and less steady than the happy one, since the number of ΔV counts for Z1, which represents sad emotion in the alpha signals, is greater than that for Z2, which represents happy emotion, over the same period of the interaction process.
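The abstract describes the measure only at a high level; the short Python sketch below illustrates one plausible reading of it, in which the number of "knocks" in a 4-second alpha-band trace is the count of sample-to-sample changes ΔV that exceed a threshold, compared between Z1 (sad) and Z2 (happy). The function name count_knocks, the 128 Hz sampling rate, the threshold value, and the synthetic signals are assumptions made for illustration, not details taken from the paper.

import numpy as np

def count_knocks(alpha_signal, delta_v_threshold):
    # Count sample-to-sample changes |ΔV| that exceed a threshold.
    # This is an assumed reading of the paper's "knock" count; the
    # actual criterion is not specified in the abstract.
    dv = np.abs(np.diff(alpha_signal))          # ΔV between consecutive samples
    return int(np.sum(dv > delta_v_threshold))  # number of "knocks"

# Hypothetical 4-second traces at an assumed 128 Hz sampling rate.
fs, duration = 128, 4
t = np.linspace(0, duration, fs * duration, endpoint=False)
rng = np.random.default_rng(0)
z1_sad = np.sin(2 * np.pi * 12 * t) + 0.3 * rng.standard_normal(t.size)   # faster variation
z2_happy = np.sin(2 * np.pi * 9 * t) + 0.3 * rng.standard_normal(t.size)  # slower variation

threshold = 0.4  # assumed ΔV threshold
print("knocks Z1 (sad):  ", count_knocks(z1_sad, threshold))
print("knocks Z2 (happy):", count_knocks(z2_happy, threshold))

Under this reading, a larger count for Z1 than for Z2 over the same 4-second window corresponds to the abstract's claim that the sad signal varies faster than the happy one.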

 © 2018 The Authors. Published by IASE.

 This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

 Keywords: Facial expression, Brain computer interface, Emotion

 Article History: Received 25 March 2018, Received in revised form 10 June 2018, Accepted 11 June 2018

 Digital Object Identifier: 

 https://doi.org/10.21833/ijaas.2018.08.013

 Citation:

 Basori AH, Ahmed MA, Prabuwono AS et al. (2018). Profound correlation of human and NAO-robot interaction through facial expression controlled by EEG sensor. International Journal of Advanced and Applied Sciences, 5(8): 104-112

 Permanent Link:

 http://www.science-gate.com/IJAAS/2018/V5I8/Basori.html
