Social Robots for Reinforcing Attention and Forming Emotional Knowledge of Children with Special Educational Needs

By Anna Lekova, Tanio Tanev, Violina Vassileva-Aleksandrova, Snejanka Kostova, Pancho Dachkinov, Omar Bouattane

Abstract


Emotional child-robot interaction helps to quickly capture a child's attention and to enhance information perception during learning and verbalization in children with Special Educational Needs (SEN). It can thereby improve pedagogical rehabilitation for these children and also develop their emotional knowledge and memory through play-like activities mediated by emotion-expressive social robots. The designed EEG-based portable Brain-Computer Interface (BCI) measures and extracts features from the brain's electrical activity in real time in order to analyze the correlated attentional or emotional states of a child. The output performance scores are used either by special educators to assess the emotional states and cognitive performance of a child, or as inputs for robot control in play-learning scenarios. BCI is a new technology for human-robot interaction, and it can evolve into a technology for self-regulatory training of attention and emotional skills via neurofeedback displayed on the robot. This study marks a first step towards using the advantages of educational theater in Social Robotics to facilitate the perceptual processing of a child and, at the same time, to transfer the emotional talent of an actor to an emotion-expressive robot. A low-resolution EMOTIV brain-listening headset is used to translate the head/face movements or brain activity of a child into robot commands, which are then wirelessly transmitted to robot sensors, modules and controllers. Since the attention or emotional responses of children with SEN make the robots act, these skills are naturally reinforced during play.
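As a minimal illustration of the pipeline the abstract describes (attention score from the headset in, robot command out), the Python sketch below maps a normalized attention score to a robot behavior. This is a hedged sketch only: the thresholds, the command names, and the idea of a three-level mapping are illustrative assumptions, not the authors' actual implementation or the EMOTIV API.

```python
# Hypothetical sketch: turning a normalized BCI attention score (0.0-1.0),
# such as one derived from the EMOTIV performance metrics, into a robot
# command for a play-learning scenario. Thresholds and command names are
# illustrative assumptions, not part of the original system.

def attention_to_command(score: float) -> str:
    """Map an attention score in [0, 1] to a robot behavior name."""
    if not 0.0 <= score <= 1.0:
        raise ValueError("score must be normalized to [0, 1]")
    if score >= 0.7:      # sustained attention: reward the child
        return "dance"
    elif score >= 0.4:    # moderate attention: maintain engagement
        return "wave"
    else:                 # low attention: try to re-attract focus
        return "call_child_name"
```

In a real deployment the returned behavior name would be sent wirelessly to the robot's controller (e.g., triggering a preinstalled behavior on a NAO-class robot), closing the neurofeedback loop between the child's attention and the robot's actions.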


References


J. Gross and R. Thompson, “Emotion Regulation: Conceptual Foundations”. In J. J. Gross (Ed.), Handbook of Emotion Regulation. New York, NY, US: Guilford Press, pp. 3-24, 2007.

C. Tyng et al., “The Influences of Emotion on Learning and Memory”, Frontiers in Psychology, vol. 8, pp. 1-22, 2017.

S. Yamaguchi and K. Onoda, “Interaction between emotion and attention systems”, Frontiers in Neuroscience, vol. 6, 2012.

L. Pessoa, “Cognition and emotion”. Scholarpedia, 4(1):4567, [Online], 2009. Available: http://www.scholarpedia.org/article/Cognition_and_emotion [Accessed Jan. 20, 2019].

B. Zikopoulos and H. Barbas, “Pathways for Emotions and Attention Converge on the Thalamic Reticular Nucleus in Primates”. Journal of Neuroscience, vol. 32 (15), pp. 5338-5350, 2012.

METEMSS Project, [Online], Available: http://ir.bas.bg/METEMSS/en/index.html. [Accessed Jan. 20, 2019].

P. MacIntyre and T. Gregersen, “Emotions that facilitate language learning: The positive-broadening power of the imagination”. Studies in Second Language Learning and Teaching, vol. 2, pp. 193-213, 2012.

D. Feuerriegel, O. Churches, J. Hofmann and H. Keage, “The N170 and face perception in psychiatric and neurological disorders: A systematic review”. Clinical Neurophysiology, vol. 126(6), pp. 1141–1158, 2015.

J. Walsh, S. Creighton and M. Rutherford, “Emotion Perception or Social Cognitive Complexity: What Drives Face Processing Deficits in Autism Spectrum Disorder?”. Journal of Autism and Developmental Disorders, vol. 46(2), pp. 615-623, 2016.

C. Breazeal, “Toward sociable robots”, Robotics and Autonomous Systems vol. 42, pp. 167–175, 2003.

M. Johnson-Glenberg et al., “Effects of Embodied Learning and Digital Platform on the Retention of Physics Content: Centripetal Force”, Frontiers in Psychology, vol. 7, p. 1819, 2016.

C. Breazeal, “Emotion and sociable humanoid robots”, International Journal of Human-Computer Studies, vol. 59, no. 1-2, pp. 119-155, 2003.

Z. Smyrnaiou, M. Sotiriou, E. Georgakopoulou and O. Papadopoulou, “Connecting Embodied Learning in educational practice to the realisation of science educational scenarios through performing arts”, In Proc. of Int. Conf. “Inspiring Science Education”, Athens, 2016, pp. 37-45.

H2020 Project CybSPEED, [Online], Available: https://cordis.europa.eu/project/rcn/212970_en.html

Educational Theater Tsvete, [Online], Available: http://theatretsvete.eu. [Accessed Jan. 20, 2019].

V. Vassileva-Aleksandrova, “Puppet Therapy in Education -Rehearsal for the Real Life”, October 2018, [Online], Available: http://theatretsvete.eu/?page_id=4984. [Accessed Jan. 20, 2019].

E. Jochum, J. Schultz, E. Johnson and T. D. Murphey, “Robotic puppets and the engineering of autonomous theater”. In: A. LaViers and M. Egerstedt (Eds.), Controls and Art: Inquiries at the Intersection of the Subjective and the Objective. Springer, New York, pp. 107-128, 2014.

E. Jochum, E. Vlachos, A. Christoffersen, S. Nielsen, I. Hameed and Z. Tan, “Using Theatre to Study Interaction with Care Robots”. International Journal of Social Robotics, vol. 8, pp. 457-470, 2016.

H. Triandis, “Cross-cultural perspectives on personality”. San Diego: Academic Press. pp. 439–464, 1997.

R. Ray, K. McRae, K. Ochsner and J. Gross, “Cognitive Reappraisal of Negative Affect: Converging Evidence From EMG and Self-Report”. Emotion, vol. 10, pp. 587-592, 2010.

N. Krishnan et al., “Electroencephalography (EEG) Based Control in Assistive Mobile Robots: A Review”, In Proc. IOP Conference Series: Materials Science and Engineering, vol. 121, pp. 1-11, 2017.

MindtecStore Europe [Online], Available: https://www.mindtecstore.com/BrainExpress-Products [Accessed Jan. 20, 2019].

Emotiv EEG Neuroheadset, [Online], Available: https://www.emotiv.com/

NAO humanoid robot. [Online], Available: https://www.softbankrobotics.com/emea/en/nao. [Accessed Jan. 20, 2019].

R. Smith, A. Alkozei and W. Killgore, "How do Emotions Work?". Frontiers for Young Minds, 2017. [Online serial]. Available: https://kids.frontiersin.org/article/10.3389/frym.2017.00069

F. Gall, "On the Functions of the Brain and of Each of Its Parts: With Observations on the Possibility of Determining the Instincts, Propensities, and Talents, Or the Moral and Intellectual Dispositions of Men and Animals, by the Configuration of the Brain and Head", vol. 1. Marsh, Capen & Lyon. Originally published: 1835.

M. Minsky, “The Society of Mind”. Simon and Schuster, New York, 1986.

F. Strumwasser, “The relations between neuroscience and human behavioral science”. J Exp Anal Behav, vol. 61(2), pp. 307-317, 1994.

The Royal Society, “Brain Waves Module 2: Neuroscience: implications for education and lifelong learning”, London: The Royal Society, [Online], 2011. Available: https://royalsociety.org/~/media/royal_society_content/policy/publications/2011/4294975733.pdf

C. Lim, T. Lee, C. Guan, D. Fung, Y. Zhao, S. Teng, H. Zhang and K. Krishnan, “A brain-computer interface based attention training program for treating attention deficit hyperactivity disorder”. PLoS One, vol. 7, no. 10, 2012.

D. Blandón, J. Munoz, D. Lopez, and O. Gallo, “Influence of a BCI neurofeedback videogame in children with ADHD. Quantifying the brain activity through an EEG signal processing dedicated toolbox”. In Proc. IEEE 11CCC, 2016, pp. 1-8.

M. Karimi, S. Haghshenas, and R. Rostami, “Neurofeedback and autism spectrum: A case study”, Procedia - Social and Behavioral Sciences, vol. 30, pp. 1472-1475, 2011.

M. Sterman, “Physiological origins and functional correlates of EEG rhythmic activities: implications for self-regulation”. Biofeedback Self Regul, vol. 21, pp. 3-33, 1996.

S. Butnik, “Neurofeedback in adolescents and adults with attention deficit hyperactivity disorder”. J Clin Psychology vol. 61, pp. 621–625, 2005.

N. Lofthouse, L. Arnold, S. Hersch, E. Hurt and R. Debeus, “A Review of Neurofeedback Treatment for Pediatric ADHD”. Journal of Attention Disorders, vol. 16, pp. 351-372, 2011.

H. Gevensleben, A. Rothenberger, G. Moll, and H. Heinrich, “Neurofeedback in children with ADHD: validation and challenges”. Expert Rev Neurother, vol. 12, pp. 447-460, 2012.

S. Cramer et al., “Harnessing neuroplasticity for clinical applications”. Brain, vol. 134(6), pp. 1591-1609, 2011.

S. Banks, K. Eddy, M. Angstadt, P. Nathan, and K. Phan, “Amygdala-frontal connectivity during emotion regulation”. Social Cognitive and Affective Neuroscience, vol. 2, pp. 303-312, 2007.

A. Rodríguez, B. Rey, M. Clemente, M. Wrzesien, and M. Alcañiz, “Assessing brain activations associated with emotional regulation during virtual reality mood induction procedures”. Expert Syst. Appl. vol. 42, 3, pp. 1699-1709, 2015.

Emotiv Brainware, “Understanding The Performance Metrics Detection Suite”. [Online], Available: https://emotiv.zendesk.com/hc/en-us/articles/201444095 [Accessed Jan. 20, 2019].

N. Badcock et al., “Validation of the Emotiv EPOC EEG system for research quality auditory event-related potentials in children”, PeerJ, vol. 3, e907, 2015. [Online]. Available: https://doi.org/10.7717/peerj.90

P. Dachkinov, T. Tanev, A. Lekova, D. Batbaatar, and H. Wagatsuma, “Design and Motion Capabilities of an Emotion-Expressive Robot EmoSan”, In Proc. 10th IEEE Int. Conf. on Soft Computing and Intelligent Systems and 19th Int. Symposium on Advanced Intelligent Systems (SCIS&ISIS 2018), Toyama, Japan, 2018, pp. 1332-1338.

S. Shultz, A. Klin, and W. Jones, “Inhibition of eye blinking reveals subjective perceptions of stimulus salience”, PNAS, vol. 108 (52), pp. 21270-21275, 2011.

R. Ramirez and Z. Vamvakousis, “Detecting Emotion from EEG Signals Using the Emotive Epoc Device”. In: F. M. Zanzotto, S. Tsumoto, N. Taatgen and Y. Yao (Eds.), Brain Informatics. BI 2012. Lecture Notes in Computer Science, vol. 7670. Springer, Berlin, Heidelberg, 2012.

J. Malmivuo and R. Plonsey, “Bioelectromagnetism: Principles and Applications of Bioelectric and Biomagnetic Fields”. Oxford University Press, 1995.

L. Aftanas and S. Golocheikine, “Human anterior and frontal midline theta and lower alpha reflect emotionally positive state and internalized attention: high-resolution EEG investigation of meditation”, Neuroscience Letters, vol. 310, no. 1, pp. 57-60, 2001.

A. Muthusamy et al., “A Narrative Speech, Gaze and Gesture Robot Accessing to Human Emotion and Memory by Using a Simultaneous Recording of EEG and Eye-Tracker System”, In Proc. of 12th Int. Conf. on Innovative Computing, Information and Control, Japan, 2017.

G. Kalliatakis, A. Stergiou, and N. Vidakis, “Conceiving Human Interaction by Visualising Depth Data of Head Pose Changes and Emotion Recognition via Facial Expressions”. Computers, vol. 6, 25, 2017.

MS Kinect, [Online], Available: https://developer.microsoft.com/en-us/windows/kinect. [Accessed Jan. 20, 2019].

S. O'Regan, S. Faul, and W. Marnane, “Automatic detection of EEG artefacts arising from head movements using EEG and gyroscope signals”, Medical Engineering and Physics, vol. 35 no. 7, pp. 867-874, 2013.