Maitreyee Wairagkar
Ravi Vaidyanathan
This dataset consists of EEG recorded during visual human-robot interaction from 10 healthy participants, collected to investigate the emotive response in EEG to different robot facial expressions. Participants observed four emotional facial expressions (angry, happy, sad, and surprised), plus a neutral expression, displayed by the social robot Miko on its digital screen. EEG was recorded from 16 unipolar channels at frontal, central, temporal, parietal, and occipital locations. During each trial, an emotion stimulus was displayed for approximately 4 s, followed by a 4 s break during which the Miko robot displayed a neutral expression and blinked regularly. Emotions were displayed in random order. A total of 240 EEG trials were recorded from each participant, with 60 trials per emotion. The dataset provides raw, minimally filtered EEG and cleaned EEG with artefacts removed using ICA, both sampled at 128 Hz, together with the corresponding stimulus onset markers. Please refer to the README file for further details and example code.
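As one illustration of the trial structure described above (16 channels, 128 Hz sampling, roughly 4 s stimulus epochs cut at stimulus onset markers), the sketch below epochs a continuous recording into fixed-length trials. It is a minimal example on synthetic data only: the actual file names, array layout, and marker format of this dataset are given in the README, so the shapes and the 8 s onset spacing assumed here are illustrative, not prescriptive.

```python
import numpy as np

FS = 128          # sampling frequency of the dataset (Hz)
EPOCH_SEC = 4     # approximate stimulus duration (s)
N_CHANNELS = 16   # unipolar EEG channels

def epoch_eeg(eeg, onsets, fs=FS, epoch_sec=EPOCH_SEC):
    """Cut continuous EEG (channels x samples) into fixed-length trials,
    each starting at a stimulus onset sample index."""
    n = int(fs * epoch_sec)
    trials = [eeg[:, s:s + n] for s in onsets if s + n <= eeg.shape[1]]
    return np.stack(trials)  # shape: (n_trials, n_channels, n_samples)

# Synthetic stand-in for one participant's recording (real files differ).
rng = np.random.default_rng(0)
eeg = rng.standard_normal((N_CHANNELS, FS * 60))          # 60 s of fake data
# Hypothetical onsets every 8 s (4 s stimulus + 4 s neutral break).
onsets = np.arange(0, FS * 60 - FS * EPOCH_SEC, FS * 8)
trials = epoch_eeg(eeg, onsets)
print(trials.shape)  # (7, 16, 512): 7 trials of 16 channels x 4 s at 128 Hz
```

For real analyses, a dedicated EEG library such as MNE-Python offers equivalent epoching with channel metadata and event handling built in.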
Please cite the original publication:
M. Wairagkar et al., "Emotive Response to a Hybrid-Face Robot and Translation to Consumer Social Robots," IEEE Internet of Things Journal, DOI: 10.1109/JIOT.2021.3097592.
Preprint:
M. Wairagkar et al., "Emotive Response to a Hybrid-Face Robot and Translation to Consumer Social Robots," arXiv:2012.04511