EmoKey Moments Muse EEG Dataset (EKM-ED): A Comprehensive Collection of Muse S EEG Data and Key Emotional Moments

  1. Francisco M. Garcia-Moreno (1)
  2. Marta Badenes-Sastre (1)

  (1) Universidad de Granada, Granada, Spain. ROR: https://ror.org/04njjy449

Publisher: Zenodo

Publication year: 2020

Type: Dataset

CC BY 4.0

Summary

Dataset Description:

The EmoKey Moments EEG Dataset (EKM-ED) is a curated collection of EEG recordings from 47 participants, captured with the Muse S headband while they watched emotion-eliciting video clips. Covering a spectrum of target emotions, the dataset is intended for research on human cognitive responses, psychology, and emotion-based analyses.

Dataset Highlights:

- Precise timestamps: EEG samples are logged with millisecond resolution.
- Brainwave metrics: Delta, Theta, Alpha, Beta, and Gamma band values reflect a range of cognitive states.
- Motion data: the device's movement along three axes provides additional context.
- Auxiliary indicators: device positioning, battery level, and user-specific actions are also logged.

Consent and Ethics:

The dataset upholds privacy and ethical standards. Every participant provided informed consent, and the study was approved by the Ethics Committee of the University of Granada under reference 2100/CEIH/2021.

A central component of this dataset is its focus on "key moments" within the selected video clips: periods anticipated to evoke heightened emotional responses.

Curated Video Clips within Dataset:

  Film                  Emotion     Duration (seconds)
  The Lover             Baseline    43
  American History X    Anger       106
  Cry Freedom           Sadness     166
  Alive                 Happiness   310
  Scream                Fear        395

The cornerstone of EKM-ED is its emphasis on these key moments, relating distinct cinematic events to specific EEG responses.

Key Emotional Moments in Dataset:

  Film                  Emotion     Key moment timestamps (seconds)
  American History X    Anger       36, 57, 68
  Cry Freedom           Sadness     112, 132, 154
  Alive                 Happiness   227, 270, 289
  Scream                Fear        23, 42, 79, 226, 279, 299, 334

Citation: Gilman, T. L., et al. (2017). A film set for the elicitation of emotion in research. Behavior Research Methods, 49(6).

With its depth and focus, the EmoKey Moments EEG Dataset aims to advance research in fields such as neuroscience, psychology, and affective computing, providing a comprehensive platform for understanding and analyzing human emotions through EEG data.
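As an illustration of how the key-moment timestamps above can be combined with a per-clip recording, here is a minimal Python sketch. The helper name extract_key_moment_windows, the window lengths, and the assumption that each CSV holds one row per EEG sample starting at clip onset are illustrative and not part of the dataset; per the description, the raw files are recorded at 256 Hz and the preprocessed files are downsampled to 128 Hz (see the folder structure below), so set fs accordingly.

import pandas as pd

# Key-moment timestamps (seconds from clip onset), as listed in the tables above.
KEY_MOMENTS = {
    "ANGER": [36, 57, 68],
    "SADNESS": [112, 132, 154],
    "HAPPINESS": [227, 270, 289],
    "FEAR": [23, 42, 79, 226, 279, 299, 334],
}

def extract_key_moment_windows(csv_path, emotion, fs=256, pre_s=2.0, post_s=4.0):
    """Slice a fixed window around each key moment of one clip recording.

    Assumes one row per EEG sample at `fs` Hz, ordered in time and starting
    at clip onset; adjust if the CSV carries absolute timestamps instead.
    """
    eeg = pd.read_csv(csv_path)
    windows = []
    for t in KEY_MOMENTS[emotion]:
        start = max(0, int((t - pre_s) * fs))   # window start, in samples
        stop = int((t + post_s) * fs)           # window end, in samples
        windows.append(eeg.iloc[start:stop])
    return windows

# Example (hypothetical path following the raw folder layout described below):
# windows = extract_key_moment_windows(
#     "muse_wearable_data/raw/1/muse/ANGER_XXX.csv", "ANGER", fs=256)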
——————————————————————————————————— FOLDER STRUCTURE DESCRIPTION ———————————————————————————————————

- questionnaires: all the questionnaire responses (in Spanish), raw and preprocessed, including SAM (Self-Assessment Manikin)
|——preprocessed: Ficha_Evaluacion_Participante_SAM_Refactored.csv: the SAM responses for every film clip
- key_moments: the key moment timestamps for every emotion's clip
- muse_wearable_data: XXXX
|
|—raw
|——1: subject ID = 1 (one folder per participant)
|————muse: EEG data of the Muse device
|—————————ANGER_XXX.csv: EEG data of the anger elicitation
|—————————FEAR_XXX.csv: EEG data of the fear elicitation
|—————————HAPPINESS_XXX.csv: EEG data of the happiness elicitation
|—————————SADNESS_XXX.csv: EEG data of the sadness elicitation
|————order: order in which the film clips were played, e.g. HAPPINESS,SADNESS,ANGER,FEAR
…
|
|—preprocessed
|——unclean-signals: EEG artifacts, noise, etc. not removed
|————muse: EEG data of the Muse device
|—————————0.0078125: data downsampled from the recorded 256 Hz to 128 Hz (sampling period 1/128 s = 0.0078125 s)
|——clean-signals: EEG artifacts, noise, etc. removed
|————muse: EEG data of the Muse device
|—————————0.0078125: data downsampled from the recorded 256 Hz to 128 Hz (sampling period 1/128 s = 0.0078125 s)

The ethical consent for this dataset was provided by La Comisión de Ética en Investigación de la Universidad de Granada, as documented in the approval titled 'DETECCIÓN AUTOMÁTICA DE LAS EMOCIONES BÁSICAS Y SU INFLUENCIA EN LA TOMA DE DECISIONES MEDIANTE WEARABLES Y MACHINE LEARNING', registered under 2100/CEIH/2021.
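Based on the raw branch of the folder structure above, the following Python sketch walks one participant's folder, reading the clip play order and each clip's Muse recording. The format of the order entry (assumed here to be a plain-text file with one comma-separated line) and the _XXX filename suffixes are assumptions; adapt them to the files actually shipped.

from pathlib import Path

import pandas as pd

def read_clip_order(dataset_root, subject_id):
    """Return the film elicitation order, e.g. ['HAPPINESS', 'SADNESS', 'ANGER', 'FEAR']."""
    order_path = (Path(dataset_root) / "muse_wearable_data" / "raw"
                  / str(subject_id) / "order")     # assumed plain-text file
    return order_path.read_text().strip().split(",")

def load_raw_subject(dataset_root, subject_id):
    """Load all raw Muse recordings for one participant, keyed by emotion label."""
    muse_dir = (Path(dataset_root) / "muse_wearable_data" / "raw"
                / str(subject_id) / "muse")
    recordings = {}
    for csv_file in sorted(muse_dir.glob("*.csv")):
        emotion = csv_file.stem.split("_")[0]       # e.g. ANGER_XXX.csv -> ANGER
        recordings[emotion] = pd.read_csv(csv_file)
    return recordings

# Usage (hypothetical dataset root folder name):
# order = read_clip_order("EKM-ED", 1)
# data = load_raw_subject("EKM-ED", 1)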