A Comprehensive Within-Subject Analysis of EEGNet for SEED-IV Emotion Recognition: Subject Variability and Per-Class Performance
Received: 24 December 2025 | Revised: 4 February 2026 and 22 February 2026 | Accepted: 23 February 2026 | Online: 17 March 2026
Corresponding author: Sujata Kulkarni
Abstract
This paper presents a comprehensive within-subject analysis of the EEGNet model on the SEED-IV dataset for multi-class emotion recognition. The analysis accounts for individual variability across the 15 subjects through a standardized preprocessing procedure, subject-specific splitting of the data into training, validation, and test sets, and a separate training run of the model architecture for each subject. Model performance is evaluated with per-subject accuracy, precision, and recall, as well as class-specific measures that quantify how much of each emotion the model recovers on the test set. Confusion matrices are analyzed to give an accurate interpretation of the separation between emotion classes, and two-dimensional visualizations of the features learned by the model are used to assess its ability to discriminate the learned emotions on the test data. The study makes three contributions: (i) a subject clustering analysis, (ii) a per-class recall analysis, and (iii) validation that the raw EEG waveform achieves competitive performance.
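The per-subject evaluation described above (subject-specific test sets, with per-class recall read off row-normalized confusion matrices) can be sketched as follows. This is a minimal illustration, not the authors' code: the random labels stand in for each subject's held-out test labels and the trained model's predictions, and all names and shapes are assumptions. SEED-IV covers 15 subjects and four emotion classes (neutral, sad, fear, happy).

```python
import numpy as np
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
n_subjects, n_classes = 15, 4  # SEED-IV: 15 subjects, 4 emotions

per_class_recall = {}
for subject in range(n_subjects):
    # Stand-ins for this subject's held-out test labels and the
    # subject-specific model's predictions on that test set.
    y_true = rng.integers(0, n_classes, size=200)
    y_pred = rng.integers(0, n_classes, size=200)

    # Row-normalizing the confusion matrix yields per-class recall:
    # the fraction of each true emotion the model recovers.
    cm = confusion_matrix(y_true, y_pred, labels=range(n_classes))
    per_class_recall[subject] = cm.diagonal() / cm.sum(axis=1)
```

Aggregating `per_class_recall` across subjects then supports both the subject clustering analysis and the per-class recall analysis mentioned in the contributions.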
Keywords:
emotion recognition, EEGNet, SEED-IV, brain-computer interface
License
Copyright (c) 2026 Sujata Kulkarni, Prakashgoud Patil

This work is licensed under a Creative Commons Attribution 4.0 International License.
