Uncertainty-Aware Prototypical Networks with Monte Carlo Dropout for Few-Shot Image Classification

Authors

  • Syeda Roohi Fatema, Department of Information Science and Engineering, Ramaiah Institute of Technology, Bengaluru, India
  • Sumana Maradithaya, Department of Information Science and Engineering, Ramaiah Institute of Technology, Bengaluru, India
Volume: 15 | Issue: 6 | Pages: 29860-29865 | December 2025 | https://doi.org/10.48084/etasr.14448

Abstract

Meta-learning is a transformative approach for enabling efficient task adaptation under severely constrained data regimes. Its applications span few-shot classification, reinforcement learning, and domain generalization. Deep learning architectures require large datasets to achieve robust generalization, whereas meta-learning equips models with the capacity to adapt competently in data-scarce and heterogeneous learning scenarios. Despite this considerable potential, current meta-learning methods suffer from a critical epistemic limitation: their systematic inability to provide uncertainty estimates. This deficiency yields overconfident predictions that hinder the deployment of such frameworks in real-world settings. To address these limitations, this study presents an architectural framework that integrates prototypical networks with Monte Carlo (MC) dropout. Prototypical networks, known for their efficiency in few-shot learning, are among the most prominent meta-learning algorithms. The proposed method computes a class prototype in a latent embedding space for each class to enable robust task generalization, while MC dropout provides a principled probabilistic mechanism for uncertainty quantification through stochastic forward passes. The proposed algorithm improves the model's generalization accuracy and prediction reliability using uncertainty estimation. Experiments on image classification tasks demonstrate that dropout regularization improves performance and reduces overfitting in metric-based meta-learning algorithms. This paradigm advances meta-learning by jointly addressing adaptation and uncertainty quantification, with implications for critical real-world applications.

Keywords:

meta learning, few-shot learning, Monte Carlo dropout, uncertainty quantification, image classification, artificial intelligence



How to Cite

[1]
S. R. Fatema and S. Maradithaya, “Uncertainty-Aware Prototypical Networks with Monte Carlo Dropout for Few-Shot Image Classification”, Eng. Technol. Appl. Sci. Res., vol. 15, no. 6, pp. 29860–29865, Dec. 2025.
