Analysis of Children’s Prosodic Features Using Emotion Based Utterances in Urdu Language


  • S. Khan Department of Computer & Information Systems Engineering, NED University of Engineering and Technology, Karachi, Pakistan
  • S. A. Ali Department of Computer Science & Information Technology, NED University of Engineering and Technology, Karachi, Pakistan
  • J. Sallar Department of Computer Science, Sir Syed University of Engineering and Technology, Karachi, Pakistan


Emotion plays a significant role in identifying a speaker's state from spoken utterances. Prosodic features enrich spoken utterances by conveying the speaker's emotions. The objective of this research is to analyze the behavior of prosodic features, both individually and in combination, with different learning classifiers on emotion-based utterances of children in the Urdu language. In this paper, three prosodic features (intensity, pitch, and formants) and their combinations, five learning classifiers (ANN, J-48, K-star, Naive Bayes, and decision stump), and four basic emotions (happy, sad, angry, and neutral) were used to develop the experimental framework. Experiments demonstrated that, in terms of classification accuracy, artificial neural networks yield significantly better results than the other learning classifiers, for both individual prosodic features and their combinations.
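To illustrate the kind of prosodic measurements the framework relies on, the following is a minimal NumPy sketch of two of the three features: intensity approximated by RMS energy, and pitch (F0) estimated by autocorrelation. This is an illustrative simplification, not the authors' pipeline; in practice such features are extracted with dedicated tools like Praat (cited in the references), and formant estimation (e.g. via LPC) is omitted here. The 220 Hz synthetic tone standing in for a voiced frame is an assumption for demonstration only.

```python
import numpy as np

def rms_intensity(signal):
    # Root-mean-square energy: a simple stand-in for the intensity feature.
    return float(np.sqrt(np.mean(signal ** 2)))

def autocorr_pitch(signal, sr, fmin=75.0, fmax=500.0):
    # Pitch (F0) estimate: the autocorrelation of a voiced frame peaks at
    # a lag equal to one fundamental period, so F0 = sr / best_lag.
    signal = signal - np.mean(signal)
    corr = np.correlate(signal, signal, mode="full")[len(signal) - 1:]
    lag_min, lag_max = int(sr / fmax), int(sr / fmin)
    best_lag = lag_min + int(np.argmax(corr[lag_min:lag_max]))
    return sr / best_lag

# Demo on a synthetic 220 Hz tone standing in for a voiced utterance frame.
sr = 16000
t = np.arange(int(0.5 * sr)) / sr
frame = 0.3 * np.sin(2 * np.pi * 220.0 * t)
print(autocorr_pitch(frame, sr))  # close to 220 Hz
print(rms_intensity(frame))       # ~0.212 (0.3 / sqrt(2))
```

Per-utterance statistics of such frame-level values (means, ranges, contours) are what typically feed classifiers like those compared in the paper.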


speech, emotion, recognition, learning, classifiers, prosodic, features, language, Urdu, Pakistan




S. A. Ali, A. Khan, N. Bashir, “Analyzing the Impact of Prosodic Feature (Pitch) on Learning Classifiers for Speech Emotion Corpus”, International Journal of Information Technology and Computer Science, Vol. 2, pp. 54–59, 2015

P. Ekman, “An argument for basic emotions”, Cognition & Emotion, Vol. 6, No. 3, pp. 169–200, 1992

I. Chiriacescu, Automatic Emotion Analysis Based on Speech, MSc Thesis, Delft University of Technology, 2009

M. B. Mustafa, R. N. Ainon, R. Zainuddin, Z. M. Don, G. Knowles, S. Mokhtar, “Prosodic Analysis and Modelling for Malay”, Malaysian Journal of Computer Science, Vol. 23, No. 2, pp. 102–110, 2010

J. Rong, G. Li, Y. P. P. Chen, “Acoustic feature selection for automatic emotion recognition from speech”, Information Processing & Management, Vol. 45, No. 3, pp. 315–328, 2009

J. Pribil, A. Pribilova, “Determination of formant features in Czech and Slovak for GMM emotional speech classifier”, Radioengineering, Vol. 22, No. 1, pp. 52–59, 2013

M. El Ayadi, M. S. Kamel, F. Karray, “Survey on speech emotion recognition: Features, classification schemes, and databases”, Pattern Recognition, Vol. 44, No. 3, pp. 572–587, 2011

A. Utane, S. Nalbalwar, “Emotion recognition through Speech”, 2nd National Conference on Innovative Paradigms in Engineering & Technology, International Journal of Applied Information Systems, pp. 5–8, 2013

K. S. Rao, S. G. Koolagudi, R. R. Vempada, “Emotion recognition from speech using global and local prosodic features”, International Journal of Speech Technology, Vol. 16, No. 2, pp. 143–160, 2013

P. Olivier, J. Wallace, “Digital technologies and the emotional family”, International Journal of Human-Computer Studies, Vol. 67, No. 2, pp. 204–214, 2009

P. Pattnaik, “Impact of Emotion on Prosody Analysis”, IOSR Journal of Computer Engineering, Vol. 5, No. 4, pp. 10–15, 2012

W. L. Jarrold, Towards a theory of affective mind: Computationally modeling the generativity of goal appraisal, PhD Thesis, University of Texas at Austin, 2004

S. A. Ali, M. Andleeb, N. G. Haider, D. R. Khan, “Evaluating the Performance of Learning Classifiers and Effect of Emotions and Spectral Features on Speech Utterances”, International Journal of Computer Science and Information Security, Vol. 14, No. 10, pp. 406–412, 2016

P. Boersma, D. Weenink, Praat: doing phonetics by computer, available at:


How to Cite

S. Khan, S. A. Ali, and J. Sallar, “Analysis of Children’s Prosodic Features Using Emotion Based Utterances in Urdu Language”, Eng. Technol. Appl. Sci. Res., vol. 8, no. 3, pp. 2954–2957, Jun. 2018.

