Abstract
Over the last decade, recognizing a person's gender from a facial image has played an important role in research. Automatic gender recognition is essential for many fields, such as forensic science and automated payment systems. However, it is a difficult task owing to highly variable factors such as illumination, expression, pose, age, scale, camera quality and occlusion. Humans can easily distinguish between genders, but it remains a challenging task for computers. Many experimental approaches have been reported in the existing literature as machine vision has advanced, yet no definitively optimal solution has been found. For practical use, this work proposes a novel, complete approach to gender classification based mainly on image intensity variation, shape and texture features. These multi-attribute features are combined at different spatial scales or levels. The proposed system uses two datasets: the Facial Expression Set (FEI) dataset and a self-built dataset with various facial expressions. In this research, eight local directional pattern algorithms are used to extract facial edge features, the local binary pattern is used to extract texture features, and intensity serves as an additional feature. Finally, spatial histograms computed from these features are concatenated to build a gender descriptor. The proposed descriptor efficiently extracts discriminating information at three different levels: regional, global and directional. After the gender descriptor is extracted, a linear kernel-based support vector machine, found to be superior to other classifiers, classifies the face image as either male or female. The experimental results show that the classification accuracy obtained by mixing the outcomes of multi-scale, multi-block, distinct and prime feature classification is better than that obtained from a single-scaled image.
It is worth mentioning that the proposed approach, implemented in MATLAB, achieves an accuracy of 99% on the FEI face dataset (200 faces) and 94% on the self-built dataset (200 faces).
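The pipeline described above, per-block descriptor histograms concatenated into one feature vector and fed to a linear SVM, can be sketched as follows. This is only an illustrative approximation, not the authors' MATLAB implementation: scikit-image's uniform LBP stands in for the paper's LBP/LDP/intensity descriptors, random arrays stand in for face images, and the 4x4 block grid, `P=8`, `R=1.0` and labels are assumed values for the sketch.

```python
# Minimal sketch of the block-histogram + linear-SVM idea (assumptions noted above).
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import LinearSVC

def block_lbp_histogram(image, blocks=4, P=8, R=1.0):
    """Concatenate per-block uniform-LBP histograms into one spatial descriptor."""
    lbp = local_binary_pattern(image, P, R, method="uniform")
    n_bins = P + 2  # uniform LBP yields codes 0..P+1
    h, w = lbp.shape
    feats = []
    for i in range(blocks):
        for j in range(blocks):
            patch = lbp[i * h // blocks:(i + 1) * h // blocks,
                        j * w // blocks:(j + 1) * w // blocks]
            hist, _ = np.histogram(patch, bins=n_bins, range=(0, n_bins))
            feats.append(hist / max(hist.sum(), 1))  # normalize each block histogram
    return np.concatenate(feats)

rng = np.random.default_rng(0)
# Synthetic stand-ins for 64x64 grayscale face crops; labels 0 = male, 1 = female.
X = np.stack([block_lbp_histogram(rng.random((64, 64))) for _ in range(40)])
y = np.array([0, 1] * 20)

# Linear SVM on the concatenated spatial histograms, as in the paper's final stage.
clf = LinearSVC(C=1.0).fit(X, y)
predictions = clf.predict(X)
```

Each of the paper's three descriptor families (directional, texture, intensity) would contribute its own concatenated histogram in the same way before the final fusion.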









Change history
16 October 2024
This article has been retracted. Please see the Retraction Notice for more detail: https://doi.org/10.1007/s00500-024-10186-3
Ethics declarations
Conflict of interest
The authors of the paper declare that they have no conflict of interest.
Ethical approval
No humans or animals were involved in this research work, and the authors confirm that only their own data were used.
Additional information
Communicated by P. Pandian.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Geetha, A., Sundaram, M. & Vijayakumari, B. RETRACTED ARTICLE: Gender classification from face images by mixing the classifier outcome of prime, distinct descriptors. Soft Comput 23, 2525–2535 (2019). https://doi.org/10.1007/s00500-018-03679-5