There has been a significant increase in interest in the task of classifying
demographic attributes, i.e. race and ethnicity. Ethnicity is a significant
human characteristic, and the use of facial image data for the discrimination
of ethnicity is integral to face-related biometric systems. Given the diversity
of applications for ethnicity-specific information, such as face recognition
and iris recognition, and the availability of image datasets for the more
commonly studied populations, i.e. Caucasian, African-American, Asian, and
South-Asian Indian, a gap has been identified for the development of a system
which analyses the full face and its individual feature components (eyes, nose
and mouth) for the Pakistani ethnic group. An efficient system is proposed for
the verification of Pakistani ethnicity, which incorporates a two-tier
(computer vs. human) approach.
Firstly, hand-crafted features were used to ascertain how descriptive frontal
and profile facial images are of Pakistani ethnicity. A total of 26 facial
landmarks were selected (16 frontal and 10 profile), combined with two models
for removing redundant information and a linear classifier for the binary
task. The experimental results showed that the facial profile of a Pakistani
face is distinct from that of other ethnicities.
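As a rough illustration of this first tier, the sketch below pairs a redundancy-removal step with a linear classifier over flattened landmark coordinates. The abstract does not name the specific models used, so PCA and Linear Discriminant Analysis are assumed here purely as illustrative stand-ins, and the data is a random placeholder rather than real landmark annotations.

```python
# Hypothetical sketch: PCA and LDA are assumed stand-ins for the thesis's
# unnamed redundancy-removal models and linear classifier.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# X: one row per face; 16 frontal + 10 profile landmarks, each an (x, y)
# pair, flattened into a 52-dimensional feature vector.
# y: 1 = Pakistani, 0 = Non-Pakistani.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 52))      # placeholder for real landmark data
y = rng.integers(0, 2, size=200)    # placeholder class labels

clf = make_pipeline(
    PCA(n_components=10),           # strip redundant/correlated dimensions
    LinearDiscriminantAnalysis(),   # linear classifier for the binary task
)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.3f}")
```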
However, the methodology had limitations: for example, low classification
accuracy, the laborious nature of manual facial-landmark annotation, and the
small size of the facial image dataset. To make the system more accurate and
robust, Deep Learning models
are employed for ethnicity classification. Various state-of-the-art Deep models
are trained on a range of facial image conditions, i.e. full-face and partial-face
images, plus standalone feature components such as the nose and mouth. Since
ethnicity is pertinent to the research, a novel facial image database, entitled
the Pakistani Face Database (PFDB), was created using a criterion-specific
selection process to ensure confidence in each assigned class membership, i.e.
Pakistani and Non-Pakistani. A comparative analysis of six Deep Learning models
was carried out on augmented image datasets, and the analysis demonstrates
that Deep Learning yields better classification accuracy than low-level
features.
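The general shape of such a pipeline is sketched below: fine-tuning one ImageNet-pretrained network on augmented face crops for the binary Pakistani / Non-Pakistani task. ResNet-18 is assumed here as an example, not one of the six models actually compared in the thesis, and the augmentations and hyperparameters are assumptions rather than the thesis's settings.

```python
# Illustrative sketch only: ResNet-18, these augmentations, and the learning
# rate are assumptions; the thesis's six models are not reproduced here.
import torch
import torch.nn as nn
from torchvision import models, transforms

# Augmentations of the kind used to enlarge small face datasets (assumed).
train_tf = transforms.Compose([
    transforms.RandomHorizontalFlip(),
    transforms.RandomRotation(10),
    transforms.ColorJitter(brightness=0.2, contrast=0.2),
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, 2)  # two classes
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images, labels):
    """One fine-tuning step on a batch of face crops. The same loop applies
    to each image condition (full face, eyes, nose or mouth)."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```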
The human phase of the ethnicity classification framework tested the
discrimination ability of novice Pakistani and Non-Pakistani participants
using a computerised ethnicity task. The results suggest that humans are
better at discriminating between Pakistani and Non-Pakistani full-face images
than between individual face-feature components (eyes, nose, mouth),
struggling most with the nose when making judgements of ethnicity. To
understand the effect of display conditions on ethnicity discrimination
accuracy, two conditions were tested: (i) a Two-Alternative Forced Choice
(2-AFC) procedure and (ii) a single-image procedure. The results showed that
participants perform significantly better in trials where the target
(Pakistani) image is shown alongside a distractor (Non-Pakistani) image.
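As an illustration of how such a condition comparison might be analysed, the sketch below runs a paired-samples t-test over per-participant accuracies in the two display conditions. The accuracy values are random placeholders and the choice of test is an assumption; the thesis's actual statistical analysis is not reproduced here.

```python
# Hypothetical analysis sketch: placeholder accuracies and an assumed
# paired-samples t-test, purely for illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_participants = 30

# One accuracy per participant per condition: 2-AFC (target shown alongside
# a distractor) versus single-image presentation.
acc_2afc = rng.normal(0.78, 0.06, n_participants).clip(0, 1)
acc_single = rng.normal(0.68, 0.06, n_participants).clip(0, 1)

t, p = stats.ttest_rel(acc_2afc, acc_single)
print(f"2-AFC mean={acc_2afc.mean():.3f}, single mean={acc_single.mean():.3f}")
print(f"paired t({n_participants - 1})={t:.2f}, p={p:.4f}")
```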
To conclude the proposed framework, directions for future study are suggested
to advance the current understanding of image-based ethnicity verification. / Acumé Forensic
Identifier | oai:union.ndltd.org:BRADFORD/oai:bradscholars.brad.ac.uk:10454/18757 |
Date | January 2020 |
Creators | Jilani, Shelina K. |
Contributors | Ugail, Hassan, Logan, Andrew J. |
Publisher | University of Bradford, Faculty of Engineering and Informatics. School of Media, Design and Technology |
Source Sets | Bradford Scholars |
Language | English |
Detected Language | English |
Type | Thesis, doctoral, PhD |
Rights | The University of Bradford theses are licensed under a Creative Commons Licence (CC BY-NC-ND 3.0): http://creativecommons.org/licenses/by-nc-nd/3.0/ |