Misclassifications of Contact Lens Iris PAD Algorithms: Is It Gender Bias or Environmental Conditions?

Akshay Agarwal, Nalini Ratha, Afzel Noore, Richa Singh, Mayank Vatsa; Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV), 2023, pp. 961-970

Abstract


One of the critical steps in a biometrics pipeline is the detection of presentation attacks, a class of physical adversarial threats. Several presentation attack detection (PAD) algorithms, including iris PAD algorithms, have been proposed and have shown strong performance. However, a recent study on a small-scale database highlighted that iris PAD may exhibit gender bias. In this research, we present a rigorous study of gender bias in iris presentation attack detection algorithms using a large-scale, gender-balanced database. The paper provides several interesting observations that can help in building future presentation attack detection algorithms that treat each demographic group fairly. In addition, we present a robust iris presentation attack detection algorithm built by combining gender-covariate biased classifiers. The proposed robust classifier not only reduces the accuracy gap between genders but also improves the overall performance of the PAD system.
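The abstract does not specify how the gender-covariate classifiers are combined; a common realization of such a combination is score-level fusion, sketched below. All function names, weights, and example scores here are illustrative assumptions, not the paper's actual method:

```python
# Hypothetical sketch: score-level fusion of two gender-covariate PAD classifiers.
# Each classifier outputs the probability that an iris image is a presentation attack.
import numpy as np

def fuse_scores(scores_a, scores_b, weight=0.5):
    """Weighted average of attack probabilities from two biased classifiers.

    scores_a, scores_b: per-sample scores from the two covariate-specific models.
    weight: assumed mixing weight (0.5 = plain averaging).
    """
    return weight * np.asarray(scores_a) + (1.0 - weight) * np.asarray(scores_b)

# Synthetic example scores for three probe images.
scores_model_a = np.array([0.9, 0.2, 0.7])
scores_model_b = np.array([0.8, 0.4, 0.3])

fused = fuse_scores(scores_model_a, scores_model_b)
predictions = (fused >= 0.5).astype(int)  # 1 = presentation attack, 0 = bona fide
print(fused, predictions)
```

Equal-weight averaging is the simplest choice; in practice the weight could be tuned on a validation set to balance error rates across demographic groups.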

Related Material


[bibtex]
@InProceedings{Agarwal_2023_WACV,
    author    = {Agarwal, Akshay and Ratha, Nalini and Noore, Afzel and Singh, Richa and Vatsa, Mayank},
    title     = {Misclassifications of Contact Lens Iris PAD Algorithms: Is It Gender Bias or Environmental Conditions?},
    booktitle = {Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)},
    month     = {January},
    year      = {2023},
    pages     = {961-970}
}