Toward Affective XAI: Facial Affect Analysis for Understanding Explainable Human-AI Interactions

Luke Guerdan, Alex Raymond, Hatice Gunes; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, 2021, pp. 3796-3805

Abstract


As machine learning approaches are increasingly used to augment human decision-making, eXplainable Artificial Intelligence (XAI) research has explored methods for communicating system behavior to humans. However, these approaches often fail to account for the affective responses of humans as they interact with explanations. Facial affect analysis, which examines human facial expressions of emotion, is one promising lens for understanding how users engage with explanations. Therefore, in this work, we aim to (1) identify which facial affect features are pronounced when people interact with XAI interfaces, and (2) develop a multitask feature embedding for linking facial affect signals with participants' use of explanations. Our analyses and results show that the occurrence and intensity of facial action units AU1 (inner brow raiser) and AU4 (brow lowerer), as well as arousal, are heightened when participants fail to use explanations effectively. This suggests that facial affect analysis should be incorporated into XAI approaches to personalize explanations to individuals' interaction styles and to adapt explanations based on the difficulty of the task performed.
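
To make the second aim more concrete, the sketch below shows one hypothetical form a multitask feature embedding of this kind could take; it is not the authors' implementation. It assumes per-window facial affect features (e.g., AU occurrence/intensity values and arousal, as might be extracted with a toolkit such as OpenFace) feeding a shared encoder, with one head predicting affect targets and a second head predicting whether the participant used the explanation effectively. Feature dimensions, layer sizes, and the loss weighting are illustrative assumptions.

import torch
import torch.nn as nn

class MultitaskAffectEmbedding(nn.Module):
    """Hypothetical multitask embedding: shared encoder + two task heads."""

    def __init__(self, in_dim=18, emb_dim=32):
        super().__init__()
        # Shared encoder mapping facial affect features to a joint embedding.
        self.encoder = nn.Sequential(
            nn.Linear(in_dim, 64), nn.ReLU(),
            nn.Linear(64, emb_dim), nn.ReLU(),
        )
        # Head 1: regress affect targets (e.g., AU intensities, arousal).
        self.affect_head = nn.Linear(emb_dim, in_dim)
        # Head 2: classify effective vs. ineffective explanation use.
        self.use_head = nn.Linear(emb_dim, 1)

    def forward(self, x):
        z = self.encoder(x)
        return self.affect_head(z), self.use_head(z)

def multitask_loss(affect_pred, affect_true, use_logit, use_true, alpha=0.5):
    # Weighted sum of the two task losses; alpha is an illustrative trade-off.
    reg = nn.functional.mse_loss(affect_pred, affect_true)
    clf = nn.functional.binary_cross_entropy_with_logits(
        use_logit.squeeze(-1), use_true
    )
    return alpha * reg + (1 - alpha) * clf

if __name__ == "__main__":
    # Toy usage with random data standing in for extracted facial affect features.
    model = MultitaskAffectEmbedding()
    x = torch.randn(8, 18)                      # batch of affect feature vectors
    y_use = torch.randint(0, 2, (8,)).float()   # explanation-use labels
    affect_pred, use_logit = model(x)
    loss = multitask_loss(affect_pred, x, use_logit, y_use)
    loss.backward()
    print(f"loss: {loss.item():.3f}")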

Related Material


[pdf] [arXiv]
[bibtex]
@InProceedings{Guerdan_2021_ICCV,
    author    = {Guerdan, Luke and Raymond, Alex and Gunes, Hatice},
    title     = {Toward Affective XAI: Facial Affect Analysis for Understanding Explainable Human-AI Interactions},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
    month     = {October},
    year      = {2021},
    pages     = {3796-3805}
}