Continuous Estimation of Emotional Change Using Multimodal Affective Responses

Kenta Masui, Takumi Nagasawa, Hirokazu Doi, Norimichi Tsumura; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2020, pp. 290-291

Abstract


Emotions have a significant effect on our daily behavior, influencing perception, memory, and decision making. For this reason, interest in human-computer interfaces that take the user's emotions into account has recently increased. This is important for future interface applications, which are expected to operate in harmony with humans. In this paper, we present our approach to instantaneously detecting the emotions of video viewers from remote measurements made with an RGB camera. Facial expressions and physiological responses such as heart rate and pupil diameter were measured by analyzing facial video. We also acquired electroencephalogram signals with a contact-type electroencephalograph to verify the effectiveness of the contactless measurements. By combining the measured responses into multimodal features and applying machine learning, we showed that emotion estimation results improve over those obtained from single-modality features.
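
The fusion step described in the abstract can be pictured as concatenating per-time-window features from each modality (facial expression, remotely measured heart rate, pupil diameter) and feeding the combined vector to a model that outputs a continuous emotion score. The sketch below illustrates this idea only; the feature dimensions, the per-modality feature names, and the choice of a support-vector regressor (scikit-learn) are illustrative assumptions, not the authors' exact pipeline.

# Minimal sketch of multimodal feature fusion for continuous emotion estimation.
# All feature layouts and the regressor choice are assumptions for illustration.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def fuse_features(expression_feats, heart_rate_feats, pupil_feats):
    """Concatenate per-window features from each modality into one vector."""
    return np.concatenate([expression_feats, heart_rate_feats, pupil_feats], axis=-1)

# Toy data: 200 time windows with hypothetical per-modality feature dimensions.
rng = np.random.default_rng(0)
expression = rng.normal(size=(200, 17))   # e.g. facial expression descriptors
heart_rate = rng.normal(size=(200, 3))    # e.g. mean heart rate and variability statistics
pupil = rng.normal(size=(200, 2))         # e.g. mean and variance of pupil diameter
valence = rng.normal(size=200)            # continuous emotion label per window (toy)

X = fuse_features(expression, heart_rate, pupil)
model = make_pipeline(StandardScaler(), SVR(kernel="rbf"))
model.fit(X, valence)
print(model.predict(X[:5]))               # continuous emotion estimates for the first windows

A single-modality baseline corresponds to training the same model on only one of the three feature blocks, which is the comparison the abstract reports the multimodal features outperforming.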

Related Material


[bibtex]
@InProceedings{Masui_2020_CVPR_Workshops,
    author    = {Masui, Kenta and Nagasawa, Takumi and Doi, Hirokazu and Tsumura, Norimichi},
    title     = {Continuous Estimation of Emotional Change Using Multimodal Affective Responses},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2020}
}