TikTok for Good: Creating a Diverse Emotion Expression Database

Saimourya Surabhi, Bhavik Shah, Peter Washington, Onur Cezmi Mutlu, Emilie Leblanc, Prathamesh Mohite, Arman Husic, Aaron Kline, Kaitlyn Dunlap, Maya McNealis, Bennett Liu, Nick Deveaux, Essam Sleiman, Dennis P. Wall; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2022, pp. 2496-2506

Abstract


Facial expression recognition (FER) is a critical computer vision task for a variety of applications. Despite the widespread use of FER, there is a dearth of racially diverse facial emotion datasets that are enriched for children, teens, and adults. To bridge this gap, we have built a diverse expression recognition database using publicly available videos from TikTok, a video-focused social networking service. We describe the construction of the TikTok FER database. The dataset is extracted from 6428 videos scraped from TikTok, featuring 9392 distinct individuals and labels for 15 emotion-related prompts. We achieved an F1 score of 0.78 on Ekman emotion classification using transfer learning. We hope that the scale and diversity of the TikTokFER dataset will be of use to affective computing practitioners.
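The abstract reports Ekman-emotion classification via transfer learning. As a rough illustration of what such a pipeline can look like, below is a minimal sketch that fine-tunes an ImageNet-pretrained PyTorch ResNet-50 with a new classification head for seven emotion classes; the backbone, label set, and hyperparameters are assumptions for illustration, not the authors' reported configuration.

```python
# Minimal transfer-learning sketch for expression classification.
# Backbone, label set, and hyperparameters are assumptions, not the paper's setup.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 7  # assumed Ekman label set: anger, disgust, fear, happiness, sadness, surprise, neutral

# Load an ImageNet-pretrained backbone and replace the classification head.
model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
for param in model.parameters():
    param.requires_grad = False                                # freeze the feature extractor
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)        # new head stays trainable

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One fine-tuning step on a batch of cropped face images."""
    model.train()
    optimizer.zero_grad()
    logits = model(images)
    loss = criterion(logits, labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

In practice, the frozen-backbone stage would typically be followed by unfreezing some or all layers at a lower learning rate, with per-class or macro-averaged F1 used for evaluation.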

Related Material


[bibtex]
@InProceedings{Surabhi_2022_CVPR,
  author    = {Surabhi, Saimourya and Shah, Bhavik and Washington, Peter and Mutlu, Onur Cezmi and Leblanc, Emilie and Mohite, Prathamesh and Husic, Arman and Kline, Aaron and Dunlap, Kaitlyn and McNealis, Maya and Liu, Bennett and Deveaux, Nick and Sleiman, Essam and Wall, Dennis P.},
  title     = {TikTok for Good: Creating a Diverse Emotion Expression Database},
  booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
  month     = {June},
  year      = {2022},
  pages     = {2496-2506}
}