Compound Expression Recognition In-the-Wild With AU-Assisted Meta Multi-Task Learning
Facial expression recognition (FER) has received wide attention as an essential part of affective computing. Given the ambiguity and variety of human emotions, increasing attention has been paid to compound expression recognition. Since facial expressions are produced by the contraction of facial muscle groups, action unit (AU) analysis plays a vital role in FER. However, AU analysis of compound expressions has so far been conducted only under laboratory conditions, as real-world databases with manually annotated compound expressions and AUs have been lacking. We construct a real-world affective faces database of compound emotions (RAF-CE), annotated with both compound expression labels and AU labels. Our AU analysis of compound facial expressions on RAF-CE reveals that AU patterns and AU frequencies differ between lab-controlled and real-world compound expressions. Based on this analysis, we propose a meta-based multi-task learning (MML) framework for compound FER, with AU recognition serving as an auxiliary task. To fully exploit the prior AU-emotion constraints observed in RAF-CE, an alignment loss is introduced to explicitly match the distributions of AU and FE predictions with each other. Furthermore, we adopt meta-learning to adaptively adjust the task weights and strengthen the positive effect of the auxiliary task. The method can thus learn refined expression representations latent in the facial topology. Experiments demonstrate the effectiveness of the proposed method.
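To make the alignment idea concrete, the following is a minimal illustrative sketch, not the paper's actual loss: it assumes a hypothetical prior matrix of per-emotion AU occurrence probabilities (the `prior` values below are invented), derives an emotion distribution from the AU predictions through that prior, and penalizes divergence from the FE prediction with a symmetric KL term.

```python
import numpy as np

def softmax(z):
    """Numerically stable softmax over a logit vector."""
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical AU-emotion prior: rows = compound emotions, cols = AUs;
# entry [i, j] = assumed probability that AU j is active for emotion i.
# These numbers are illustrative only, not taken from RAF-CE.
prior = np.array([
    [0.9, 0.1, 0.7],
    [0.2, 0.8, 0.4],
    [0.6, 0.6, 0.1],
])

def alignment_loss(fe_logits, au_probs, prior, eps=1e-8):
    """Symmetric-KL alignment between the FE prediction and the emotion
    distribution implied by the AU predictions via the prior matrix."""
    p_fe = softmax(fe_logits)
    # Score each emotion by how well the predicted AUs match its prior
    # AU pattern, then normalize into a distribution.
    scores = prior @ au_probs
    p_au = scores / scores.sum()
    kl = lambda p, q: float(np.sum(p * np.log((p + eps) / (q + eps))))
    return 0.5 * (kl(p_fe, p_au) + kl(p_au, p_fe))
```

When the FE head and the AU head agree (e.g., FE logits favoring an emotion whose prior AU pattern matches the predicted AUs), the loss is small; contradictory predictions from the two heads yield a larger penalty, which is the constraint the alignment term is meant to enforce.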