Controllable Guide-Space for Generalizable Face Forgery Detection

Ying Guo, Cheng Zhen, Pengfei Yan; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2023, pp. 20818-20827

Abstract


Recent studies on face forgery detection have shown satisfactory performance on the datasets used for training, but remain far from ideal on unknown domains. This has motivated many works to improve generalization; however, forgery-irrelevant information, such as image background and identity, still persists in the features of different domains and causes unexpected clustering, which limits generalization. In this paper, we propose a controllable guide-space (GS) method to enhance the discrimination of different forgery domains, so as to increase the forgery relevance of features and thereby improve generalization. The well-designed guide-space simultaneously achieves both proper separation of forgery domains and a large distance between real and forgery domains in an explicit and controllable manner. Moreover, for better discrimination, we use a decoupling module to weaken the interference of forgery-irrelevant correlations between domains. Furthermore, we adjust the decision-boundary manifold according to the degree to which features of the same domain cluster within a neighborhood. Extensive experiments in multiple in-domain and cross-domain settings confirm that our method achieves state-of-the-art generalization.
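The abstract describes the guide-space only at a high level: features of each forgery domain are guided toward well-separated targets while real and forged features are kept far apart. The snippet below is a minimal illustrative sketch of that idea, not the authors' implementation; it assumes the guide-space is realized as fixed unit-norm anchor directions (one per forgery domain plus one for real faces) and that embeddings are pulled toward their domain anchor with a cosine-similarity objective. All names (make_guide_anchors, guide_space_loss, num_domains) are hypothetical.

```python
# Hypothetical sketch (not the paper's code): pull backbone embeddings toward
# fixed unit-norm "guide" anchors so forgery domains stay separated and real
# features stay far from all forgery anchors.
import torch
import torch.nn.functional as F

def make_guide_anchors(num_domains: int, dim: int) -> torch.Tensor:
    """Fixed orthonormal unit anchors: row 0 = real, rows 1..num_domains = forgery domains."""
    anchors = torch.randn(num_domains + 1, dim)
    q, _ = torch.linalg.qr(anchors.T)          # orthonormalize the columns
    return q.T[: num_domains + 1]              # (num_domains + 1, dim), unit rows

def guide_space_loss(features: torch.Tensor,
                     domain_labels: torch.Tensor,
                     anchors: torch.Tensor,
                     temperature: float = 0.1) -> torch.Tensor:
    """Cross-entropy over cosine similarities between embeddings and guide anchors.

    features:      (B, dim) embeddings from the backbone.
    domain_labels: (B,) integers in [0, num_domains], where 0 means 'real'.
    """
    feats = F.normalize(features, dim=1)
    logits = feats @ anchors.T / temperature   # (B, num_domains + 1)
    return F.cross_entropy(logits, domain_labels)

# Usage sketch:
# anchors = make_guide_anchors(num_domains=4, dim=256)
# loss = guide_space_loss(backbone(images), domain_labels, anchors)
```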

Related Material


[bibtex]
@InProceedings{Guo_2023_ICCV,
    author    = {Guo, Ying and Zhen, Cheng and Yan, Pengfei},
    title     = {Controllable Guide-Space for Generalizable Face Forgery Detection},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2023},
    pages     = {20818-20827}
}