Pruning rPPG Networks: Toward Small Dense Network With Limited Number of Training Samples

Changchen Zhao, Pengcheng Cao, Shoushuai Xu, Zhengguo Li, Yuanjing Feng; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2022, pp. 2055-2064

Abstract


Neural network pruning reduces network complexity and storage by removing unimportant connections, enabling network miniaturization, fast training and inference, easy deployment to portable devices, etc. The emerging lottery ticket hypothesis and sparse initialization techniques have shed new light on pruning research. However, little research has focused on pruning networks for remote photoplethysmography (rPPG) pulse signal extraction. In contrast to existing pruning research, which targets large networks, rPPG networks are relatively small, and it is interesting to see how they behave when pruning is applied. In this paper, we investigate the behavior of common pruning techniques when applied to an existing rPPG network. Experiments on the PURE dataset show that pruning rate decay is beneficial to performance, whereas connection regeneration has a detrimental effect. Given the same final sparsity, dense initialization generally performs better than sparse initialization. The network seems insensitive to the initial sparsity. The combination s_i=1.0, s_f=0.1, with decay and without regeneration, offers the best trade-off between SNR and FLOPs, achieving an average SNR of 9.78 dB, an increase of 0.48 dB over the original PhysNet.
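
To make the pruning knobs discussed above concrete, the following is a minimal PyTorch sketch of iterative magnitude pruning with a decaying pruning rate and optional connection regeneration. It is not the authors' implementation: the cubic decay schedule, the 1% regeneration fraction, and the reading of s_i/s_f as kept-weight densities (consistent with s_i=1.0 denoting dense initialization) are assumptions made for illustration.

import torch
import torch.nn as nn

def magnitude_mask(weight: torch.Tensor, sparsity: float) -> torch.Tensor:
    """Return a 0/1 mask keeping the largest-magnitude (1 - sparsity) fraction."""
    k = int(weight.numel() * (1.0 - sparsity))
    if k <= 0:
        return torch.zeros_like(weight)
    # Threshold at the k-th largest absolute value.
    threshold = weight.abs().flatten().kthvalue(weight.numel() - k + 1).values
    return (weight.abs() >= threshold).float()

def decayed_sparsity(step: int, total_steps: int, s_i: float, s_f: float) -> float:
    """Sparsity schedule decaying the pruning rate over training.
    s_i and s_f are interpreted as densities (fraction of weights kept),
    e.g. s_i=1.0, s_f=0.1 as in the paper's best setting; the cubic form
    is a common choice (Zhu & Gupta, 2017) and an assumption here."""
    t = min(step / total_steps, 1.0)
    density = s_f + (s_i - s_f) * (1.0 - t) ** 3
    return 1.0 - density  # convert kept fraction to pruned fraction

def prune_step(model: nn.Module, step: int, total_steps: int,
               s_i: float = 1.0, s_f: float = 0.1, regenerate: bool = False):
    """Apply magnitude pruning at the current scheduled sparsity.
    In practice the masks would be stored and reapplied after every
    optimizer update to keep pruned weights at zero."""
    sparsity = decayed_sparsity(step, total_steps, s_i, s_f)
    for module in model.modules():
        # PhysNet-style rPPG networks are built mainly from 3D convolutions.
        if isinstance(module, (nn.Conv3d, nn.Linear)):
            mask = magnitude_mask(module.weight.data, sparsity)
            if regenerate:
                # Randomly regrow a small fraction of pruned connections
                # (RigL-style regeneration; detrimental per the paper's results).
                pruned = (mask == 0).float()
                regrow = (torch.rand_like(mask) < 0.01).float() * pruned
                mask = torch.clamp(mask + regrow, max=1.0)
            module.weight.data.mul_(mask)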

Related Material


[pdf]
[bibtex]
@InProceedings{Zhao_2022_CVPR,
    author    = {Zhao, Changchen and Cao, Pengcheng and Xu, Shoushuai and Li, Zhengguo and Feng, Yuanjing},
    title     = {Pruning rPPG Networks: Toward Small Dense Network With Limited Number of Training Samples},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2022},
    pages     = {2055-2064}
}