In Search of Life: Learning From Synthetic Data to Detect Vital Signs in Videos

Florin Condrea, Victor-Andrei Ivan, Marius Leordeanu; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2020, pp. 298-299

Abstract


Automatically detecting vital signs in videos, such as the estimation of heart and respiration rates, is a challenging research problem in computer vision with important applications in the medical field. One of the key difficulties in tackling this task is the lack of sufficient supervised training data, which severely limits the use of powerful deep neural networks. In this paper, we address this limitation through a novel deep learning approach, in which a recurrent deep neural network is trained to detect vital signs in the infrared thermal domain from purely synthetic data. What is most surprising is that our novel method for synthetic training data generation is general, relatively simple, and uses almost no prior medical domain knowledge. Moreover, our system, which is trained in a purely automatic manner and needs no human annotation, also learns to predict the respiration or heart intensity signal for each moment in time and to detect the region of interest that is most relevant for the given task, e.g., the nose area in the case of respiration. We demonstrate the effectiveness of our proposed system on the recent LCAS dataset, where it obtains state-of-the-art performance.
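
The core idea in the abstract can be sketched at a high level: generate synthetic clips in which one image region pulses with a known periodic signal, then train a recurrent network to regress that signal for every frame. The sketch below is not the authors' implementation; the architecture, frequency range, clip size, and all other names and hyper-parameters are illustrative assumptions, written here in PyTorch.

# Minimal sketch (not the authors' code): synthetic clip generation plus a small
# CNN+GRU that regresses the per-frame "vital sign" intensity signal.
# All module names, sizes, frequency ranges and hyper-parameters are assumptions.

import torch
import torch.nn as nn

def make_synthetic_clip(frames=64, size=32):
    # Ground-truth periodic signal with a random frequency and phase.
    t = torch.arange(frames, dtype=torch.float32)
    freq = float(torch.empty(1).uniform_(0.05, 0.5))   # cycles per frame (assumed range)
    phase = float(torch.rand(1)) * 2 * torch.pi
    signal = torch.sin(2 * torch.pi * freq * t + phase)
    # Noisy background clip; one random square region follows the signal.
    clip = torch.rand(frames, 1, size, size) * 0.2
    y, x = (int(v) for v in torch.randint(0, size - 8, (2,)))
    clip[:, 0, y:y + 8, x:x + 8] += 0.5 + 0.3 * signal[:, None, None]
    return clip, signal

class VitalSignRNN(nn.Module):
    # Tiny per-frame CNN encoder followed by a GRU that outputs one value per frame.
    def __init__(self, hidden=64):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.rnn = nn.GRU(32, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, clip):                       # clip: (B, T, 1, H, W)
        b, t = clip.shape[:2]
        feats = self.encoder(clip.flatten(0, 1)).view(b, t, -1)
        out, _ = self.rnn(feats)
        return self.head(out).squeeze(-1)          # (B, T) predicted per-frame signal

# Toy training loop on purely synthetic data -- no human annotation involved.
model = VitalSignRNN()
optim = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(100):
    clips, signals = zip(*(make_synthetic_clip() for _ in range(8)))
    clips, signals = torch.stack(clips), torch.stack(signals)
    loss = nn.functional.mse_loss(model(clips), signals)
    optim.zero_grad(); loss.backward(); optim.step()

In the paper's setting, the synthetic signal would stand in for respiration or heart intensity and the relevant image region (e.g., the nose area) would be detected by the network itself; the toy example above only illustrates the signal-regression part of that idea.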

Related Material


BibTeX
@InProceedings{Condrea_2020_CVPR_Workshops,
author = {Condrea, Florin and Ivan, Victor-Andrei and Leordeanu, Marius},
title = {In Search of Life: Learning From Synthetic Data to Detect Vital Signs in Videos},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
month = {June},
year = {2020}
}