Automatic Recognition of Food Ingestion Environment from the AIM-2 Wearable Sensor

Yuning Huang, M A Hassan, Jiangpeng He, J. Higgins, Megan Mccrory, Heather Eicher-Miller, J. Graham Thomas, Edward Sazonov, Fengqing Zhu; Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, 2024, pp. 3685-3694

Abstract


Detecting the ingestion environment is an important aspect of monitoring dietary intake, as it provides insightful information for dietary assessment. However, it is a challenging problem: human-based review can be tedious, and algorithm-based review suffers from data imbalance and perceptual aliasing. To address these issues, we propose a neural network-based method with a two-stage training framework that tactfully combines fine-tuning and transfer learning techniques. Our method is evaluated on a newly collected dataset, the "UA Free Living Study," which uses the egocentric wearable AIM-2 camera sensor to simulate food consumption in free-living conditions. The proposed training framework is applied to common neural network backbones combined with approaches from the general imbalanced-classification literature. Experimental results on the collected dataset show that our proposed method for automatic ingestion environment recognition successfully addresses the challenging data imbalance problem in the dataset and achieves a promising overall classification accuracy of 96.63%.
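
The abstract does not spell out the training recipe, but a minimal sketch of a two-stage pipeline that combines transfer learning and fine-tuning, of the kind described above, might look as follows. The backbone (ImageNet-pretrained ResNet-50), the stage split (head-only training followed by end-to-end fine-tuning), and the inverse-frequency class-weighted loss used to counter imbalance are all illustrative assumptions, not the authors' exact configuration.

    # Hypothetical sketch: two-stage transfer learning + fine-tuning for
    # imbalanced scene classification. Backbone, stage split, and weighted
    # loss are assumptions for illustration, not the paper's exact method.
    import torch
    import torch.nn as nn
    from torchvision import models

    def build_model(num_classes: int) -> nn.Module:
        # Start from an ImageNet-pretrained backbone (transfer learning).
        model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2)
        model.fc = nn.Linear(model.fc.in_features, num_classes)
        return model

    def class_weights(counts: torch.Tensor) -> torch.Tensor:
        # Inverse-frequency weights to counteract class imbalance.
        return counts.sum() / (len(counts) * counts.float())

    def train_stage(model, loader, criterion, optimizer, epochs, device):
        model.train()
        for _ in range(epochs):
            for images, labels in loader:
                images, labels = images.to(device), labels.to(device)
                optimizer.zero_grad()
                loss = criterion(model(images), labels)
                loss.backward()
                optimizer.step()

    def two_stage_training(model, loader, counts, device,
                           head_epochs=5, ft_epochs=20):
        model.to(device)
        criterion = nn.CrossEntropyLoss(weight=class_weights(counts).to(device))

        # Stage 1: freeze the pretrained backbone, train only the new head.
        for p in model.parameters():
            p.requires_grad = False
        for p in model.fc.parameters():
            p.requires_grad = True
        opt1 = torch.optim.AdamW(model.fc.parameters(), lr=1e-3)
        train_stage(model, loader, criterion, opt1, head_epochs, device)

        # Stage 2: unfreeze everything and fine-tune end-to-end
        # at a lower learning rate.
        for p in model.parameters():
            p.requires_grad = True
        opt2 = torch.optim.AdamW(model.parameters(), lr=1e-5)
        train_stage(model, loader, criterion, opt2, ft_epochs, device)
        return model

The head-only first stage keeps the pretrained features intact while the randomly initialized classifier stabilizes; the low-learning-rate second stage then adapts the whole network to the egocentric imagery without catastrophically overwriting the pretrained representation.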

Related Material


BibTeX:

@InProceedings{Huang_2024_CVPR,
    author    = {Huang, Yuning and A Hassan, M and He, Jiangpeng and Higgins, J. and Mccrory, Megan and Eicher-Miller, Heather and Thomas, J. Graham and Sazonov, Edward and Zhu, Fengqing},
    title     = {Automatic Recognition of Food Ingestion Environment from the AIM-2 Wearable Sensor},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops},
    month     = {June},
    year      = {2024},
    pages     = {3685-3694}
}