Multi-View Body Image-Based Prediction of Body Mass Index and Various Body Part Sizes
This paper proposes a novel model for predicting body mass index (BMI) and various body part sizes from front, side, and back body images. The model is trained on a large dataset of labeled images, and the results show that it accurately predicts BMI and body part sizes including chest, waist, hip, thigh, forearm, and shoulder width. A significant advantage of the proposed model is that it exploits multiple views of the body to achieve more accurate predictions, overcoming the limitations of models that rely on a single image. The model also requires no complex pre-processing or feature extraction, making it straightforward to apply in practice. We further examine the impact of environmental factors, such as clothing and posture, on the model's performance. The findings show that the model is relatively insensitive to posture but more sensitive to clothing, underscoring the importance of controlling for clothing when using this model. Overall, the proposed model represents a step forward in predicting BMI and various body part sizes from images; its accuracy, convenience, and ability to use multiple views of the body make it a promising tool for a wide range of applications. Beyond BMI inference, the proposed method is expected to serve as an input for accurate vision-based, non-contact sensing of various biomarkers.
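The multi-view idea described above can be sketched as follows. The abstract does not specify the architecture, so everything below is an illustrative assumption: `encode_view` is a stand-in for a learned per-view feature extractor (in practice a CNN), and the fusion is simple feature concatenation followed by a linear regression head over the targets named in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 64  # assumed per-view feature dimension (not from the paper)
TARGETS = ["bmi", "chest", "waist", "hip", "thigh", "forearm", "shoulder_width"]

# Frozen random projection standing in for a trained CNN encoder.
# Input images are assumed to be (H, W, 3); pooling yields 2*3 = 6 statistics.
PROJ = rng.standard_normal((6, N_FEATURES))

def encode_view(image: np.ndarray) -> np.ndarray:
    """Stand-in feature extractor: per-channel mean/std, then a fixed projection."""
    pooled = np.concatenate([image.mean(axis=(0, 1)), image.std(axis=(0, 1))])
    return pooled @ PROJ

def predict(front: np.ndarray, side: np.ndarray, back: np.ndarray,
            weights: np.ndarray, bias: np.ndarray) -> np.ndarray:
    """Fuse the three views by concatenation, then apply a linear head."""
    fused = np.concatenate([encode_view(front), encode_view(side), encode_view(back)])
    return fused @ weights + bias  # one scalar per target

# Toy usage with random "images"; a real head would be trained on labeled data.
front, side, back = (rng.random((128, 64, 3)) for _ in range(3))
w = rng.standard_normal((3 * N_FEATURES, len(TARGETS))) * 0.01
b = np.zeros(len(TARGETS))
preds = predict(front, side, back, w, b)
print(dict(zip(TARGETS, np.round(preds, 2))))
```

Concatenation is only one plausible fusion choice; averaging view features or attention-based pooling are common alternatives in multi-view models.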