Virtual Touch: Computer Vision Augmented Touch-Free Scene Exploration for the Blind or Visually Impaired

Xixuan Julie Liu, Yi Fang; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops, 2021, pp. 1708-1717

Abstract


Blind or Visually Impaired (BVI) individuals use haptics much more frequently than the healthy-sighted in their everyday lives to locate objects and acquire object details. This consequently puts them at higher risk of contracting a virus through close contact during a pandemic crisis (e.g., COVID-19). Traditional canes give the BVI only a limited perceptive range. Our project develops a wearable solution named Virtual Touch to augment the BVI's perceptive power so they can perceive objects near and far in their surrounding environment in a touch-free manner and consequently carry out activities of daily living during pandemics more intuitively, safely, and independently. The Virtual Touch feature comprises a camera with a novel point-based neural network, TouchNet, tailored for real-time blind-centered object detection, and a headphone that tells the BVI user the semantic labels. Through finger pointing, the BVI end user indicates where they are directing their attention relative to their egocentric coordinate system, on which we build attention-driven spatial intelligence.
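As a rough illustration of the attention-driven selection described above (not the paper's actual TouchNet pipeline), the sketch below assumes per-frame detections with estimated 3D centroids in the user's egocentric frame and a pointing ray recovered from the fingertip; it simply announces the label of the detection closest to that ray. All names, thresholds, and coordinates here are hypothetical.

# Hypothetical sketch: given detections with 3D centroids in the user's
# egocentric frame and a pointing ray (origin + unit direction) estimated
# from the fingertip, report the label of the detection whose centroid lies
# closest to the ray. Illustration only; the paper's TouchNet pipeline is
# not reproduced here.
from dataclasses import dataclass
import math

@dataclass
class Detection:
    label: str
    centroid: tuple  # (x, y, z) in meters, egocentric coordinates

def distance_to_ray(point, ray_origin, ray_dir):
    """Perpendicular distance from a 3D point to a ray (ray_dir must be unit length)."""
    v = [p - o for p, o in zip(point, ray_origin)]
    t = sum(vi * di for vi, di in zip(v, ray_dir))
    if t < 0:  # object lies behind the pointing hand
        return math.inf
    closest = [o + t * d for o, d in zip(ray_origin, ray_dir)]
    return math.dist(point, closest)

def attended_object(detections, ray_origin, ray_dir, max_offset=0.5):
    """Return the detection nearest to the pointing ray, or None if nothing is within max_offset meters."""
    best, best_d = None, max_offset
    for det in detections:
        d = distance_to_ray(det.centroid, ray_origin, ray_dir)
        if d < best_d:
            best, best_d = det, d
    return best

if __name__ == "__main__":
    # Toy frame: two detections; the user points slightly right and forward.
    dets = [Detection("chair", (0.3, -0.4, 1.2)), Detection("door", (-0.8, 0.1, 2.5))]
    obj = attended_object(dets, ray_origin=(0.0, -0.3, 0.0), ray_dir=(0.24, -0.06, 0.97))
    if obj is not None:
        print(f"Announce over headphone: {obj.label}")  # a real system would use text-to-speech

In a deployed system the ray would come from fingertip and wrist keypoints, and the announcement would be spoken through the headphone described in the abstract; the nearest-ray heuristic here just stands in for the attention-driven selection.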

Related Material


[pdf]
[bibtex]
@InProceedings{Liu_2021_ICCV,
    author    = {Liu, Xixuan Julie and Fang, Yi},
    title     = {Virtual Touch: Computer Vision Augmented Touch-Free Scene Exploration for the Blind or Visually Impaired},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
    month     = {October},
    year      = {2021},
    pages     = {1708-1717}
}