RGB-W: When Vision Meets Wireless

Alexandre Alahi, Albert Haque, Li Fei-Fei; Proceedings of the IEEE International Conference on Computer Vision (ICCV), 2015, pp. 3289-3297

Abstract


Inspired by the recent success of RGB-D cameras, we propose enriching RGB data with an additional "quasi-free" modality, namely the wireless signal (e.g., WiFi or Bluetooth) emitted by individuals' cell phones, referred to as RGB-W. The received signal strength acts as a rough proxy for depth and a reliable cue to identity. Although the measured signals are highly noisy (more than 2 m average localization error), we demonstrate that combining visual and wireless data significantly improves localization accuracy. We introduce a novel image-driven representation of wireless data that embeds all received signals onto a single image. We then show how this additional data can be used to (i) locate persons within a sparsity-driven framework and (ii) track individuals with a new confidence measure on the data association problem. Our solution outperforms existing localization methods by a significant margin. It can be applied to the millions of currently installed RGB cameras to better analyze human behavior and to offer the next generation of high-accuracy location-based services.
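
To give intuition for why received signal strength (RSS) is only a "rough proxy for depth," the sketch below converts an RSS reading into an approximate distance using the standard log-distance path-loss model. This is not the paper's method; the function name and the parameters (tx_power_dbm, path_loss_exponent) are illustrative assumptions, and the example simply shows how sensitive the distance estimate is to small dBm fluctuations.

def rss_to_distance(rss_dbm: float,
                    tx_power_dbm: float = -40.0,
                    path_loss_exponent: float = 2.5) -> float:
    """Rough distance (meters) from an RSS reading (dBm) via the
    log-distance path-loss model: rss = tx_power - 10 * n * log10(d).
    tx_power_dbm (expected RSS at 1 m) and path_loss_exponent (n) are
    environment-dependent; the defaults here are assumed values for
    illustration only, not parameters from the paper."""
    return 10 ** ((tx_power_dbm - rss_dbm) / (10.0 * path_loss_exponent))

if __name__ == "__main__":
    # A few dBm of noise shifts the estimate by meters, which is why RSS
    # alone yields multi-meter localization error and is best combined
    # with visual evidence.
    for rss in (-50.0, -65.0, -75.0):
        print(f"RSS {rss:5.1f} dBm -> ~{rss_to_distance(rss):.1f} m")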

Related Material


BibTeX:
@InProceedings{Alahi_2015_ICCV,
  author    = {Alahi, Alexandre and Haque, Albert and Fei-Fei, Li},
  title     = {{RGB-W}: When Vision Meets Wireless},
  booktitle = {Proceedings of the IEEE International Conference on Computer Vision (ICCV)},
  month     = {December},
  year      = {2015},
  pages     = {3289--3297}
}