Metric Compatible Training for Online Backfilling in Large-Scale Retrieval
Abstract
Backfilling is the process of re-extracting all gallery embeddings with an upgraded model in an image retrieval system. It inevitably incurs a prohibitively large computational cost and may even entail service downtime. Although backward-compatible learning sidesteps this challenge by tackling query-side representations, it leads to suboptimal solutions in principle because the gallery embeddings cannot benefit from model upgrades. We address this dilemma by introducing an online backfilling algorithm, which enables progressive performance improvement during the backfilling process without sacrificing the full performance of the new model once backfilling is complete. To this end, we first show that a simple distance rank merge is a reasonable option for online backfilling. We then incorporate a reverse transformation module for more effective and efficient merging, which is further enhanced by adopting metric-compatible contrastive learning. These two components make the distances of the old and new models compatible, yielding desirable merge results during backfilling with no extra computational overhead. Extensive experiments demonstrate the benefit of our framework on four standard benchmarks in various settings.
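As a rough illustration of the distance rank merge mentioned above, the sketch below merges two candidate lists during partial backfilling: items whose new embeddings already exist are scored with the new model, the remaining items with the old model, and the two lists are interleaved by per-list rank. This is a minimal toy sketch under assumed inputs (`query_old`, `query_new`, `gallery_old`, `gallery_new`, a boolean `backfilled` mask), not the authors' exact procedure, and it omits the reverse transformation module and metric-compatible training that the paper uses to make the two metrics comparable.

```python
import numpy as np

def rank_merge(query_old, query_new, gallery_old, gallery_new, backfilled, k=10):
    """Toy distance rank merge during partial backfilling (illustrative only).

    Items already backfilled are ranked with the new model, the rest with the
    old model, and the two ranked lists are interleaved by rank position so
    that raw distances from incompatible metrics are never compared directly.
    """
    # Old-model distances for items not yet backfilled.
    old_ids = np.where(~backfilled)[0]
    old_d = np.linalg.norm(gallery_old[old_ids] - query_old, axis=1)

    # New-model distances for already-backfilled items.
    new_ids = np.where(backfilled)[0]
    new_d = np.linalg.norm(gallery_new[new_ids] - query_new, axis=1)

    # Rank each list independently.
    old_ranked = old_ids[np.argsort(old_d)]
    new_ranked = new_ids[np.argsort(new_d)]

    # Interleave candidates by rank, then append whichever list is longer.
    merged = []
    for n, o in zip(new_ranked, old_ranked):
        merged.extend([n, o])
    merged.extend(new_ranked[len(old_ranked):])
    merged.extend(old_ranked[len(new_ranked):])
    return merged[:k]
```

In this toy setting the system degrades gracefully: with an empty `backfilled` mask it reduces to old-model retrieval, and once backfilling finishes it reduces to new-model retrieval, which mirrors the progressive improvement the abstract describes.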
Related Material

@InProceedings{Seo_2025_WACV,
  author    = {Seo, Seonguk and Uzunbas, Mustafa Gokhan and Han, Bohyung and Cao, Sara and Lim, Ser-Nam},
  title     = {Metric Compatible Training for Online Backfilling in Large-Scale Retrieval},
  booktitle = {Proceedings of the Winter Conference on Applications of Computer Vision (WACV)},
  month     = {February},
  year      = {2025},
  pages     = {1537-1545}
}