Z-Score Normalization, Hubness, and Few-Shot Learning

Nanyi Fei, Yizhao Gao, Zhiwu Lu, Tao Xiang; Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2021, pp. 142-151

Abstract


The goal of few-shot learning (FSL) is to recognize a set of novel classes with only a few labeled samples by exploiting an abundant set of base class samples. Adopting a meta-learning framework, most recent FSL methods meta-learn a deep feature embedding network and, during inference, classify novel class samples by nearest-neighbor search in the learned high-dimensional embedding space. This means that these methods are prone to the hubness problem, that is, a certain class prototype becomes the nearest neighbor of many test instances regardless of which classes they belong to. However, this problem is largely ignored in existing FSL studies. In this work, we show for the first time that many FSL methods indeed suffer from the hubness problem. To mitigate its negative effects, we further propose to employ z-score feature normalization, a simple yet effective transformation, during meta-training. A theoretical analysis is provided on why it helps. Extensive experiments then show that, with z-score normalization, the performance of many recent FSL methods can be boosted, resulting in a new state of the art on three benchmarks.
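
The abstract does not spell out exactly how the z-score normalization is applied (e.g., along which axis or at which point in the pipeline), so the following is only a minimal sketch under the assumption that each embedding vector is normalized to zero mean and unit standard deviation across its dimensions before prototype-based nearest-neighbor classification. The function names and the toy episode below are illustrative, not the authors' implementation.

import numpy as np

def z_score_normalize(features, eps=1e-8):
    # Per-sample z-score normalization: shift each feature vector to zero mean
    # and scale it to unit standard deviation across its dimensions.
    mean = features.mean(axis=-1, keepdims=True)
    std = features.std(axis=-1, keepdims=True)
    return (features - mean) / (std + eps)

def prototype_classify(query_features, support_features, support_labels, n_way):
    # Normalize both support and query embeddings, build class prototypes by
    # averaging the normalized support embeddings, then assign each query to
    # its nearest prototype under Euclidean distance.
    support_features = z_score_normalize(support_features)
    query_features = z_score_normalize(query_features)
    prototypes = np.stack([
        support_features[support_labels == c].mean(axis=0) for c in range(n_way)
    ])
    dists = np.linalg.norm(
        query_features[:, None, :] - prototypes[None, :, :], axis=-1
    )
    return dists.argmin(axis=1)

# Toy 5-way 1-shot episode with 64-dimensional embeddings (synthetic data).
rng = np.random.default_rng(0)
support = rng.normal(size=(5, 64))
labels = np.arange(5)
queries = np.repeat(support, 3, axis=0) + 0.1 * rng.normal(size=(15, 64))
print(prototype_classify(queries, support, labels, n_way=5))

In this sketch the normalization acts only as an inference-time feature transform; the paper applies it during meta-training, where the embedding network is learned with the normalization in place.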

Related Material


[pdf]
[bibtex]
@InProceedings{Fei_2021_ICCV,
  author    = {Fei, Nanyi and Gao, Yizhao and Lu, Zhiwu and Xiang, Tao},
  title     = {Z-Score Normalization, Hubness, and Few-Shot Learning},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  month     = {October},
  year      = {2021},
  pages     = {142-151}
}