Neighborhood Region Smoothing Regularization for Finding Flat Minima In Deep Neural Networks

Yang Zhao, Hao Zhang; Proceedings of the Asian Conference on Computer Vision (ACCV), 2022, pp. 4697-4710

Abstract


Due to the diverse, severely overparameterized architectures of deep neural networks (DNNs), regularization techniques are critical for finding optimal solutions in the huge hypothesis space. In this paper, we propose an effective regularization technique called Neighborhood Region Smoothing (NRS). NRS leverages the finding that models benefit from converging to flat minima, and regularizes a neighborhood region of the model in weight space to yield approximately the same outputs. Specifically, the gap between the outputs of models in the neighborhood region is gauged by a defined metric based on the Kullback-Leibler divergence, and this metric provides an interpretation of flat minima consistent with the minimum description length principle. By minimizing both this divergence and the empirical loss, NRS explicitly drives the optimizer towards flat minima while remaining compatible with other common regularization techniques. We confirm the effectiveness of NRS on image classification tasks across a wide range of model architectures and commonly used datasets such as CIFAR and ImageNet, where it universally improves generalization. We also show empirically that the minima found by NRS have smaller Hessian eigenvalues than those found by conventional training, which is regarded as evidence of flat minima.
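The training loop the abstract describes can be made concrete in a short sketch. The PyTorch code below is a minimal illustration, not the authors' released implementation: the uniform perturbation sampling, the neighborhood `radius`, the weighting coefficient `lam`, the one-directional KL term with a detached reference output, and the two-backward-pass structure are all assumptions made here for clarity.

```python
# Hypothetical NRS-style training step (illustrative sketch, not the paper's code).
import torch
import torch.nn.functional as F

def nrs_train_step(model, x, y, optimizer, radius=0.05, lam=1.0):
    optimizer.zero_grad()

    # 1) Empirical loss at the current weights w.
    logits = model(x)
    ce_loss = F.cross_entropy(logits, y)
    ce_loss.backward()  # gradients of the empirical loss at w
    log_p = F.log_softmax(logits.detach(), dim=-1)

    # 2) Sample a random point w + eps in the neighborhood region of w.
    #    (Uniform sampling and a shared radius are assumptions of this sketch.)
    eps_list = []
    with torch.no_grad():
        for p in model.parameters():
            eps = torch.empty_like(p).uniform_(-radius, radius)
            p.add_(eps)
            eps_list.append(eps)

    # 3) KL-based gap between the output distributions at w and at w + eps.
    #    Note: for models with batch norm, running statistics are updated
    #    on both forward passes.
    log_q = F.log_softmax(model(x), dim=-1)
    gap = F.kl_div(log_q, log_p, reduction="batchmean", log_target=True)
    (lam * gap).backward()  # gradients accumulate into the same p.grad buffers

    # 4) Restore the original weights w before the update.
    with torch.no_grad():
        for p, eps in zip(model.parameters(), eps_list):
            p.sub_(eps)

    optimizer.step()
    return ce_loss.item(), gap.item()
```

Each backward pass runs before the in-place weight edits that would invalidate its autograd graph, which keeps the sketch compatible with PyTorch's in-place version checks; the gradient of the gap term is therefore taken at the perturbed point and applied at the original weights, an approximation also common in sharpness-aware training loops.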

Related Material


[pdf] [arXiv] [code]
[bibtex]
@InProceedings{Zhao_2022_ACCV,
    author    = {Zhao, Yang and Zhang, Hao},
    title     = {Neighborhood Region Smoothing Regularization for Finding Flat Minima In Deep Neural Networks},
    booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
    month     = {December},
    year      = {2022},
    pages     = {4697-4710}
}