Neural Substitution for Branch-level Network Re-parameterization

Seungmin Oh, Jongbin Ryu; Proceedings of the Asian Conference on Computer Vision (ACCV), 2024, pp. 959-975

Abstract


We propose the neural substitution method for network re-parameterization at the level of branch connectivity. The method learns diverse network topologies to maximize the ensemble effect, since re-parameterization allows multiple individually trained layers to be integrated into one at inference time. We also introduce a guiding method that folds non-linear activation functions into a linear transformation during re-parameterization. Because branch-level connectivity involves multiple non-linear activation functions, our guided activation method must merge them into a single activation during re-parameterization. Incorporating the non-linear activation function is significant because it overcomes a key limitation of existing re-parameterization methods, which operate only at block-level connectivity. Restricting re-parameterization to block-level connectivity constrains the usable network topologies, making it difficult to learn diverse feature representations. In contrast, the proposed approach, with its unrestricted branch-level topology, learns a considerably richer representation than existing methods and provides a generalized framework that can be combined with other approaches. We provide comprehensive experimental evidence for the proposed re-parameterization approach. Our code is available at https://github.com/SoongE/neural_substitution.
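The core premise of re-parameterization, as the abstract describes it, is that multiple layers trained as separate branches can be integrated into a single layer for inference. A minimal sketch of this idea for purely linear branches is shown below; this is an illustrative example of generic structural re-parameterization, not the authors' neural substitution or guided activation method, and all variable names are our own.

```python
import numpy as np

# Illustrative sketch of re-parameterization: two parallel linear branches,
# trained independently, are fused into one layer at inference time. This
# works because the sum of linear maps is itself a linear map.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))  # a batch of 4 inputs with 8 features

# Weights and biases of two independently trained branches (hypothetical).
W1, b1 = rng.standard_normal((8, 8)), rng.standard_normal(8)
W2, b2 = rng.standard_normal((8, 8)), rng.standard_normal(8)

# Multi-branch forward pass, as used during training.
y_branches = (x @ W1 + b1) + (x @ W2 + b2)

# Re-parameterized single layer, as used during inference.
W_fused, b_fused = W1 + W2, b1 + b2
y_fused = x @ W_fused + b_fused

# The fused layer reproduces the multi-branch output exactly.
assert np.allclose(y_branches, y_fused)
```

The paper's contribution goes beyond this linear case: with branch-level connectivity, each branch carries its own non-linear activation, and the proposed guided activation method is what makes fusing those non-linearities into a single activation possible.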

Related Material


[bibtex]
@InProceedings{Oh_2024_ACCV,
    author    = {Oh, Seungmin and Ryu, Jongbin},
    title     = {Neural Substitution for Branch-level Network Re-parameterization},
    booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
    month     = {December},
    year      = {2024},
    pages     = {959-975}
}