@InProceedings{Tan_2024_ACCV,
  author    = {Tan, Benying and Lin, Jie and Qin, Yang and Ding, Shuxue and Li, Yujie},
  title     = {Accelerated Deep Nonlinear Dictionary Learning},
  booktitle = {Proceedings of the Asian Conference on Computer Vision (ACCV)},
  month     = {December},
  year      = {2024},
  pages     = {4439-4454}
}
Accelerated Deep Nonlinear Dictionary Learning
Abstract
Most existing dictionary learning models rely on linearly learned dictionaries, which perform poorly in nonlinear signal representation, a limitation that has driven growing interest in nonlinear dictionary learning (NLDL). In this paper, we propose a deep nonlinear dictionary learning model with full-layer sparse regularization on both the dictionaries and the coefficients. The model can capture deep latent information, and applying l_1 regularization improves the efficiency of hierarchically extracting key features. We first derive the proposed algorithm using proximal operators and then introduce Nesterov acceleration to speed up convergence, yielding a scheme we term Accelerated DNLDL-l_1. We validate the feasibility of the proposed algorithm through numerical experiments, showing that the acceleration scheme improves its performance. We also apply the algorithm to practical image classification and denoising tasks to demonstrate its generality across various nonlinear functions. Additionally, experimental results show that regularizing the dictionaries and coefficients simultaneously facilitates parameter tuning and yields superior denoising performance.
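To make the building blocks named in the abstract concrete, the sketch below applies the l_1 proximal operator (soft-thresholding) with Nesterov acceleration, FISTA-style, to a single-layer linear sparse-coding subproblem. The single-layer linear setting, the variable names, and the step-size choice are illustrative assumptions only; this is not the paper's deep nonlinear DNLDL-l_1 formulation.

```python
import numpy as np

def soft_threshold(v, tau):
    """Proximal operator of the l1 norm: shrink entries toward zero by tau."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def fista_sparse_coding(X, D, lam=0.1, n_iters=100):
    """Illustrative Nesterov-accelerated proximal gradient (FISTA) for the
    single-layer subproblem  min_A 0.5*||X - D A||_F^2 + lam*||A||_1.
    A sketch of the proximal-operator and acceleration ingredients only,
    not the paper's full deep nonlinear model."""
    A = np.zeros((D.shape[1], X.shape[1]))
    Y = A.copy()                          # extrapolated (momentum) point
    t = 1.0
    L = np.linalg.norm(D, 2) ** 2         # Lipschitz constant of the gradient
    for _ in range(n_iters):
        grad = D.T @ (D @ Y - X)          # gradient of the smooth fit term at Y
        A_next = soft_threshold(Y - grad / L, lam / L)   # proximal (l1) step
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        Y = A_next + ((t - 1.0) / t_next) * (A_next - A)  # Nesterov extrapolation
        A, t = A_next, t_next
    return A

# Example usage on synthetic data with a random unit-norm dictionary.
rng = np.random.default_rng(0)
X = rng.standard_normal((64, 32))
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0, keepdims=True)
A = fista_sparse_coding(X, D, lam=0.2)
```

The extrapolation step is what distinguishes the accelerated variant from a plain proximal update; in the paper's setting an analogous acceleration is applied across the layered dictionary and coefficient updates.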