@InProceedings{Gorgun_2025_ICCV,
    author    = {G\"org\"un, Ada and Schiele, Bernt and Fischer, Jonas},
    title     = {VITAL: More Understandable Feature Visualization through Distribution Alignment and Relevant Information Flow},
    booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
    month     = {October},
    year      = {2025},
    pages     = {4403-4412}
}
VITAL: More Understandable Feature Visualization through Distribution Alignment and Relevant Information Flow
Abstract
Neural networks are widely adopted to solve complex and challenging tasks. Especially in high-stakes decision-making, understanding their reasoning process is crucial, yet this proves challenging for modern deep networks. Feature visualization (FV) is a powerful tool for decoding what information neurons respond to, and hence for better understanding the reasoning behind such networks. In FV, we generate human-understandable images that reflect the information detected by neurons of interest. However, current methods often yield unrecognizable visualizations that exhibit repetitive patterns and visual artifacts, making them hard for a human to interpret. To address these problems, we propose to guide FV through **statistics of real image features** combined with measures of **relevant network flow** to generate prototypical images. Our approach yields human-understandable visualizations that improve both qualitatively and quantitatively over state-of-the-art FVs across various architectures. As such, it can be used to decode **which** information the network uses, complementing mechanistic circuits that identify **where** it is encoded.
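To make the idea of feature visualization concrete, the following is a minimal toy sketch of activation maximization: starting from a random input, gradient ascent increases a chosen neuron's activation. This is an illustrative assumption of the generic FV setup only, not the paper's method; VITAL additionally aligns the optimized image's feature statistics with those of real images and weights the optimization by relevant information flow, neither of which is modeled here. The linear "neuron" and all names (`w`, `activation`, `grad`) are hypothetical.

```python
import numpy as np

# Toy activation maximization: a generic sketch of feature visualization,
# NOT the VITAL method (no distribution alignment, no relevance flow).
rng = np.random.default_rng(0)

w = rng.normal(size=16)        # weights of a single hypothetical "neuron"

def activation(x):
    """Activation of a linear toy neuron for 'image' x."""
    return float(w @ x)

def grad(x):
    """Gradient of the activation w.r.t. the input (constant for a linear neuron)."""
    return w

x = rng.normal(size=16)        # start from a random "image"
before = activation(x)
for _ in range(100):
    x += 0.1 * grad(x)         # gradient ascent on the neuron's activation
after = activation(x)
```

In real FV, the neuron is a unit deep inside a trained network and the gradient is obtained by backpropagation; the repetitive patterns and artifacts mentioned above arise because unconstrained ascent can exploit inputs far from the natural image distribution, which is what distribution-alignment regularizers aim to prevent.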
Related Material
