@InProceedings{Javidnia_2025_ICCV,
  author    = {Javidnia, Hossein},
  title     = {Task-Driven Neural Adaptive Gain Control: 16-bit-to-8-bit Thermal Tone Mapping for Superior Object Detection},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV) Workshops},
  month     = {October},
  year      = {2025},
  pages     = {2183-2191}
}
Task-Driven Neural Adaptive Gain Control: 16-bit-to-8-bit Thermal Tone Mapping for Superior Object Detection
Abstract
Thermal cameras are indispensable for autonomous driving in darkness, glare, and adverse weather, yet their 16-bit raw outputs exhibit low contrast and frame-to-frame intensity drift that hamper modern detectors. We introduce Neural Adaptive Gain Control (N-AGC), a lightweight U-Net that learns per-pixel gain and offset maps together with a global bias, converting raw 16-bit frames into detector-ready 8-bit images in a single forward pass. N-AGC is trained in two stages. (i) Tone-curve warmup: pixel-wise L1 loss plus differentiable moment matching aligns the stretched 16-bit histogram with the vendor 8-bit output (via per-frame percentile stretch), achieving a 0.92 histogram-intersection score while preserving detail. (ii) Task-driven fine-tuning: a frozen Faster R-CNN head backpropagates detection loss through N-AGC, sharpening contrast exactly where the detector benefits most. On the FLIR ADAS v1+v2 benchmark our approach lifts thermal AP50 from the official YOLOX-m baseline (75.33% person, 77.23% car) to 90.1% and 86.3%, respectively, while adding < 5ms latency on a single RTX 5000 Ada Generation. N-AGC is model-agnostic, requires no manual gain tuning, and drops seamlessly into existing perception stacks, making it a practical upgrade for night-time and all-weather autonomous navigation. Code and pretrained models are available at https://github.com/hosseinjavidnia/NAGC.
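The core mapping the abstract describes, per-pixel gain and offset maps plus a global bias applied to a percentile-stretched 16-bit frame, can be sketched in NumPy. This is a minimal illustration, not the paper's implementation: the function names and the 1st/99th percentile choice are assumptions, and in N-AGC the gain/offset maps and bias are predicted by the U-Net rather than supplied as fixed arrays.

```python
import numpy as np

def percentile_stretch(raw16, lo=1.0, hi=99.0):
    """Per-frame percentile stretch of a 16-bit frame to [0, 1].

    Hypothetical helper: the paper mentions a per-frame percentile
    stretch during tone-curve warmup; the exact percentiles here
    (1st/99th) are an assumption.
    """
    p_lo, p_hi = np.percentile(raw16, [lo, hi])
    stretched = (raw16.astype(np.float32) - p_lo) / max(p_hi - p_lo, 1e-6)
    return np.clip(stretched, 0.0, 1.0)

def apply_agc(raw16, gain, offset, bias):
    """Apply per-pixel gain/offset maps plus a global bias, then
    quantize to 8 bits. In N-AGC, gain/offset/bias would come from
    the U-Net's forward pass; here they are plain arrays/scalars."""
    x = percentile_stretch(raw16)
    y = gain * x + offset + bias  # per-pixel affine transform + global bias
    return (np.clip(y, 0.0, 1.0) * 255.0).astype(np.uint8)
```

With identity gain and zero offset/bias this reduces to a plain percentile stretch; the learned maps are what let the network boost contrast locally where the downstream detector benefits.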