Diffusion-Based Particle-DETR for BEV Perception

Asen Nachkov, Danda Pani Paudel, Martin Danelljan, Luc Van Gool; Proceedings of the Winter Conference on Applications of Computer Vision (WACV), 2025, pp. 2725-2735

Abstract


The Bird's-Eye-View (BEV) is one of the most widely used scene representations for visual perception in Autonomous Vehicles (AVs) due to its compatibility with downstream tasks. For the enhanced safety of AVs, modeling perception uncertainty in BEV is crucial. Recent diffusion-based methods offer a promising approach to uncertainty modeling for visual perception, but fail to effectively detect small objects across the large coverage of the BEV. Such degradation in performance can be attributed primarily to the specific network architectures and the matching strategy used during training. Here, we address this problem by combining the diffusion paradigm with current state-of-the-art 3D object detectors in BEV. We analyze the unique challenges of this approach, which do not exist with deterministic detectors, and present a simple technique based on object query interpolation that allows the model to learn positional dependencies even in the presence of diffusion noise. Building on this, we present a diffusion-based DETR model for object detection that bears similarities to particle methods. Extensive experiments on the NuScenes dataset show equal or better performance for our generative approach compared to deterministic state-of-the-art methods. The source code is available at https://github.com/insait-institute/ParticleDETR.
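The core generative idea, treating detection queries as noisy particles that are iteratively denoised toward object locations in the BEV plane, can be sketched roughly as follows. This is a minimal illustration only: the denoiser here is a toy stand-in (a pull toward fixed ground-truth centers), not the paper's trained network, and all function names, the linear schedule, and the parameters are illustrative assumptions.

```python
import numpy as np

def toy_denoiser(boxes, targets):
    # Stand-in for a learned denoising network: for each noisy particle,
    # predict the nearest ground-truth center as its "clean" position.
    # boxes: (N, 2) particle positions in BEV; targets: (M, 2) object centers.
    dists = np.linalg.norm(boxes[:, None, :] - targets[None, :, :], axis=-1)
    return targets[dists.argmin(axis=1)]

def particle_denoise(targets, num_particles=64, steps=10, seed=0):
    # Start particles from pure Gaussian noise in BEV coordinates (metres),
    # then iteratively interpolate them toward the denoiser's predictions.
    rng = np.random.default_rng(seed)
    boxes = rng.normal(scale=50.0, size=(num_particles, 2))
    for t in range(steps):
        x0_hat = toy_denoiser(boxes, targets)
        alpha = (t + 1) / steps  # simple linear schedule, illustrative only
        boxes = (1 - alpha) * boxes + alpha * x0_hat
    return boxes

targets = np.array([[10.0, 5.0], [-20.0, 30.0]])
final = particle_denoise(targets)  # particles cluster around the two centers
```

With a learned denoiser in place of the toy one, the same loop lets the number of particles (queries) vary freely at inference time, which is the practical appeal of the particle-style formulation.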

Related Material


@InProceedings{Nachkov_2025_WACV,
  author    = {Nachkov, Asen and Paudel, Danda Pani and Danelljan, Martin and Van Gool, Luc},
  title     = {Diffusion-Based Particle-DETR for BEV Perception},
  booktitle = {Proceedings of the Winter Conference on Applications of Computer Vision (WACV)},
  month     = {February},
  year      = {2025},
  pages     = {2725-2735}
}