PanopticRoad: Integrated Panoptic Road Segmentation Under Adversarial Conditions
Segmentation has become one of the most important methods for scene understanding, playing a central role in recognizing things and stuff in a scene. Among all things and stuff, the road is what guides vehicles on city streets and highways. Most segmentation models, namely semantic, instance, and panoptic segmentation, have focused on images captured in clear daytime weather. Few papers have tackled nighttime vision under adversarial conditions such as fog, rain, snow, strong illumination, and disaster events. Moreover, finer-grained segmentation of road conditions such as dry, wet, and snow-covered remains challenging under such low-visibility conditions. Weather affects not only visibility but also roads and their surrounding environment, creating serious hazards such as obstacles on the road, e.g., rocks and water. This paper proposes PanopticRoad, which integrates five Deep Learning-based modules for road condition segmentation under adversarial conditions: DeepReject, DeepScene, DeepSnow, DeepDepth, and DeepRoad. Integrating these modules refines locally failed road-condition predictions by applying weather-related and physical constraints. Using foggy and heavy-snowfall nighttime road images as well as disaster images, PanopticRoad is demonstrated to outperform state-of-the-art panoptic-segmentation and domain-adaptation Deep Learning models in terms of stability, robustness, and accuracy.