Self-Supervised Deep Learning for Autonomous Vehicle Perception under Adverse Weather Conditions

Authors

  • Jakob Lindberg

Keywords

Autonomous Vehicles, Self-Supervised Learning, Sensor Fusion, Adverse Weather, Perception Robustness.

Abstract

Autonomous vehicle (AV) perception relies on multi-sensor data integration from cameras, LiDAR, and radar to understand complex driving environments. However, perception accuracy declines significantly under adverse weather conditions such as fog, rain, and snow due to sensor degradation and environmental distortions. Traditional supervised deep learning approaches are limited by their dependence on large labeled datasets and poor generalization to unseen weather domains. This study proposes a self-supervised learning (SSL) framework for improving AV perception and sensor fusion performance without extensive labeled data. The framework combines contrastive learning and cross-modal reconstruction to learn weather-invariant representations, enhancing the resilience of perception systems in low-visibility scenarios. Experiments conducted on benchmark datasets (KITTI, nuScenes, and A*3D) and synthetic weather simulations demonstrate a substantial improvement in detection accuracy and robustness compared to supervised baselines. The proposed model achieved an average mAP gain of 14% and improved cross-weather generalization with minimal computational overhead. These results highlight the potential of SSL to reduce data dependency, improve safety, and enable scalable deployment of AV perception systems across diverse environmental conditions.
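The contrastive-learning component described above can be sketched as an InfoNCE-style loss over paired embeddings of the same scene under two weather augmentations (e.g., clear vs. simulated fog). This is a minimal illustrative sketch, not the authors' implementation; the function name, temperature value, and the use of NumPy arrays in place of a deep-learning framework are all assumptions.

```python
import numpy as np

def info_nce_loss(z_a, z_b, temperature=0.1):
    """Illustrative InfoNCE contrastive loss between two augmented views.

    z_a, z_b: (N, D) arrays of embeddings for the same N scenes under two
    hypothetical weather augmentations (e.g., clear vs. fog). Row i of z_a
    and row i of z_b form a positive pair; all other pairs are negatives.
    Minimizing this loss pulls positives together and pushes negatives
    apart, encouraging weather-invariant representations.
    """
    # L2-normalize so the dot product is cosine similarity.
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    logits = z_a @ z_b.T / temperature  # (N, N) similarity matrix
    # Cross-entropy with the diagonal (matching views) as the target class.
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))

# Usage sketch: embeddings of the same scenes under a mild perturbation
# should yield a lower loss than embeddings of unrelated scenes.
rng = np.random.default_rng(0)
z_clear = rng.normal(size=(8, 16))
loss_pos = info_nce_loss(z_clear, z_clear + 0.01 * rng.normal(size=(8, 16)))
loss_neg = info_nce_loss(z_clear, rng.normal(size=(8, 16)))
```

In a full system, `z_a` and `z_b` would come from modality-specific encoders (camera, LiDAR, radar), and the cross-modal reconstruction term mentioned in the abstract would be added as a separate objective.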

Published

2025-12-26
