AdaV: Autonomous Driving in Adverse Visibility

Multimodal Sensor Fusion for Reliable Perception in All Weather Conditions

The AdaV project, funded by an ANR JCJC grant, addresses a core limitation of current autonomous driving systems: their degraded performance in adverse weather conditions such as fog, rain, and snow, which impair scene visibility and the reliability of conventional sensors. AdaV proposes a multimodal perception system combining RGB cameras, LiDAR, RADAR, IMU, and polarimetric cameras. The goal is to improve the robustness and reliability of object detection by adapting the sensor fusion process to environmental conditions.


Objectives

  • Develop adaptive fusion methods for object detection in low-visibility scenes.
  • Quantify sensor reliability based on environmental conditions.
  • Propose uncertainty-aware deep learning models based on Dempster-Shafer Theory.
  • Construct and validate a new multimodal dataset with diverse conditions.
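To give an intuition for the Dempster-Shafer fusion mentioned above, here is a minimal sketch of Dempster's rule of combination for two sources. The sensor names, class labels, and mass values are illustrative assumptions, not the project's actual model:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Fuse two mass functions (dict: frozenset of hypotheses -> mass)
    with Dempster's rule of combination."""
    combined = {}
    conflict = 0.0  # total mass assigned to contradictory pairs
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        w = wa * wb
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w
        else:
            conflict += w  # intersection is empty: conflicting evidence
    if conflict >= 1.0:
        raise ValueError("Total conflict: sources are incompatible")
    # Renormalize by the non-conflicting mass (1 - K)
    return {a: w / (1.0 - conflict) for a, w in combined.items()}

# Illustrative example: camera and LiDAR report belief over two classes.
PED, VEH = frozenset({"pedestrian"}), frozenset({"vehicle"})
BOTH = PED | VEH  # mass on the full frame encodes ignorance

camera = {PED: 0.6, VEH: 0.1, BOTH: 0.3}  # e.g. fog lowers its confidence
lidar  = {PED: 0.5, VEH: 0.2, BOTH: 0.3}

fused = dempster_combine(camera, lidar)
```

The mass placed on the full frame (`BOTH`) is what lets the theory represent sensor unreliability explicitly: a degraded sensor can say "I don't know" instead of being forced to commit to a class, which is the behavior an adaptive, condition-aware fusion scheme needs.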

Impact

  • Enhanced perception for autonomous vehicles in real-world, low-visibility settings.
  • Contributions to AI-based uncertainty modeling and multimodal deep learning.
  • Public dataset and open research tools for the ADAS community.

Dataset

Papers & Contributions

Our Team

LAGHMARA Hind

Associate Professor at INSA Rouen Normandy

BOUTTEAU Rémi

Professor at University of Rouen Normandy

HONEINE Paul

Professor at University of Rouen Normandy

PLANTEROSE Elsa

Research Engineer at INSA Rouen Normandy

CANU Stéphane

Professor at INSA Rouen Normandy

AINOUZ Samia

Professor at INSA Rouen Normandy

BEN SLIMANE Eya

Research Engineer at INSA Rouen Normandy

Partners & Sponsors