Elgazzar, Khalid; Patel, Dipkumar
Dates: 2022-03-08; 2022-03-29; 2022-02-01
URI: https://hdl.handle.net/10155/1423

Abstract: Road detection is a core component of self-driving vehicle perception, covering the detection of road boundaries and drivable road regions. It can also help human drivers drive safely in low-visibility conditions. Most current road detection techniques rely on camera and lidar sensors, which struggle in inclement weather. Mmwave radar works well in all weather conditions; however, due to its low resolution, it is currently limited to object detection for cruise control applications. This thesis investigates the impact of bad weather on vision-based systems and introduces a camera- and radar-based method for efficient road detection. We propose a novel approach that overcomes the sparse resolution of mmwave radar and applies it to the segmentation task. We augment the nuScenes dataset with fog and rain and use the augmented data for validation, achieving 20% and 18% better road boundary and drivable region detection, respectively, in inclement weather.

Language: en
Keywords: Road detection; Autonomous vehicle; Camera radar fusion; Perception; Mmwave radar
Title: Detecting road boundaries and drivable regions in challenging weather conditions
Type: Thesis
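
Illustrative note: the abstract mentions augmenting nuScenes camera data with fog and rain but does not specify the procedure here. The sketch below shows one common way such fog augmentation is done, using the standard atmospheric scattering model I = J*t + A*(1 - t) with t = exp(-beta * depth); the function name add_synthetic_fog, the fog density beta, and the airlight value A are illustrative assumptions, not the thesis's confirmed method.

    import numpy as np

    def add_synthetic_fog(image, depth, beta=0.08, airlight=255.0):
        """Apply homogeneous synthetic fog to an RGB image.

        Uses the atmospheric scattering model I = J*t + A*(1 - t),
        where the transmission t = exp(-beta * depth).

        image: HxWx3 uint8 array (clear-weather camera frame)
        depth: HxW array of per-pixel depth in metres
        beta, airlight: illustrative fog density and airlight values
        """
        transmission = np.exp(-beta * depth)[..., None]        # t(x), shape HxWx1
        foggy = (image.astype(np.float32) * transmission
                 + airlight * (1.0 - transmission))            # scattering model
        return np.clip(foggy, 0, 255).astype(np.uint8)

    if __name__ == "__main__":
        # Toy example: a random image with a synthetic front-to-back depth ramp.
        rng = np.random.default_rng(0)
        img = rng.integers(0, 256, size=(900, 1600, 3), dtype=np.uint8)
        depth = np.tile(np.linspace(5.0, 80.0, 900)[:, None], (1, 1600))
        foggy = add_synthetic_fog(img, depth)
        print(foggy.shape, foggy.dtype)

In practice the depth map would come from the dataset's lidar or a depth-estimation model rather than a synthetic ramp, and rain augmentation would require a separate procedure.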