
Road scene perception based on fisheye camera, LIDAR and GPS data combination

Abstract : Road scene understanding is one of the key research topics in intelligent vehicles. This thesis focuses on the detection and tracking of obstacles by multisensor data fusion and analysis. The considered system is composed of a lidar, a fisheye camera and a global positioning system (GPS). Several steps of the perception scheme are studied: extrinsic calibration between the fisheye camera and the lidar, road detection, and obstacle detection and tracking.

Firstly, a new method for extrinsic calibration between the fisheye camera and the lidar is proposed. For intrinsic modeling of the fisheye camera, three models from the literature are studied and compared. For extrinsic calibration between the two sensors, the normal to the lidar plane is first estimated based on the determination of "known" points. The extrinsic parameters are then computed using a least-squares approach based on geometrical constraints, the lidar plane normal and the lidar measurements.

The second part of this thesis is dedicated to road detection exploiting both fisheye camera and lidar data. The road is first coarsely detected using the illumination-invariant image. The normalised-histogram-based classification is then validated using the lidar data. The road segmentation is finally refined by exploiting two successive road detection results and a distance map computed in HSI color space.

The third step focuses on obstacle detection, especially in the case of motion blur. The proposed method combines the previously detected road with map, GPS and lidar information. Regions of interest are extracted from the previous road detection. Road central lines are then extracted from the image and matched with a road shape model extracted from a 2D-SIG map. Lidar measurements are used to validate the results.

The final step is object tracking, still using the fisheye camera and lidar. The proposed method is based on the previously detected obstacles and a region-growing approach.

All the methods proposed in this thesis are tested, evaluated and compared to state-of-the-art approaches using real data acquired with the IRTES-SET laboratory experimental platform.
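The abstract mentions estimating the normal to the lidar plane as a step of the extrinsic calibration. The thesis's full procedure is not given here, but the plane-normal estimation itself is a standard least-squares fit; a minimal sketch (the function name and toy data are illustrative, not from the thesis):

```python
import numpy as np

def fit_plane_normal(points):
    """Fit a plane to 3D points by least squares.

    Returns the unit normal and a point on the plane (the centroid).
    The normal is the right-singular vector associated with the smallest
    singular value of the centered point cloud, i.e. the direction of
    least variance.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]
    return normal / np.linalg.norm(normal), centroid

# Toy example: points on the plane z = 0, whose normal is (0, 0, 1).
pts = [(x, y, 0.0) for x in range(5) for y in range(5)]
n, c = fit_plane_normal(pts)
```

Once the plane normal is known in both sensor frames, point-to-plane constraints of this kind can feed the least-squares estimation of the rotation and translation between camera and lidar.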
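The coarse road detection relies on an illumination-invariant image. One common construction (a sketch of the general log-chromaticity technique, not necessarily the exact variant used in the thesis; the invariant angle `theta` is a camera-dependent calibration value) projects per-pixel log-chromaticities onto the direction orthogonal to the lighting variation, which suppresses shadows:

```python
import numpy as np

def illumination_invariant(rgb, theta):
    """Map an RGB image to a 1-channel, shadow-reduced image.

    Log-chromaticities (log R/G, log B/G) are projected onto the
    direction given by `theta` (radians); uniform intensity changes
    cancel in the ratios, so the result is invariant to them.
    """
    eps = 1e-6  # avoid log(0) on dark pixels
    r = np.log(rgb[..., 0] + eps) - np.log(rgb[..., 1] + eps)
    b = np.log(rgb[..., 2] + eps) - np.log(rgb[..., 1] + eps)
    return r * np.cos(theta) + b * np.sin(theta)
```

Because the channel ratios are unchanged when all channels are scaled by the same factor, a pixel and a brighter copy of it map to (nearly) the same invariant value, which is what makes the image usable for coarse road/non-road classification.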
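The tracking step is said to use a region-growth approach seeded from previously detected obstacles. As a minimal illustration of the generic technique (on a toy integer grid standing in for image or range data; the thesis's actual features and similarity test are not specified here):

```python
from collections import deque

def region_grow(grid, seed, tol):
    """Grow a 4-connected region from `seed`, accepting neighbours whose
    value differs from the seed value by at most `tol` (BFS)."""
    rows, cols = len(grid), len(grid[0])
    seed_val = grid[seed[0]][seed[1]]
    region = {seed}
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in region
                    and abs(grid[nr][nc] - seed_val) <= tol):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

grid = [
    [9, 9, 1, 1],
    [9, 9, 1, 5],
    [9, 1, 1, 5],
]
# Growing from (0, 0) with tol=0 keeps only the connected 9-valued blob.
blob = region_grow(grid, (0, 0), 0)
```

In a tracking context, the seed would be a point inside the obstacle detected in the previous frame, and the grown region gives the obstacle's updated extent.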
Contributor : Abes Star
Submitted on : Friday, October 27, 2017 - 4:45:09 PM
Last modification on : Tuesday, October 1, 2019 - 4:12:06 PM
Long-term archiving on : Sunday, January 28, 2018 - 4:04:03 PM


Version validated by the jury (STAR)


  • HAL Id : tel-01625515, version 1



Yong Fang. Road scene perception based on fisheye camera, LIDAR and GPS data combination. Artificial Intelligence [cs.AI]. Université de Technologie de Belfort-Montbéliard, 2015. English. ⟨NNT : 2015BELF0265⟩. ⟨tel-01625515⟩


