Abstract: This thesis presents a complete navigation framework for mobile robots based on a representation of the environment as a sensory memory. The objective is to drive the robot from one place to another along paths that were travelled during a learning stage. The memory structure, detailed first, is built from key images acquired during this teleoperated stage. Given a target key image, the robot first localizes itself within the sensory memory, and a sensory route is then extracted to reach the target image. Sensor-based control laws are proposed for non-holonomic wheeled mobile robots and for quadrotor aerial vehicles, taking into account the motion constraints of these robots. For omnidirectional and fisheye cameras, an efficient approach to initial localization is proposed, together with tools to estimate the vehicle state required to compute the control-law outputs. The complete framework has been implemented, with particular attention to memory management. Experiments on different robots and with different cameras validate our approach and show promising results.