Perceptual navigation around a sensory-motor trajectory
PhD thesis of Cedric Pradalier (2004)
Autonomous navigation of a mobile robot has been a widely studied problem in the robotics community. Most robots designed for this task are equipped with on-board sensors to perceive the external world (sonars, laser range finders, cameras).
Two main kinds of approach to autonomous navigation have been proposed: reactive navigation, where the robot uses only its current perception to move and explore without colliding, and servoed navigation, where the robot is given a pre-planned reference trajectory and uses a closed-loop control law to follow it. Among servoed navigation approaches, two classes can again be distinguished: state-space tracking and perception-space tracking. State-space tracking has two requirements: first, a reference trajectory given in the state space, and second, the ability to localize the robot, also in the state space. Conversely, in perception-space tracking the trajectory is defined with respect to perception only, hence avoiding the need for global localization. A specific application of perception tracking is visual servoing, classically implemented as the convergence of the observed image to a fixed reference image.
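To make the perception-space idea concrete, the sketch below shows the classical image-based visual-servoing law v = -λ L⁺ (s − s*), which drives the observed image features s toward the reference features s*. It is a generic textbook formulation, not the model developed in this thesis, and the gain, feature vectors and interaction matrix are placeholders.

```python
import numpy as np

def visual_servo_step(s, s_ref, L, gain=0.5):
    """One step of a classical image-based visual servoing law.

    s     : current image features, shape (n,)
    s_ref : reference image features, shape (n,)
    L     : interaction matrix (image Jacobian) mapping camera velocity
            to feature velocity, shape (n, dof)
    Returns a camera velocity command of shape (dof,).
    """
    error = s - s_ref
    # v = -gain * pinv(L) @ error drives the feature error to zero.
    return -gain * np.linalg.pinv(L) @ error
```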
In this work, we are specifically interested in the perceptual tracking of a perceptual trajectory with a mobile robot. We assume that (i) the reference trajectory is defined as a sequence of observations perceived by an on-board sensor along the robot's movement, and (ii) no localization system (neither GPS nor landmark-based) is available to perform the tracking. This situation is interesting for at least three reasons: firstly, since the trajectory is not defined with respect to a Cartesian frame, we do not need to deal with the complex task of global localization; secondly, this kind of trajectory can be naturally and easily learned from examples; and thirdly, it can be seen as a hypothesis on how biological entities memorize and represent paths.
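A minimal way to picture such a reference trajectory (the names and types below are illustrative, not the thesis notation) is as the ordered sequence of observation/command pairs recorded while the robot is driven along the path to be replayed:

```python
from dataclasses import dataclass
from typing import List, Sequence

@dataclass
class SensoryMotorSample:
    observation: Sequence[float]  # raw sensor reading (e.g. a laser scan)
    command: Sequence[float]      # motor command applied at that instant

# A sensory-motor trajectory is simply the ordered list of samples
# recorded during a teaching run; no Cartesian coordinates are stored.
SensoryMotorTrajectory = List[SensoryMotorSample]
```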
This behavioral replay of a sensory-motor trajectory has been fully modelled using Bayesian programming. The model addresses temporal and spatial localization, control generation, obstacle avoidance and failure diagnosis in an integrated framework.
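To give an intuition of how such a probabilistic replay can work, the sketch below maintains a belief over the index of the reference sample the robot is currently facing, updates it from the similarity between the current observation and each stored observation, and reuses the command recorded at the most probable index. This is a generic Bayesian-filter illustration under assumed Gaussian observation similarity, not the actual Bayesian program of the thesis.

```python
import numpy as np

def replay_step(belief, observation, trajectory, sigma=1.0, drift=0.8):
    """One Bayesian-filter step over the reference-trajectory index.

    belief      : probability of being at each reference sample, shape (N,)
    observation : current sensor reading, shape (d,)
    trajectory  : list of (reference_observation, recorded_command) pairs
    """
    n = len(trajectory)
    # Prediction: the robot most likely advanced to the next sample.
    predicted = (1.0 - drift) * belief
    predicted[1:] += drift * belief[:-1]
    predicted[-1] += drift * belief[-1]
    # Correction: weight each index by how well the current observation
    # matches the stored reference observation (temporal localization).
    likelihood = np.array(
        [np.exp(-np.sum((observation - np.asarray(ref)) ** 2) / (2 * sigma ** 2))
         for ref, _ in trajectory])
    posterior = predicted * likelihood
    posterior /= posterior.sum()
    # Control generation: reuse the command recorded at the most
    # probable point of the reference trajectory.
    command = trajectory[int(np.argmax(posterior))][1]
    return posterior, command
```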
It was successfully implemented both on a simulated robot and on a car-like autonomous vehicle (the CyCab), in the car-park area of our institute.