Real-time automatic adaptive tracking for target recognition provided autonomous control of a scale-model electric truck. The two-wheel-drive truck was modified into an autonomous rover test-bed for vision-based guidance and navigation. Methods were implemented to monitor tracking error and to ensure a safe, accurate arrival at the intended
science target. Some methods are situation independent, relying only on the confidence error of the target recognition algorithm. Other methods exploit the combination of motion and tracking to filter out anomalies. In
either case, only a single calibrated camera was needed for position estimation. Results from real-time autonomous
driving tests in the JPL simulated Mars yard are presented. Recognition error was often situation dependent. In the rover case, the background is in motion and can be characterized to provide visual cues about rover travel, such as rate, pitch, roll, and distance to objects of interest or hazards. Objects in the scene may be used as landmarks, or waypoints,
for such estimations. As objects are approached, their scale increases and their orientation may change. In addition,
particularly on rough terrain, these orientation and scale changes may be unpredictable. Feature extraction combined
with the neural network algorithm was successful in providing visual odometry in the simulated Mars environment.
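Since only a single calibrated camera is used, range to a recognized target can be inferred from its apparent scale. A minimal sketch of this idea, assuming a pinhole camera model with an illustrative focal length and target size (neither value is from the paper):

```python
# Hypothetical sketch: monocular range estimation from apparent target scale.
# focal_px and target_width_m are illustrative assumptions, not measured values.

def distance_from_scale(focal_px: float, target_width_m: float,
                        apparent_width_px: float) -> float:
    """Pinhole-camera range estimate: Z = f * W / w."""
    if apparent_width_px <= 0:
        raise ValueError("target not visible")
    return focal_px * target_width_m / apparent_width_px

# As the rover approaches, the target's apparent width in pixels grows
# and the estimated range shrinks.
print(distance_from_scale(800.0, 0.5, 40.0))   # 10.0 (m)
print(distance_from_scale(800.0, 0.5, 100.0))  # 4.0 (m)
```

This is the standard similar-triangles relation for a calibrated camera; tracking the growth of apparent scale over successive frames gives the distance cue described above.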
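The two error-monitoring ideas above can be sketched as a simple detection gate, assuming hypothetical thresholds (the paper does not specify values): a situation-independent check on recognition confidence, and a motion-consistency check that rejects detections whose image position jumps more than the rover's motion between frames could plausibly explain.

```python
# Hypothetical sketch of the two filtering strategies described in the
# abstract. Threshold values and names are illustrative assumptions.

def accept_detection(confidence, position_px, last_position_px,
                     min_confidence=0.6, max_jump_px=40.0):
    """Return True if a detection passes both anomaly filters."""
    # (1) Situation-independent gate on the recognizer's confidence.
    if confidence < min_confidence:
        return False
    # (2) Motion-consistency gate: during combined motion and tracking,
    # an implausibly large frame-to-frame jump marks an anomaly.
    if last_position_px is not None:
        dx = position_px[0] - last_position_px[0]
        dy = position_px[1] - last_position_px[1]
        if (dx * dx + dy * dy) ** 0.5 > max_jump_px:
            return False
    return True

print(accept_detection(0.9, (320, 240), (318, 238)))  # True
print(accept_detection(0.4, (320, 240), (318, 238)))  # False: low confidence
print(accept_detection(0.9, (320, 240), (100, 240)))  # False: implausible jump
```

Gating detections this way lets the controller fall back on the last trusted estimate rather than steering toward a spurious recognition.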