Hydra® is a modular autonomous rendezvous and docking (AR&D) sensor system for use on orbit. It uses multiple
sensor heads that feed data to a single processing module, allowing the system to be configured according to mission
needs. The modularity also reduces the external real estate required, since the processing electronics can be mounted
inside the spacecraft. Advanced Optical Systems has built an initial Hydra® prototype that includes an Advanced Video
Guidance Sensor (AVGS) and ULTOR® sensor head. The AVGS sensor head provides laser-based active measurement
of distance and orientation, while the ULTOR® sensor head provides passive measurement of the same. We have tested
the Hydra® prototype in the Marshall Space Flight Center's Flight Robotics Laboratory. In this paper we describe the
Hydra® prototype and present the results of ground testing the sensor system.
AOS is designing a modular AR&D system named Hydra® and building an initial prototype with selected near-field and docking capabilities, along with expansion capability to accommodate time-of-flight and far-field sensors. Lessons learned from DART and Orbital Express have been applied to the proposed Hydra® design. The prototype Hydra® system design includes an AVGS sensor head and an ULTOR® sensor head. Although the initial Hydra® system is a ground demonstration unit, its design methods and component selection provide a straightforward path to a space-qualified Hydra® system. Hydra® is built around a common processing platform that can be configured to process inputs from a variety of sensors. The design consists of three elements: the sensor head or camera, which can be mounted external to the spacecraft; the processing electronics, which can be mounted internal to the spacecraft; and the Hydra® target, which is mounted on the target spacecraft at or near the docking interface.
KEYWORDS: Sensors, Head, Video, Video processing, Space operations, LIDAR, Field programmable gate arrays, Cameras, Signal to noise ratio, Imaging systems
Autonomous rendezvous and docking has become more prominent in the wake of the DART mission. In support of AR&D, NASA and companies such as ours have been developing sensors to measure distance, bearing, and pose relative to a target spacecraft. We are developing a suite of such sensors: the Advanced Video Guidance Sensor (AVGS), the ULTOR video processor, and the Wide Angle Lidar for Direction and Distance (WALDD). AVGS is a laser-based video sensor that images retro-reflecting targets and extracts six-degree-of-freedom information. WALDD is a staring lidar system that provides range and bearing information using retro-reflecting targets. ULTOR is a video processor that can extract six-degree-of-freedom information from spacecraft that lack special targets. We will give an overview of the three sensors, their development, and their capabilities.
KEYWORDS: Hubble Space Telescope, Video, Sensors, Space telescopes, Robotics, Space operations, Satellites, Computer simulations, Video processing, Optical correlators
The tragic loss of Space Shuttle Columbia threw the future of Hubble Space Telescope (HST) in doubt. The Columbia Accident Investigation Board report led NASA to the realization that astronauts must have someplace to go on orbit if the Shuttle is damaged, a requirement that cannot be met for a manned HST mission. Yet missions to HST are required, since HST was designed to be serviced periodically.
To address this problem, NASA is developing a robotic servicing mission to Hubble. On-orbit rendezvous and docking under tele-robotic or fully autonomous control involves a number of challenges that have not been fully resolved. One key challenge is how to bring two craft together in precise alignment to each other without an experienced astronaut on board. For this to be possible, sensors are needed to report relative distance, bearing, and orientation.
At Advanced Optical Systems (AOS), we have applied our ULTOR digital correlation system to the Hubble repair mission. The ULTOR system operates at approximately 10 Hertz and can accurately determine the relative distance, bearing, and orientation needed for semi- or fully-autonomous docking to HST. The system can operate using the HST berthing target or other features, including the HST itself. It is small and light enough to be placed on the servicing craft, thus avoiding orbit-to-ground communication latency issues. We will discuss the results of our testing with computer-generated imagery of the HST and with hardware-in-the-loop simulations.
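The digital correlation approach can be illustrated with a minimal sketch: locating a template (for example, a rendered view of the berthing target) in a sensor image by finding the peak of an FFT-based cross-correlation. The function name and normalization below are illustrative assumptions, not the ULTOR implementation.

```python
import numpy as np

def correlate_template(image, template):
    """Locate a template in an image via FFT cross-correlation.

    Illustrative sketch of correlation-based tracking; the normalization
    here is an assumption, not the ULTOR processing chain.
    """
    # Zero-mean both signals so a bright background does not dominate the peak.
    img = image - image.mean()
    tpl = template - template.mean()
    # Pad the template to the image size and correlate in the Fourier domain.
    tpl_padded = np.zeros_like(img)
    tpl_padded[:tpl.shape[0], :tpl.shape[1]] = tpl
    corr = np.fft.ifft2(np.fft.fft2(img) * np.conj(np.fft.fft2(tpl_padded))).real
    # The correlation peak gives the template's offset within the image.
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return peak, corr.max()
```

A full tracker would repeat this at the sensor frame rate and convert the peak location and template scale into bearing and range estimates.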
Proximity operations between orbital vehicles require precise knowledge of relative navigation states. Navigation sensors for proximity operations may use retro-reflectors to mark fixed points on one of the vehicles, from which relative state data can be calculated. Using corner cube retro-reflectors in an orbital navigation sensor required detailed ray-tracing analysis to define the expected return signal levels, signal-to-noise ratios, and predicted error effects due to reflector geometry and optical characteristics. Conventional corner cube reflector images would have displayed artifacts due to corner cube bevels, interfering with software interpretation of sensor image data; our design avoids these bevel effects. Special optical design features were required to permit use of multiple target sets, enabling successful tracking over a 1 to 200 meter effective range. We have used this mounting scheme to create corner cube targets for use with the Advanced Video Guidance Sensor (AVGS) on the Orbital Express mission. We discuss our design, the finite-element analysis done on the design, and the results of sensor performance testing with the targets.
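The expected return signal from a corner cube can be bounded with a simple link-budget sketch, in which return power falls off as 1/R⁴ once the transmit beam overfills the cube and the retro-reflected beam overfills the receiver. All parameter names and values below are illustrative assumptions, not the figures from the ray-tracing analysis described above.

```python
import math

def retro_return_power(p_tx, range_m, tx_div, cc_diam, rx_diam, ret_div, losses=0.5):
    """Estimate optical power returned by a corner cube retro-reflector.

    A minimal 1/R^4 link-budget sketch. All parameters (transmit divergence
    tx_div, cube diameter cc_diam, receiver aperture rx_diam, return-beam
    divergence ret_div, lumped optical losses) are illustrative assumptions.
    Angles are in radians, lengths in meters, power in watts.
    """
    # Fraction of the transmit beam intercepted by the corner cube aperture.
    beam_radius = range_m * tx_div / 2.0
    intercept = min(1.0, (cc_diam / 2.0) ** 2 / beam_radius ** 2)
    # Fraction of the retro-reflected beam captured by the receiver aperture.
    ret_radius = range_m * ret_div / 2.0
    capture = min(1.0, (rx_diam / 2.0) ** 2 / ret_radius ** 2)
    return p_tx * intercept * capture * losses
```

Because both the intercept and capture fractions scale as 1/R² in the overfilled regime, doubling the range cuts the return power by a factor of sixteen, which is why signal-to-noise ratio drives the usable range of a retro-reflector target set.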
NASA has recently re-confirmed their interest in autonomous systems as an enabling technology for future missions. In order for autonomous missions to be possible, highly-capable relative sensor systems are needed to determine an object’s distance, direction, and orientation. This is true whether the mission is autonomous in-space assembly, rendezvous and docking, or rover surface navigation. Advanced Optical Systems, Inc. has developed a wide-angle laser range and bearing finder (RBF) for autonomous space missions.
The laser RBF has a number of features that make it well-suited for autonomous missions. It has an operating range of 10 m to 5 km, with a 5° field of view. Its wide field of view removes the need for scanning systems such as gimbals, eliminating moving parts and making the sensor simpler and space qualification easier. Its range accuracy is 1% or better; its bearing accuracy, 0.1°. It is designed to operate either as a stand-alone sensor or in tandem with a sensor that returns range, bearing, and orientation at close ranges, such as NASA’s Advanced Video Guidance Sensor. We have assembled the initial prototype and are currently testing it. We will discuss the laser RBF’s design and specifications.
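The time-of-flight principle behind the range measurement reduces to a one-line relation; the sketch below also shows the round-trip timing resolution implied by a given fractional accuracy (a back-of-the-envelope illustration, not the sensor's actual processing).

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_range(round_trip_s):
    """Range from a time-of-flight measurement: the pulse travels out and back."""
    return C * round_trip_s / 2.0

def timing_for_accuracy(range_m, fraction):
    """Round-trip timing resolution needed for a given fractional range accuracy."""
    return 2.0 * range_m * fraction / C
```

At the short end of the stated envelope, 1% of 10 m is 0.1 m, which corresponds to resolving the round trip to roughly 0.7 ns; at 5 km the same fractional accuracy is far less demanding in absolute timing terms.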
In recent decades, NASA's interest in spacecraft rendezvous and proximity operations has grown. Additional instrumentation is needed to improve manned docking operations' safety, as well as to enable telerobotic operation of spacecraft or completely autonomous rendezvous and docking. To address this need, Advanced Optical Systems, Inc., Orbital Sciences Corporation, and Marshall Space Flight Center have developed the Advanced Video Guidance Sensor (AVGS) under the auspices of the Demonstration of Autonomous Rendezvous Technology (DART) program. Given a cooperative target comprising several retro-reflectors, AVGS provides six-degree-of-freedom information at ranges of up to 300 meters for the DART target. It does so by imaging the target, then performing pattern recognition on the resulting image. Longer range operation is possible through different target geometries.
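The link between target geometry and usable range follows from the pinhole-camera relation: for a fixed minimum resolvable pixel separation, a wider retro-reflector spacing yields a proportionally longer range. A minimal sketch, with an assumed focal length expressed in pixels:

```python
def range_from_target(focal_len_px, target_span_m, image_span_px):
    """Pinhole-camera range estimate from the imaged spacing of two retro-reflectors.

    Illustrates why a wider target geometry extends usable range: halving the
    imaged span (or doubling the physical span) doubles the inferred range.
    The focal length in pixels is an illustrative assumption.
    """
    return focal_len_px * target_span_m / image_span_px
```

A 6-DOF solution generalizes this idea, fitting the full perspective projection of all target spots rather than a single spacing.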
Now that AVGS is being readied for its test flight in 2004, the question is: what next? Modifications can be made to AVGS, including different pattern recognition algorithms and changes to the retro-reflector targets, to make it more robust and accurate. AVGS could be coupled with other space-qualified sensors, such as a laser range-and-bearing finder, that would operate at longer ranges. Different target configurations, including the use of active targets, could result in significant miniaturization over the current AVGS package. We will discuss these and other possibilities for a next-generation docking sensor or sensor suite that involve AVGS.
Advanced Optical Systems, Inc. is developing the Autonomous Rendezvous and Docking Sensor Suite for Marshall Space Flight Center to provide real-time range and six-degree-of-freedom (6 DOF) information. This information facilitates the autonomous docking of two spacecraft. The sensor suite comprises the Advanced Video Guidance Sensor (AVGS) and the Wide Angle Laser Range Finder (WALRF). AVGS was developed under NASA's Demonstration of Autonomous Rendezvous Technology (DART) program for a cooperative target and is scheduled to fly in 2004. The WALRF prototype is being developed at AOS under a different program. The sensor suite can provide range and bearing data to 5 km and 6 DOF information to 300 m for the DART target configuration. Different target geometries can increase the range-detection and 6 DOF detection distances. The sensor suite is a laser-based optical system with a combined weight of less than 40 lb and a combined volume of less than 12”×10”×18”. The WALRF system employs a bistatic transceiver with an 8° field of view (FOV). This sensor is a time-of-flight range finder with a quad detector. The AVGS section of the suite is a monostatic transceiver with a 16° FOV and a high-speed imager. This section of the suite uses a pattern recognition system that reduces imager data into 6 DOF information. In this paper we outline the AVGS and WALRF functionality in detail, along with experimental range data and measurement accuracy.
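The quad-detector bearing measurement can be sketched as a sum-and-difference computation over the four quadrant signals. The quadrant layout and the linear angle scaling below are simplifying assumptions for illustration, not the WALRF calibration.

```python
def quad_bearing(a, b, c, d, fov_deg=8.0):
    """Spot offset angles from quad-detector quadrant signals.

    Assumed quadrant layout: a = upper-left, b = upper-right,
    c = lower-left, d = lower-right; a real sensor's mapping and
    calibration may differ. Returns (x, y) angle estimates in degrees.
    """
    total = a + b + c + d
    x = ((b + d) - (a + c)) / total  # normalized right-minus-left imbalance
    y = ((a + b) - (c + d)) / total  # normalized top-minus-bottom imbalance
    # Crude linear scaling of the normalized offsets to angles within the FOV;
    # valid only as a small-offset approximation.
    half = fov_deg / 2.0
    return x * half, y * half
```

A centered return gives equal quadrant signals and a zero bearing offset; an imbalance toward one pair of quadrants shifts the estimate toward the corresponding edge of the field of view.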