The Centrifuge Rotor (CR) is a large life-science experiment facility that will be installed on the International Space Station (ISS). It will provide artificial gravity of up to 2 g by rotating up to four science habitats, and it will be the first such machinery used in space. To prevent vibration disturbances from being exchanged between the CR and the ISS, a soft five-degree-of-freedom (5-DOF) vibration isolation mechanism is used; this mechanism cannot support the CR's weight on the ground. The CR's on-orbit performance must therefore be predicted by an integrated analysis that models all of the equipment, including sensors, actuators, flexible structure, gyroscopic effects, and controllers. Here we introduce the CR mechatronics, a verification procedure, and examples of the integrated analysis, which is based on the general-purpose mechanism analysis software ADAMS.
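To make the role of such an integrated analysis concrete, the following is a minimal sketch of the same idea on a deliberately simplified model: a spinning rotor on a soft isolation mount whose tilt dynamics combine gyroscopic coupling, passive isolator stiffness and damping, an idealized sensor, and a PD controller in one time-domain simulation. The 2-DOF model and every numerical value are illustrative assumptions; they are not the CR parameters or the ADAMS model used in the paper.

    # Hedged sketch: a toy 2-DOF analogue of "integrated analysis" for a spinning
    # rotor on a soft isolation mount.  All parameters (inertia, spin rate,
    # stiffness, damping, controller gains) are assumed values for illustration.
    import numpy as np
    from scipy.integrate import solve_ivp

    Ip    = 120.0          # polar moment of inertia [kg m^2]      (assumed)
    It    = 80.0           # transverse moment of inertia [kg m^2] (assumed)
    omega = 2 * np.pi * 0.5  # rotor spin rate [rad/s]             (assumed)
    k, c  = 2.0e3, 50.0    # soft isolator stiffness / damping per tilt axis (assumed)
    kp, kd = 1.0e3, 2.0e2  # PD controller gains applied through an actuator (assumed)

    def rhs(t, y):
        # State: tilt angles about x and y, and their rates (ideal tilt sensor).
        thx, thy, wx, wy = y
        # Gyroscopic coupling between the two tilt axes of the spinning rotor.
        gyro_x =  Ip * omega * wy
        gyro_y = -Ip * omega * wx
        # Passive isolator torque + active PD torque + gyroscopic torque.
        tx = -k * thx - c * wx - kp * thx - kd * wx + gyro_x
        ty = -k * thy - c * wy - kp * thy - kd * wy + gyro_y
        return [wx, wy, tx / It, ty / It]

    # Apply an initial tilt disturbance, integrate, and report the residual motion.
    sol = solve_ivp(rhs, (0.0, 30.0), [0.01, 0.0, 0.0, 0.0], max_step=0.01)
    print("final tilt amplitudes [rad]:", abs(sol.y[0, -1]), abs(sol.y[1, -1]))

Even this toy model illustrates the point of the abstract: structure, gyroscopic effects, sensing, actuation, and control have to be simulated together to predict the on-orbit response.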
This paper describes an interactive 3-D display system for supporting image-guided surgery. Unlike conventional CRT-based medical display systems, it provides true 3-D images of the patient's anatomical structures in a physical 3-D space. Furthermore, various tools for view control, target definition, and simple treatment simulation have been developed and can be used to manipulate these images directly. These features help a surgeon intuitively recognize the precise position of a lesion and other structures and plan a more accurate treatment. The hardware consists of a volume scanning 3-D display for presenting real 3-D images, a 3-D wireless mouse for direct manipulation in 3-D space, and a workstation that controls the data for these devices. The software analyzes X-CT, MRI, or SPECT images and organizes the tools for treatment planning. The system is currently aimed at stereotactic neurosurgical operations.
KEYWORDS: 3D displays, 3D image processing, Relays, Mirrors, Light emitting diodes, Stereoscopic displays, Data modeling, Medical imaging, Free space optics, Distortion
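As an illustration of the "target definition" step described above, the sketch below maps a 3-D mouse reading given in display coordinates into voxel indices of a registered CT/MRI volume. The 4x4 registration matrix, the voxel spacing, and the function name are hypothetical placeholders; the abstract does not describe the actual registration or calibration used by the system.

    # Hedged sketch: converting a 3-D mouse position (display space, mm) into voxel
    # indices of a registered CT/MRI volume -- the basic step behind target definition.
    # The registration matrix and voxel spacing below are assumed, not real values.
    import numpy as np

    display_to_image = np.eye(4)                   # assumed rigid registration (display -> image, mm)
    voxel_spacing_mm = np.array([1.0, 1.0, 1.5])   # assumed CT voxel size [mm]

    def mouse_to_voxel(mouse_xyz_mm):
        p = display_to_image @ np.append(mouse_xyz_mm, 1.0)    # homogeneous transform
        return np.round(p[:3] / voxel_spacing_mm).astype(int)  # nearest voxel index

    print(mouse_to_voxel([25.0, -10.0, 30.0]))     # with the identity transform: [ 25 -10  20]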
This paper describes a newly developed volume scanning display in which a user can put his or her hands directly into the 3D image and manipulate it without encumbering devices such as goggles, glasses, or gloves, a capability that conventional display systems have not achieved. The display is composed of a volume scanning LED panel that creates an autostereoscopic image, an optical relay system that translates the image into another free space, and a wireless 3D mouse with which the user interacts with the image. The display has been applied to shape modeling, visualization of physical simulation data, and medical data imaging.
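A rough way to picture how such a volume scanning display addresses its image: 3D points are binned by depth, and each depth bin is flashed on the 2D LED panel as the panel (or its optically relayed image) sweeps through the corresponding depth. The sketch below shows only this slicing step; the panel and slice resolutions and the coordinate convention are assumptions for illustration, not the actual device specifications.

    # Hedged sketch: binning a 3D point set into per-depth frames for a volume
    # scanning display.  Resolutions and coordinates are assumed, not device specs.
    import numpy as np

    PANEL_W, PANEL_H, N_SLICES = 64, 64, 32   # assumed LED panel and sweep resolution

    def slice_points(points):
        """points: (N, 3) array with x, y, z in [0, 1).  Returns one binary frame
        per depth slice, in the order the panel sweeps through the volume."""
        frames = np.zeros((N_SLICES, PANEL_H, PANEL_W), dtype=bool)
        ix = (points[:, 0] * PANEL_W).astype(int).clip(0, PANEL_W - 1)
        iy = (points[:, 1] * PANEL_H).astype(int).clip(0, PANEL_H - 1)
        iz = (points[:, 2] * N_SLICES).astype(int).clip(0, N_SLICES - 1)
        frames[iz, iy, ix] = True
        return frames

    # Example: a diagonal line through the volume lights one pixel per slice.
    t = np.linspace(0.0, 0.999, N_SLICES)
    frames = slice_points(np.stack([t, t, t], axis=1))
    print("lit pixels per slice:", frames.sum(axis=(1, 2)))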