Today's maintenance tasks are time-consuming and therefore cost-intensive, in particular the manual inspection of commercial airliners. The joint project "AI-Inspection Drone" aims to provide a complete process chain for surface damage detection on airliners based on an unmanned aerial system (UAS). To achieve this goal, visual data are gathered that can later be evaluated by artificial intelligence models. The process chain is explained in detail, beginning with the simulation of the hangar environment, followed by insights into the indoor navigation of the UAS. Finally, it is validated through test flights around the airplane, respecting strict security and safety requirements. The Gazebo-based 3D simulation was expanded with hardware-in-the-loop testing functionality by utilizing a camera-based motion-capture system to track the UAS's position in real time and feed the position data back into the simulation in order to test different inspection tasks. For the deployment of the UAS in the hangar, a 3D-LiDAR-based SLAM algorithm provides position and orientation data relative to the airplane. Using a 3D model, which can be gathered beforehand with LiDAR scans, the airplane's surface area is estimated to determine the mission waypoints and the corresponding inspection views for a high-resolution camera. A path-planning algorithm controls the inspection procedure by evaluating an efficient path through these waypoints and enables obstacle avoidance based on LiDAR data. With the proposed autonomous aerial inspection platform, the ground time of airplanes can be reduced, thus increasing the efficiency of the airplane inspection process.
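The abstract does not specify which path-planning algorithm is used; as a minimal illustration of ordering mission waypoints into an efficient tour, a greedy nearest-neighbor heuristic (a common baseline for such problems, and purely an assumption here) might look like:

```python
import math

def plan_inspection_path(waypoints, start=(0.0, 0.0, 0.0)):
    """Order inspection waypoints with a greedy nearest-neighbor heuristic.

    waypoints: list of (x, y, z) positions around the airframe.
    start:     UAS take-off position.
    Returns the waypoints in visiting order (does not guarantee optimality).
    """
    remaining = list(waypoints)
    path = []
    current = start
    while remaining:
        # Always fly to the closest unvisited waypoint next.
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        path.append(nxt)
        current = nxt
    return path

# Hypothetical example: four waypoints along a fuselage section.
wps = [(10, 0, 3), (2, 0, 3), (6, 0, 3), (14, 0, 3)]
print(plan_inspection_path(wps))
# → [(2, 0, 3), (6, 0, 3), (10, 0, 3), (14, 0, 3)]
```

A real system would additionally check each leg against the LiDAR obstacle map before committing to it; the sketch above only addresses the waypoint-ordering step.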
We present our latest work on designing a magnetically anchored wireless stereoscopic robot with a 2-degree-of-freedom (DOF) pan-tilt unit for single-port minimally invasive surgery (MIS). This camera could reduce the tool-clashing issue in MIS and provide better angulation and visualization of the surgical field. After the robot is introduced through the umbilicus (belly button), it is anchored to the internal abdominal wall using a magnet placed outside the body. The surgeon can change the view angle of the camera remotely via a wireless joystick, and a real-time stereo view is displayed on a user-interface screen. Since the robot is anchored by an external magnet on the abdominal wall during the surgical operation, surplus shocks and slight trembling of the robot result in poor visualization. Therefore, we developed a real-time video stabilization scheme to eliminate these effects. Our proposed method fuses high-frequency inertial measurement sensor data with visual optical-flow vectors, extracted from the stereo camera, to estimate the unwanted shocks during video streaming. It compensates and stabilizes the video streams in real time by shifting the video images in the opposite direction of the estimated motion vector. We conducted several experiments, including robot control, video streaming performance, and real-time video stabilization, to investigate the system's function. The results of these experiments are reported in this paper.
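The abstract leaves the fusion and compensation details unspecified; a minimal sketch of the general idea (a simple complementary-filter fusion of IMU- and optical-flow-derived pixel shifts, then shifting the frame opposite to the estimated motion) under those assumptions could be:

```python
import numpy as np

def fuse_motion(flow_px, imu_px, alpha=0.9):
    """Fuse two frame-to-frame shift estimates (dy, dx) in pixels.

    A complementary filter: weight the high-frequency IMU estimate by
    alpha and the optical-flow estimate by (1 - alpha). The weighting
    scheme is an assumption, not the paper's actual filter.
    """
    flow = np.asarray(flow_px, dtype=float)
    imu = np.asarray(imu_px, dtype=float)
    return alpha * imu + (1.0 - alpha) * flow

def compensate(frame, motion_px):
    """Shift the image opposite to the estimated motion, zero-padding edges."""
    dy, dx = int(round(motion_px[0])), int(round(motion_px[1]))
    h, w = frame.shape[:2]
    out = np.zeros_like(frame)
    # out[y, x] = frame[y + dy, x + dx] wherever the source index is valid.
    out[max(0, -dy):min(h, h - dy), max(0, -dx):min(w, w - dx)] = \
        frame[max(0, dy):min(h, h + dy), max(0, dx):min(w, w + dx)]
    return out
```

In a live pipeline, `fuse_motion` would run per frame on the latest optical-flow and integrated-IMU shifts, and `compensate` would be applied to each camera image before display; sub-pixel shifts and edge in-painting are omitted for brevity.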