Purpose. We report the initial development of an image-based solution for robotic assistance of pelvic fracture fixation. The approach uses intraoperative radiographs, preoperative CT, and an end effector of known design to align the robot with target trajectories in CT. The method extends previous work to solve the robot-to-patient registration from a single radiographic view (without C-arm rotation) and addresses the workflow challenges associated with integrating robotic assistance in orthopaedic trauma surgery in a form that could be broadly applicable to isocentric or non-isocentric C-arms. Methods. The proposed method uses 3D-2D known-component registration to localize a robot end effector with respect to the patient by: (1) exploiting the extended size and complex features of pelvic anatomy to register the patient; and (2) capturing multiple end effector poses using precise robotic manipulation. These transformations, along with an offline hand-eye calibration of the end effector, are used to calculate target robot poses that align the end effector with planned trajectories in the patient CT. Geometric accuracy of the registrations was independently evaluated for the patient and the robot in phantom studies. Results. The translational difference between the ground-truth and estimated patient registrations of a pelvis phantom was 1.3 mm using a single (AP) view, compared to 0.4 mm using dual (AP+Lat) views. Registration of the robot in air (i.e., no background anatomy) with five unique end effector poses achieved a mean translational difference of ~1.4 mm, sufficient for K-wire placement in the pelvis and comparable to tracker-based margins of error (commonly ~2 mm). Conclusions. The proposed approach is feasible based on the accuracy of the patient and robot registrations and is a preliminary step in developing an image-guided robotic guidance system that more naturally fits the workflow of fluoroscopically guided orthopaedic trauma surgery.
Future work will involve end-to-end development of the proposed guidance system and assessment of the system with delivery of K-wires in cadaver studies.
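As an illustration of how the registrations above combine, the following is a minimal numpy sketch (not the authors' implementation) of the transform chain that converts a planned trajectory in the patient CT into a robot pose command; all frame names (`T_ct_world`, `T_ee_flange`, etc.) and conventions are hypothetical, using the convention that `T_a_b` maps coordinates from frame `b` to frame `a`:

```python
import numpy as np

def make_T(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def target_flange_pose(T_ct_world, T_ee_flange, T_ct_target):
    """Robot flange pose that aligns the end effector with a planned
    trajectory defined in the CT frame (all names illustrative).
    T_ct_world  -- patient registration (world -> CT), e.g. from 3D-2D registration
    T_ee_flange -- offline hand-eye calibration (flange -> end effector)
    T_ct_target -- planned trajectory pose in CT coordinates
    """
    # Express the planned target pose in the world frame:
    T_world_ee = np.linalg.inv(T_ct_world) @ T_ct_target
    # Convert the end-effector goal into a flange command:
    return T_world_ee @ T_ee_flange
```

With identity patient registration and hand-eye calibration, the flange command reduces to the planned trajectory pose itself, which provides a quick sanity check of the chain.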
Purpose. Fracture reduction is a challenging part of orthopaedic pelvic trauma procedures, and long-term prognosis is poor if reduction does not accurately restore natural morphology. Manual preoperative planning is performed to obtain target transformations of the fractured bone fragments – a process that is challenging and time-consuming even for experts within the rapid workflow of emergent care and fluoroscopically guided surgery. We report a method for fracture reduction planning using a novel image-based registration framework. Method. An objective function is designed to simultaneously register multi-body bone fragments – preoperatively segmented via a graph-cut method – to a pelvic statistical shape model (SSM) with inter-body collision constraints. An alternating optimization strategy switches between fragment alignment and SSM adaptation to solve for the fragment transformations for fracture reduction planning. The method was examined in a leave-one-out study performed over a 40-member pelvic atlas, with two-body and three-body fractures simulated in the left innominate bone with translational displacements ranging 0–20 mm and rotational displacements 0°–15°. Result. Experiments showed the feasibility of the registration method in both two-body and three-body fracture cases. The segmentations achieved a median Dice coefficient of 0.94 (interquartile range [IQR] 0.01) and root mean square error (RMSE) of 2.93 mm (IQR 0.56 mm). In two-body fracture cases, fracture reduction planning yielded 3.8 mm (IQR 1.6 mm) translational and 2.9° (IQR 1.8°) rotational error. Conclusion. The method demonstrated accurate fracture reduction planning within 5 mm and shows promise for future generalization to more complicated fracture cases. The algorithm provides a novel means of planning from preoperative CT images that are already acquired in standard workflow.
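The fragment-alignment step of the alternating optimization can be illustrated with a standard rigid point-set alignment (the Kabsch/Procrustes solution); this is a generic sketch under that assumption, not the paper's objective function, and the SSM-adaptation and collision-constraint steps are omitted:

```python
import numpy as np

def kabsch(P, Q):
    """Rigid transform (R, t) minimizing sum ||R p_i + t - q_i||^2 between
    corresponding point sets P and Q (each n x 3). In the full method, a step
    like this (fragment alignment) would alternate with SSM adaptation."""
    Pc, Qc = P.mean(axis=0), Q.mean(axis=0)
    # Cross-covariance of the centered point sets:
    H = (P - Pc).T @ (Q - Qc)
    U, _, Vt = np.linalg.svd(H)
    # Sign correction guarantees a proper rotation (det = +1), not a reflection:
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = Qc - R @ Pc
    return R, t
```

Given a fragment surface sampled as points and its corresponding region on the current SSM estimate, this closed-form solution yields the rigid pose update for that fragment in one step.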
Purpose: Metal artifacts remain a challenge for CBCT systems in diagnostic imaging and image-guided surgery, obscuring visualization of metal instruments and surrounding anatomy. We present a method to predict C-arm CBCT orbits that avoid metal artifacts by acquiring projection data that are least affected by polyenergetic bias. Methods: The metal artifact avoidance (MAA) method operates with a minimum of prior information, is compatible with simple mobile C-arms that are increasingly prevalent in routine use, and is consistent with 3D filtered backprojection (FBP), more advanced (polyenergetic) model-based image reconstruction (MBIR), and/or metal artifact reduction (MAR) post-processing methods. MAA consists of the following steps: (i) coarse localization of metal objects in the field of view (FOV) via two or more low-dose scout views, coarse backprojection, and segmentation (e.g., with a U-Net); (ii) a simple model-based prediction of metal-induced x-ray spectral shift for all source-detector vertices (gantry rotation and tilt angles) accessible by the imaging system; and (iii) definition of a source-detector orbit that minimizes the view-to-view inconsistency in spectral shift. The method was evaluated in an anthropomorphic phantom study emulating pedicle screw placement in spine surgery. Results: Phantom studies confirmed that the MAA method could accurately predict tilt angles that minimize metal artifacts. The proposed U-Net segmentation method was able to localize complex distributions of metal instrumentation (>70% Dice coefficient) with six low-dose scout projections acquired during the routine pre-scan collision check. CBCT images acquired at MAA-prescribed tilt angles demonstrated ~50% reduction in “blooming” artifacts (measured as the FWHM of the screw shaft).
Geometric calibration for tilted orbits at prescribed angular increments, with interpolation for intermediate values, demonstrated accuracy comparable to that of non-tilted circular trajectories in terms of the modulation transfer function. Conclusion: The preliminary results demonstrate the ability to predict C-arm orbits that provide projection data with minimal spectral bias from metal instrumentation. Such orbits exhibit strongly reduced metal artifacts, and the projection data are compatible with additional post-processing (metal artifact reduction, MAR) methods to further reduce artifacts and/or reduce noise. Ongoing studies aim to improve the robustness of metal object localization from scout views and to investigate additional benefits of non-circular C-arm trajectories.
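Step (iii), selection of the orbit that minimizes view-to-view inconsistency in predicted spectral shift, can be sketched with a toy scoring function; the array layout and the particular inconsistency metric (summed squared differences between adjacent views) are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def best_tilt(shift_map, tilts):
    """Pick the C-arm tilt whose orbit gives the most view-to-view
    consistent spectral shift.
    shift_map -- array of shape (n_tilts, n_views): predicted metal-induced
                 spectral shift at each (tilt, gantry angle) vertex
    tilts     -- array of shape (n_tilts,): candidate tilt angles (deg)
    Inconsistency is scored here as the total squared difference between
    adjacent views along each candidate orbit (an assumed metric)."""
    inconsistency = np.sum(np.diff(shift_map, axis=1) ** 2, axis=1)
    return tilts[int(np.argmin(inconsistency))]
```

A tilt whose orbit keeps the predicted shift nearly constant from view to view scores near zero and is prescribed for the scan.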
Pelvic trauma surgical procedures rely heavily on guidance with 2D fluoroscopy views for navigation in complex bone corridors. This “fluoro-hunting” paradigm results in extended radiation exposure and possible suboptimal guidewire placement from limited visualization of the fracture site with overlapping anatomy in 2D fluoroscopy. A novel computer vision-based navigation system for freehand guidewire insertion is proposed. The navigation framework is compatible with the rapid workflow in trauma surgery and bridges the gap between intraoperative fluoroscopy and preoperative CT images. The system uses a drill-mounted camera to detect and track poses of simple multimodality (optical/radiographic) markers for registration of the drill axis to fluoroscopy and, in turn, to CT. Surgical navigation is achieved with real-time display of the drill axis position on fluoroscopy views and, optionally, in 3D on the preoperative CT. The camera was corrected for lens distortion effects and calibrated for 3D pose estimation. Custom marker jigs were constructed to calibrate the drill axis and tooltip with respect to the camera frame. A testing platform for evaluation of the navigation system was developed, including a robotic arm for precise, repeatable placement of the drill. Experiments were conducted for hand-eye calibration between the drill-mounted camera and the robot using the Park and Martin solver. Experiments using checkerboard calibration demonstrated subpixel accuracy [−0.01 ± 0.23 px] for camera distortion correction. The drill axis was calibrated using a cylindrical model and demonstrated sub-mm accuracy [0.14 ± 0.70 mm] and sub-degree angular deviation.
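The Park and Martin solver referenced above solves the hand-eye equation AX = XB for the fixed camera-to-robot transform X, given paired robot motions A_i and camera motions B_i. A compact numpy sketch of the closed-form solution (illustrative, not the study's code) is:

```python
import numpy as np

def so3_log(R):
    """Axis-angle vector of a rotation matrix (assumes angle in (0, pi))."""
    theta = np.arccos(np.clip((np.trace(R) - 1) / 2, -1.0, 1.0))
    w = np.array([R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])
    return theta / (2 * np.sin(theta)) * w

def park_martin(As, Bs):
    """Solve AX = XB for X from motion pairs (A_i, B_i), each a 4x4
    homogeneous transform, following Park & Martin's closed form.
    Requires at least two pairs with non-parallel rotation axes."""
    # Rotation: alpha_i = R_X beta_i, with alpha = log R_A, beta = log R_B.
    # Closed form: R_X = (M^T M)^(-1/2) M^T, where M = sum beta_i alpha_i^T.
    M = np.zeros((3, 3))
    for A, B in zip(As, Bs):
        M += np.outer(so3_log(B[:3, :3]), so3_log(A[:3, :3]))
    w, V = np.linalg.eigh(M.T @ M)            # symmetric positive definite
    Rx = V @ np.diag(w ** -0.5) @ V.T @ M.T   # inverse matrix square root
    # Translation: stack (R_A - I) t_X = R_X t_B - t_A, solve least squares.
    C = np.vstack([A[:3, :3] - np.eye(3) for A in As])
    d = np.concatenate([Rx @ B[:3, 3] - A[:3, 3] for A, B in zip(As, Bs)])
    tx = np.linalg.lstsq(C, d, rcond=None)[0]
    X = np.eye(4)
    X[:3, :3], X[:3, 3] = Rx, tx
    return X
```

In this setting, A_i would come from the robot's reported flange motions and B_i from camera pose estimates of a fixed calibration target between the same stations.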