We present a co-robotic ultrasound imaging system that tracks lesions undergoing physiological motion, e.g., breathing, using 2D B-mode images. The approach embeds in-plane and out-of-plane transformation estimation in a proportional joint velocity controller that minimizes the 6-degree-of-freedom (DoF) transformation error. Specifically, we propose a new method to estimate out-of-plane translation using a convolutional neural network based on speckle decorrelation. The network is trained on anatomically featureless gray-scale B-mode images and generalizes to different tissue phantoms. The tracking algorithm is validated in simulation with simulated respiratory motion, demonstrating the feasibility of stabilizing ultrasound-guided biopsy.
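The proportional control law referenced above can be sketched minimally as follows. This is an illustration only: the gain value, the twist-vector representation of the 6-DoF error, and the function name are assumptions for exposition, not the paper's actual controller or parameters.

```python
import numpy as np

def proportional_velocity_command(error_twist, gain=0.5):
    """Map a 6-DoF pose error to a commanded velocity.

    error_twist: [vx, vy, vz, wx, wy, wz] -- translational and
    rotational error components (illustrative representation).
    The negative sign drives the error toward zero, as in a
    standard proportional controller.
    """
    error_twist = np.asarray(error_twist, dtype=float)
    return -gain * error_twist

# Example: a residual error dominated by out-of-plane translation (z)
cmd = proportional_velocity_command([0.0, 0.0, 0.01, 0.0, 0.02, 0.0])
print(cmd)  # velocity command opposing the measured error
```

In practice such a command would be mapped to joint velocities through the robot Jacobian before being sent to the manipulator; that step is omitted here for brevity.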