Augmenting X-ray imaging with a 3D roadmap to improve guidance is a common strategy. Such approaches benefit from automated analysis of the X-ray images, such as automatic detection and tracking of instruments. In this paper, we propose a real-time method to segment the catheter and guidewire in 2D X-ray fluoroscopic sequences. The method is based on deep convolutional neural networks. The network takes as input the current image and the three previous ones, and segments the catheter and guidewire in the current image. Subsequently, a centerline model of the catheter is constructed from the segmented image. A small set of annotated data combined with data augmentation is used to train the network. We trained the method on images from 182 X-ray sequences from 23 different interventions. On a test set with images from 55 X-ray sequences from 5 other interventions, a median centerline distance error of 0.2 mm and a median tip distance error of 0.9 mm were obtained. The segmentation of the instruments in 2D X-ray sequences is performed fully automatically and in real time.
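The abstract mentions two ingredients that lend themselves to a short sketch: stacking the current frame with the three previous ones as the network input, and the centerline/tip distance errors used for evaluation. The following minimal NumPy illustration shows one plausible reading of both; the function names and the exact metric definitions are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

def stack_frames(frames, t):
    """Stack the current frame t with the three previous frames into a
    4-channel input array (channel order is an assumption)."""
    assert t >= 3, "need at least three previous frames"
    return np.stack([frames[t - 3], frames[t - 2],
                     frames[t - 1], frames[t]], axis=0)

def median_centerline_distance(pred_pts, gt_pts):
    """Median, over predicted centerline points, of the distance to the
    closest ground-truth centerline point (one plausible metric)."""
    # Pairwise distances between all predicted and ground-truth points.
    d = np.linalg.norm(pred_pts[:, None, :] - gt_pts[None, :, :], axis=-1)
    return float(np.median(d.min(axis=1)))

def tip_distance(pred_pts, gt_pts):
    """Distance between the predicted and ground-truth tip, taken here
    as the last point of each ordered centerline."""
    return float(np.linalg.norm(pred_pts[-1] - gt_pts[-1]))
```

For example, four 512x512 fluoroscopy frames would stack into a (4, 512, 512) input, and a predicted centerline shifted by 1 mm everywhere would yield a median centerline distance of 1 mm under this definition.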

Additional Metadata
Keywords Catheter, Convolutional neural network, Deep learning, Fluoroscopy, Guidewire, Segmentation, Tracking, X-ray
Persistent URL dx.doi.org/10.1007/978-3-319-66185-8_65, hdl.handle.net/1765/102015
Series Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Rights No subscription
Citation
Ambrosini, P., Ruijters, D., Niessen, W. J., Moelker, A., & van Walsum, T. W. (2017). Fully automatic and real-time catheter segmentation in X-ray fluoroscopy. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). doi:10.1007/978-3-319-66185-8_65