We present a three-dimensional bioimage analysis workflow to quantitatively analyze single, actin-stained cells with filopodial protrusions of diverse structural and temporal attributes, such as number, length, thickness, level of branching, and lifetime, in time-lapse confocal microscopy image data. Our workflow uses convolutional neural networks, trained on both real and synthetic image data, to segment cell volumes with highly heterogeneous fluorescence intensity levels and to detect individual filopodial protrusions. A constrained nearest-neighbor tracking algorithm then links the detections over time, yielding valuable information about the spatio-temporal evolution of individual filopodia. We validated the workflow on real and synthetic 3D time-lapse sequences of lung adenocarcinoma cells of three morphologically distinct filopodial phenotypes and show that it achieves reliable segmentation and tracking performance, providing a robust, reproducible, and less time-consuming alternative to manual analysis of the 3D+t image data.
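The constrained nearest-neighbor tracking mentioned in the abstract can be illustrated with a minimal sketch: greedy one-to-one linking of filopodium detections between consecutive frames, with a maximum-distance gate acting as the constraint. The function name, the centroid representation, and the gating threshold are illustrative assumptions, not the authors' implementation.

```python
import math

def link_nearest_neighbor(prev_tips, curr_tips, max_dist=2.0):
    """Greedily link 3D detections between two consecutive frames.

    prev_tips, curr_tips: lists of (x, y, z) centroids (assumed representation).
    max_dist: gating constraint; links longer than this are rejected.
    Returns a dict mapping indices in prev_tips to indices in curr_tips.
    """
    # Enumerate all candidate pairs, sorted by Euclidean distance.
    pairs = sorted(
        (math.dist(p, c), i, j)
        for i, p in enumerate(prev_tips)
        for j, c in enumerate(curr_tips)
    )
    links, used_prev, used_curr = {}, set(), set()
    for d, i, j in pairs:
        if d > max_dist:
            break  # remaining pairs are even farther apart
        if i in used_prev or j in used_curr:
            continue  # enforce one-to-one matching
        links[i] = j
        used_prev.add(i)
        used_curr.add(j)
    # Unmatched detections would start new tracks or terminate old ones.
    return links
```

Applied frame by frame over a 3D+t sequence, such a linker produces per-filopodium tracks from which lifetimes can be read off as the number of consecutively linked frames.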

3D skeletonization, actin cytoskeleton, biomedical imaging, cancer, Chan-Vese model, confocal microscopy, convolutional neural network, deep learning, filopodium segmentation and tracking, fluorescence, image segmentation, manuals, microscopy, three-dimensional displays
dx.doi.org/10.1109/TMI.2018.2873842, hdl.handle.net/1765/110938
IEEE Transactions on Medical Imaging
Biomedical Imaging Group Rotterdam

Castilla, C. (Carlos), Maška, M. (Martin), Sorokin, D.V. (Dmitry V.), Meijering, E.H.W. (Erwin), & Ortiz-de-Solorzano, C. (2018). Three-Dimensional Quantification of Filopodia in Motile Cancer Cells. IEEE Transactions on Medical Imaging. doi:10.1109/TMI.2018.2873842