In image registration the optimal transformation parameters of a given transformation model are typically obtained by minimizing a cost function. Stochastic gradient descent (SGD) is an efficient optimization algorithm for image registration. In SGD optimization, stochastic approximations of the cost function derivative are used in each iteration to update the transformation parameters. The stochastic approximation error leads to large variance in the parameters. To enforce convergence nonetheless, SGD methods are typically implemented in combination with a gradually decreasing update step size. However, selecting a proper sequence of step sizes is a major challenge in practice. An alternative strategy in numerical optimization is to use a constant step size and enforce convergence by averaging the parameters obtained by SGD over several iterations. It was proven mathematically that the highest possible rate of convergence is achieved in this way. Inspired by this work, we propose an averaged SGD (Avg-SGD) method for efficient image registration. In the Avg-SGD approach, a constant step size is used, in combination with an exponentially weighted iterate averaging scheme. Experiments on 3D lung CT scans demonstrate the effectiveness of the Avg-SGD method in terms of convergence rate, accuracy and precision.
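The core idea — constant-step SGD combined with an exponentially weighted average of the iterates — can be sketched on a toy 1-D cost function. Everything below (the quadratic cost, the noise level `sigma`, the step size, and the averaging weight `beta`) is an illustrative assumption, not the paper's actual registration cost or parameter settings.

```python
import random

def noisy_grad(theta, sigma=0.5):
    # Gradient of the toy cost f(theta) = 0.5 * theta**2, plus Gaussian
    # noise standing in for the stochastic approximation error.
    return theta + random.gauss(0.0, sigma)

def avg_sgd(theta0, step=0.1, beta=0.9, iters=2000, seed=0):
    """Constant-step SGD; the estimate returned alongside the raw last
    iterate is an exponentially weighted average of all iterates
    (a hypothetical variant of the Avg-SGD scheme described above)."""
    random.seed(seed)
    theta = theta0
    theta_avg = theta0
    for _ in range(iters):
        theta -= step * noisy_grad(theta)          # noisy SGD update
        theta_avg = beta * theta_avg + (1 - beta) * theta  # averaging
    return theta, theta_avg

last, averaged = avg_sgd(5.0)
# The averaged iterate is typically much closer to the minimizer (0)
# than the raw last iterate, whose variance stays bounded away from
# zero because the step size never decreases.
```

The averaging step damps the noise-induced oscillation of the raw iterates, which is why a constant step size can still yield a convergent estimate.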

Additional Metadata
Persistent URL: dx.doi.org/10.1007/978-3-319-92258-4_7, hdl.handle.net/1765/108940
Series: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Rights: no subscription
Citation
Sun, W. (Wei), Poot, D.H.J., Yang, X., Niessen, W.J., & Klein, S. (2018). Averaged stochastic optimization for medical image registration based on variance reduction. In Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics). doi:10.1007/978-3-319-92258-4_7