2016 JRNL PP Yuchuan Qiao - 1.pdf
Fast automatic image registration is an important
prerequisite for image-guided clinical procedures. However, due
to the large number of voxels in an image and the complexity of
registration algorithms, this process is often very slow. Stochastic
gradient descent is a powerful method for iteratively solving the registration
problem, but its convergence relies on a proper selection
of the optimization step size. This selection is difficult to perform
manually, since it depends on the input data, similarity measure
and transformation model. The Adaptive Stochastic Gradient
Descent (ASGD) method is an automatic approach, but it comes
at a high computational cost. In this paper, we propose a new
computationally efficient method (fast ASGD) to automatically
determine the step size for gradient descent methods, by considering
the observed distribution of the voxel displacements between
iterations. A relation between the step size and the expectation
and variance of the observed distribution is derived. While ASGD
has quadratic complexity with respect to the transformation
parameters, fast ASGD only has linear complexity. Extensive
validation was performed on datasets of different modalities, for
inter- and intra-subject registration, and with different similarity measures and
transformation models. In all experiments, we obtained accuracy
similar to that of ASGD. Moreover, the step-size estimation time of fast ASGD is
greatly reduced, from 40 s to less than 1 s when the
number of parameters is 10^5: almost 40 times faster. Depending
on the registration settings, the total registration time is reduced
by a factor of 2.5–7 for the experiments in this paper.
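The abstract's core idea, choosing the step size from the statistics of observed voxel displacements, can be illustrated with a minimal sketch. This is not the paper's exact derivation: the probe procedure, the target displacement `delta`, the mean-plus-two-standard-deviations bound, and the toy quadratic cost are all illustrative assumptions.

```python
import math
import random

def estimate_step_size(grad_samples, delta=1.0):
    """Pick a step size gamma so that the displacement per iteration,
    bounded by the mean plus two standard deviations of the observed
    gradient magnitudes, stays below a target delta (in voxels).
    Hedged sketch; the paper derives its own relation between the step
    size and the expectation/variance of the displacement distribution."""
    mags = [math.sqrt(sum(g_i * g_i for g_i in g)) for g in grad_samples]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    bound = mean + 2.0 * math.sqrt(var)
    return delta / bound if bound > 0 else delta

def sgd(grad_fn, x0, n_iter=200, delta=1.0, n_probe=10, seed=0):
    """Plain stochastic gradient descent whose step size is estimated
    once from a few probe gradients, then kept fixed."""
    rng = random.Random(seed)
    x = list(x0)
    probes = [grad_fn(x, rng) for _ in range(n_probe)]
    gamma = estimate_step_size(probes, delta)
    for _ in range(n_iter):
        g = grad_fn(x, rng)
        x = [x_i - gamma * g_i for x_i, g_i in zip(x, g)]
    return x

# Toy stochastic gradient of f(x) = 0.5 * ||x||^2 with additive noise,
# standing in for a noisy registration-cost gradient.
def noisy_grad(x, rng):
    return [x_i + rng.gauss(0.0, 0.1) for x_i in x]

x_opt = sgd(noisy_grad, [5.0, -3.0])
```

Note that the probe gradients are evaluated at the initial parameters only, so the estimation cost is a fixed number of gradient calls, independent of the iteration count; this mirrors (loosely) why a cheaper estimation step dominates the speedup reported above.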