Learning the Effect of Registration Hyperparameters with HyperMorph

Andrew Hoopes¹ (ORCID: 0000-0002-7583-5972), Malte Hoffmann¹,² (ORCID: 0000-0002-5511-0739), Douglas N. Greve¹,², Bruce Fischl¹,²,³ (ORCID: 0000-0002-2413-1115), John Guttag³ (ORCID: 0000-0003-0992-0906), Adrian V. Dalca¹,²,³ (ORCID: 0000-0002-8422-0136)
1: Massachusetts General Hospital, 2: Harvard Medical School, 3: Massachusetts Institute of Technology
Publication date: 2022/04/07
https://doi.org/10.59275/j.melba.2022-74f1

Abstract

We introduce HyperMorph, a framework that facilitates efficient hyperparameter tuning in learning-based deformable image registration. Classical registration algorithms perform an iterative pairwise optimization to compute a deformation field that aligns two images. Recent learning-based approaches leverage large image datasets to learn a function that rapidly estimates a deformation for a given image pair. In both strategies, the accuracy of the resulting spatial correspondences is strongly influenced by the choice of certain hyperparameter values. However, an effective hyperparameter search consumes substantial time and human effort, as it often involves training multiple models for different fixed hyperparameter values, and it may still lead to suboptimal registration. We propose an amortized hyperparameter learning strategy that alleviates this burden by learning the impact of hyperparameters on deformation fields. We design a meta network, or hypernetwork, that predicts the parameters of a registration network for input hyperparameters, yielding a single model that generates the optimal deformation field for given hyperparameter values. This strategy enables fast, high-resolution hyperparameter search at test time, reducing the inefficiency of traditional approaches while increasing flexibility. We also demonstrate additional benefits of HyperMorph, including enhanced robustness to model initialization and the ability to rapidly identify optimal hyperparameter values specific to a dataset, modality, task, or even anatomical region, all without the need to retrain models. Our code is publicly available at http://hypermorph.voxelmorph.net.
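The abstract describes the core mechanism only at a high level, so the following is a minimal, hypothetical sketch of the idea in PyTorch: a small hypernetwork maps a scalar hyperparameter to the weights of a toy 2D registration network, and the hyperparameter is sampled anew at each training step. The layer sizes, the warp and smoothness helpers, and the (1 - lambda) * similarity + lambda * smoothness loss split are illustrative assumptions, not the authors' released implementation (see http://hypermorph.voxelmorph.net for that).

# Hypothetical sketch of amortized hyperparameter learning for registration.
# Not the authors' implementation: layer sizes, the warp/smoothness helpers,
# and the loss weighting are simplifying assumptions for illustration only.
import math
import torch
import torch.nn.functional as F


class HyperRegNet(torch.nn.Module):
    """Toy 2D registration net whose conv weights are predicted by a hypernetwork."""

    def __init__(self, feats=8):
        super().__init__()
        # Weight shapes the hypernetwork must produce:
        # conv1: (moving, fixed) 2 channels -> feats; conv2: feats -> 2D displacement.
        self.shapes = [(feats, 2, 3, 3), (feats,), (2, feats, 3, 3), (2,)]
        n_out = sum(math.prod(s) for s in self.shapes)
        # Hypernetwork: scalar hyperparameter -> flat vector of all conv weights.
        self.hyper = torch.nn.Sequential(
            torch.nn.Linear(1, 64), torch.nn.ReLU(),
            torch.nn.Linear(64, n_out),
        )

    def forward(self, moving, fixed, lam):
        flat = self.hyper(lam.view(1, 1)).squeeze(0)
        weights, i = [], 0
        for s in self.shapes:  # unpack the flat vector into conv weights and biases
            n = math.prod(s)
            weights.append(flat[i:i + n].view(s))
            i += n
        x = torch.cat([moving, fixed], dim=1)
        x = F.relu(F.conv2d(x, weights[0], weights[1], padding=1))
        return F.conv2d(x, weights[2], weights[3], padding=1)  # displacement field


def warp(image, flow):
    """Warp an image with a dense displacement field via bilinear sampling."""
    _, _, h, w = image.shape
    ys, xs = torch.meshgrid(torch.arange(h, dtype=torch.float32),
                            torch.arange(w, dtype=torch.float32), indexing="ij")
    grid = torch.stack([2 * (xs + flow[:, 0]) / (w - 1) - 1,   # x in [-1, 1]
                        2 * (ys + flow[:, 1]) / (h - 1) - 1],  # y in [-1, 1]
                       dim=-1)
    return F.grid_sample(image, grid, align_corners=True)


def smoothness(flow):
    """Squared spatial gradients of the displacement field (regularizer)."""
    dx = flow[..., 1:, :] - flow[..., :-1, :]
    dy = flow[..., :, 1:] - flow[..., :, :-1]
    return dx.pow(2).mean() + dy.pow(2).mean()


model = HyperRegNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-4)
moving, fixed = torch.rand(1, 1, 32, 32), torch.rand(1, 1, 32, 32)  # stand-in pair

for step in range(200):
    lam = torch.rand(1)  # sample a regularization weight each training iteration
    flow = model(moving, fixed, lam)
    loss = (1 - lam) * F.mse_loss(warp(moving, flow), fixed) + lam * smoothness(flow)
    opt.zero_grad()
    loss.backward()
    opt.step()

# At test time, sweeping lam over [0, 1] yields deformations for many
# regularization settings from this single trained model.

Because the hyperparameter is an input rather than a fixed training constant, one trained model covers the whole hyperparameter range, which is what enables the fast test-time search described above.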

Keywords

hyperparameter search · deformable image registration · deep learning · weight sharing · amortized learning · regularization · hypernetworks

Bibtex

@article{melba:2022:003:hoopes,
  title   = "Learning the Effect of Registration Hyperparameters with HyperMorph",
  author  = "Hoopes, Andrew and Hoffmann, Malte and Greve, Douglas N. and Fischl, Bruce and Guttag, John and Dalca, Adrian V.",
  journal = "Machine Learning for Biomedical Imaging",
  volume  = "1",
  issue   = "IPMI 2021 special issue",
  year    = "2022",
  pages   = "1--30",
  issn    = "2766-905X",
  doi     = "https://doi.org/10.59275/j.melba.2022-74f1",
  url     = "https://melba-journal.org/2022:003"
}
RIS

TY  - JOUR
AU  - Hoopes, Andrew
AU  - Hoffmann, Malte
AU  - Greve, Douglas N.
AU  - Fischl, Bruce
AU  - Guttag, John
AU  - Dalca, Adrian V.
PY  - 2022
TI  - Learning the Effect of Registration Hyperparameters with HyperMorph
T2  - Machine Learning for Biomedical Imaging
VL  - 1
IS  - IPMI 2021 special issue
SP  - 1
EP  - 30
SN  - 2766-905X
DO  - https://doi.org/10.59275/j.melba.2022-74f1
UR  - https://melba-journal.org/2022:003
ER  -
