Integrating Semi-supervised and Supervised Learning Methods for Label Fusion in Multi-Atlas Based Image Segmentation

Bibliographic Details
Main Authors: Qiang Zheng, Yihong Wu, Yong Fan
Format: Article
Language: English
Published: Frontiers Media S.A. 2018-10-01
Series: Frontiers in Neuroinformatics
Online Access: https://www.frontiersin.org/article/10.3389/fninf.2018.00069/full
Description
Summary: A novel label fusion method for multi-atlas based image segmentation is developed by integrating semi-supervised and supervised machine learning techniques. In particular, our method is developed within a pattern-recognition-based multi-atlas label fusion framework. We build a random forest classification model for each image voxel to be segmented, trained on the corresponding image patches of atlas images that have been registered to the image to be segmented. The voxelwise random forest classification models are then applied to the image to be segmented to obtain a probabilistic segmentation map. Finally, a semi-supervised label propagation method is adapted to refine the probabilistic segmentation map by propagating its reliable voxelwise segmentation labels, taking into account the consistency of local and global image appearance in the image to be segmented. The proposed method has been evaluated for segmenting the hippocampus in MR images and compared with alternative machine-learning-based multi-atlas image segmentation methods. The experimental results demonstrate that our method obtains competitive segmentation performance (average Dice index > 0.88) relative to the alternative methods under comparison. Source code for the methods under comparison is publicly available at www.nitrc.org/frs/?group_id=1242.
ISSN: 1662-5196
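
The abstract outlines a two-stage pipeline: voxelwise random forest label fusion over registered atlas patches, followed by semi-supervised refinement of the resulting probability map. The sketch below illustrates that pipeline in miniature on synthetic volumes. It is an assumption-laden illustration, not the authors' released code (see the NITRC link above): scikit-learn's RandomForestClassifier and LabelSpreading, the 3x3x3 patch size, and the 0.9/0.1 reliability thresholds are all stand-ins chosen for clarity, and LabelSpreading substitutes for the paper's appearance-consistency-based label propagation.

```python
# Minimal sketch of the two-stage idea in the abstract, NOT the authors'
# released implementation. All volumes are synthetic; patch size, forest
# size, and reliability thresholds are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.semi_supervised import LabelSpreading

rng = np.random.default_rng(0)

# Synthetic stand-ins: 5 atlases already registered to the target image,
# each with an intensity volume and a binary label volume.
n_atlases, shape, half = 5, (12, 12, 12), 1          # 3x3x3 patches
atlas_imgs = rng.normal(size=(n_atlases, *shape))
atlas_lbls = (rng.random((n_atlases, *shape)) > 0.5).astype(int)
target_img = rng.normal(size=shape)

def patch(vol, x, y, z):
    """Flatten the local intensity patch around voxel (x, y, z)."""
    return vol[x-half:x+half+1, y-half:y+half+1, z-half:z+half+1].ravel()

# Stage 1: a voxelwise random forest trained on the corresponding atlas
# patches yields a probabilistic segmentation map for the target image.
prob = np.zeros(shape)
interior = range(half, shape[0] - half)
for x in interior:
    for y in interior:
        for z in interior:
            X = [patch(atlas_imgs[a], x, y, z) for a in range(n_atlases)]
            y_lbl = [atlas_lbls[a, x, y, z] for a in range(n_atlases)]
            if len(set(y_lbl)) == 1:                 # atlases agree: no model needed
                prob[x, y, z] = y_lbl[0]
                continue
            rf = RandomForestClassifier(n_estimators=10, random_state=0)
            rf.fit(X, y_lbl)
            prob[x, y, z] = rf.predict_proba([patch(target_img, x, y, z)])[0, 1]

# Stage 2: refine the map by propagating only the reliable voxelwise
# labels; LabelSpreading over intensity + position features stands in for
# the paper's propagation based on local/global appearance consistency.
feats = np.column_stack([target_img.ravel(),
                         *np.indices(shape).reshape(3, -1)])
labels = np.full(prob.size, -1)                      # -1 marks unlabeled voxels
labels[prob.ravel() > 0.9] = 1                       # reliable foreground seeds
labels[prob.ravel() < 0.1] = 0                       # reliable background seeds
seg = LabelSpreading(kernel='knn', n_neighbors=7).fit(feats, labels)
refined = seg.transduction_.reshape(shape)           # refined binary segmentation
print("foreground voxels:", int(refined.sum()))
```

The key design point the sketch preserves is that only high-confidence voxels from stage 1 act as labeled seeds in stage 2, so the semi-supervised step can overturn unreliable voxelwise decisions while leaving confident ones intact.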