Interactive Cell Segmentation based on Active and Semi-supervised Learning

Author(s): Hang Su, Zhaozheng Yin, Seungil Huh, Takeo Kanade, and Jun Zhu
Date of Publication: March 2016

Automatic cell segmentation is rarely flawless due to the complexity of image data, particularly in long-running time-lapse experiments without biomarkers. To address this issue, we propose an interactive cell segmentation method that classifies feature-homogeneous superpixels into specific classes, guided by human interventions. Specifically, we actively select the most informative superpixels by minimizing the expected prediction error, which is upper bounded by the transductive Rademacher complexity, and then query humans for annotations. After propagating the user-specified labels to the remaining unlabeled superpixels via an affinity graph, error-prone superpixels are selected automatically and presented for human verification; once erroneous segmentation is detected and corrected, the correction is propagated efficiently over a gradually augmented graph to the unlabeled superpixels, so that analogous errors are fixed at the same time. This correction propagation step is performed efficiently by introducing a verification propagation matrix, rather than rebuilding the affinity graph and re-running the label propagation from scratch. We repeat this procedure until most superpixels are classified into a specific category with high confidence. Experimental results on three types of cell populations validate that our interactive cell segmentation algorithm quickly reaches high-quality results with minimal human intervention and is significantly more efficient than alternative methods, since the most informative samples are selected for human annotation/verification early.
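To illustrate the label-propagation step described above, the following is a minimal sketch of semi-supervised propagation of a few human-annotated superpixel labels over an affinity graph, using the standard closed-form solution F = (I - αS)^{-1} Y0 with a symmetrically normalized affinity matrix. This is a generic sketch, not the paper's exact formulation: the affinity weights, the α value, and the `propagate_labels` helper are illustrative assumptions, and the paper's active selection and verification propagation matrix are not shown.

```python
import numpy as np

def propagate_labels(W, labels, alpha=0.99, n_classes=2):
    """Propagate sparse annotations over an affinity graph.

    W       : (n, n) symmetric affinity matrix over superpixels
    labels  : length-n array; class id for annotated superpixels, -1 otherwise
    alpha   : propagation strength in (0, 1)
    Returns predicted class per superpixel and the soft score matrix.
    """
    n = W.shape[0]
    # Symmetrically normalize the affinities: S = D^{-1/2} W D^{-1/2}
    d = W.sum(axis=1)
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, 1e-12))
    S = W * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    # One-hot seed matrix from the human annotations
    Y0 = np.zeros((n, n_classes))
    for i, c in enumerate(labels):
        if c >= 0:
            Y0[i, c] = 1.0
    # Closed-form propagation: F = (I - alpha * S)^{-1} Y0
    F = np.linalg.solve(np.eye(n) - alpha * S, Y0)
    return F.argmax(axis=1), F

# Toy example: 4 superpixels forming two tight pairs, one seed label each
W = np.array([[0.0, 1.0, 0.1, 0.0],
              [1.0, 0.0, 0.1, 0.0],
              [0.1, 0.1, 0.0, 1.0],
              [0.0, 0.0, 1.0, 0.0]])
pred, scores = propagate_labels(W, np.array([0, -1, -1, 1]))
# Unlabeled superpixels inherit the label of their strongly connected seed
```

In the interactive loop of the paper, this propagation is not recomputed from scratch after each correction; instead, a verification propagation matrix updates the scores incrementally as the graph is augmented.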

Citation: Su, H., Yin, Z., Huh, S., Kanade, T., & Zhu, J. (2016). Interactive cell segmentation based on active and semi-supervised learning. IEEE Transactions on Medical Imaging, 35(3), 762-777.
Team(s): Plant Team