Robust 3D Cell Segmentation: Extending the View of Cellpose
Increasing data set sizes in digital microscopy imaging experiments demand automated segmentation in order to extract meaningful biomedical information. Due to the shortage of annotated 3D image data that can be used for machine learning-based approaches, 3D segmentation methods need to be robust and to generalize well to unseen data. Reformulating instance segmentation as the prediction of diffusion gradient maps, as done by Cellpose, has proven to be such a generalist approach for cell segmentation tasks. In this paper, we extend the Cellpose approach to improve segmentation accuracy on 3D image data, and we further show how the formulation of the gradient maps can be simplified while remaining robust and reaching similar segmentation accuracy. We quantitatively compare different experimental setups and validate them on two data sets of 3D confocal microscopy images of A. thaliana.
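To illustrate the idea of simplified gradient maps, the following is a minimal sketch (not the authors' exact formulation, which is detailed in the full text): instead of flows obtained from heat diffusion as in Cellpose, each foreground voxel is assigned a unit vector pointing towards the centroid of its cell. The function name `compute_centroid_flows` and the toy example are hypothetical.

```python
import numpy as np

def compute_centroid_flows(labels: np.ndarray) -> np.ndarray:
    """Simplified 3D gradient (flow) target maps.

    For every labeled cell, each voxel stores a unit vector pointing
    towards the cell centroid; background voxels stay zero. This is a
    simplified stand-in for the diffusion-based flows used by Cellpose.
    """
    flows = np.zeros((3, *labels.shape), dtype=np.float32)
    for cell_id in np.unique(labels):
        if cell_id == 0:  # skip background
            continue
        zz, yy, xx = np.nonzero(labels == cell_id)
        centroid = np.array([zz.mean(), yy.mean(), xx.mean()])
        # Vectors from each voxel of the cell towards its centroid
        vec = centroid[:, None] - np.stack([zz, yy, xx]).astype(np.float32)
        norm = np.linalg.norm(vec, axis=0)
        norm[norm == 0] = 1.0  # avoid division by zero at the centroid voxel
        vec /= norm
        flows[:, zz, yy, xx] = vec
    return flows

# Toy example: two synthetic "cells" in a small 3D volume
labels = np.zeros((8, 8, 8), dtype=np.int32)
labels[1:4, 1:4, 1:4] = 1
labels[4:7, 4:7, 4:7] = 2
flows = compute_centroid_flows(labels)
print(flows.shape)  # (3, 8, 8, 8): one gradient component per spatial axis
```

Such target maps can be regressed by a 3D network, and instances are recovered at inference time by following the predicted gradients to their fixed points, analogous to the Cellpose post-processing.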