3D Nuclei Segmentation by Combining GAN Based Image Synthesis and Existing 3D Manual Annotations
Published by CCSD; SCITEPRESS
International audience. Nuclei segmentation is an important task in cell analysis that requires accurate and reliable segmentation methods. In this context, deep-learning-based methods such as Stardist have emerged as the best-performing solutions for segmenting nuclei. Unfortunately, using them in 3D requires life scientists to create new hand-annotated data, a tedious task, especially in the presence of a crowded population with overlapping nuclei. In this work, we present a workflow to segment nuclei in 3D when no specific ground truth exists. Our workflow is composed of three steps: first, we use a pre-trained 2D Stardist model to segment every frame of the 3D microscopy volume. We then train a conditional GAN on these paired microscopy and mask frames to transfer the microscopy style to the masks. This GAN is used to generate fluorescence volumes from existing 3D ground-truth mask data. Finally, we train Stardist in 3D on these pairs of synthetic volumes and 3D ground-truth masks. We show that this strategy makes it possible to segment data for which no ground truth is available, improving on the results obtained by training Stardist with the original ground-truth data.
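The first step of the workflow above — running a pre-trained 2D segmenter slice by slice over a 3D stack to build paired (image, mask) frames for GAN training — can be sketched as follows. This is a minimal illustration, not the authors' implementation: `segment_frame_2d` is a hypothetical stand-in for a pre-trained 2D Stardist model's prediction call, replaced here by a toy threshold so the sketch stays self-contained.

```python
import numpy as np

def segment_frame_2d(frame, threshold=0.5):
    # Hypothetical stand-in for a pre-trained 2D nuclei segmenter
    # (e.g., a Stardist model); a simple threshold yields a toy
    # uint16 label mask in the same shape as the input frame.
    return (frame > threshold).astype(np.uint16)

def pseudo_label_volume(volume):
    # Step 1 of the workflow: segment every z-frame of the 3D
    # microscopy volume with the 2D model, then pair each frame
    # with its predicted mask for conditional-GAN training.
    masks = np.stack([segment_frame_2d(z) for z in volume], axis=0)
    return list(zip(volume, masks))

rng = np.random.default_rng(0)
volume = rng.random((4, 64, 64)).astype(np.float32)  # toy (z, y, x) stack
pairs = pseudo_label_volume(volume)
print(len(pairs), pairs[0][1].dtype)
```

In the actual workflow, the resulting frame/mask pairs train a conditional GAN that renders fluorescence-style images from masks, which in turn synthesizes volumes from existing 3D ground-truth masks for 3D Stardist training.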