DeepSynth: Three-dimensional nuclear segmentation of biological images using neural networks trained with synthetic data

Kenneth W. Dunn, Chichen Fu, David Joon Ho, Soonam Lee, Shuo Han, Paul Salama, Edward J. Delp

Research output: Contribution to journal › Article


Abstract

The scale of biological microscopy has increased dramatically over the past ten years, with the development of new modalities supporting collection of high-resolution fluorescence image volumes spanning hundreds of microns, if not millimeters. The size and complexity of these volumes is such that quantitative analysis requires automated methods of image processing to identify and characterize individual cells. For many workflows, this process starts with segmentation of nuclei, which, due to their ubiquity, ease of labeling, and relatively simple structure, are appealing targets for automated detection of individual cells. However, in the context of large, three-dimensional image volumes, nuclei present many challenges to automated segmentation, such that conventional approaches are seldom effective and/or robust. Techniques based upon deep learning have shown great promise, but enthusiasm for applying these techniques is tempered by the need to generate training data, an arduous task, particularly in three dimensions. Here we present results of a new technique for nuclear segmentation using neural networks trained on synthetic data. Comparisons with results obtained using commonly used image processing packages demonstrate that DeepSynth provides the superior results associated with deep-learning techniques without the need for manual annotation.
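The key idea in the abstract is that paired training data (image volume plus ground-truth labels) can be generated programmatically rather than by manual annotation. The sketch below is not the paper's actual synthesis pipeline; it is a minimal toy illustration, assuming ellipsoidal nuclei placed at random in a 3-D volume with additive noise, using only NumPy. The function name and all parameters are hypothetical.

```python
import numpy as np

def synthetic_nuclei_volume(shape=(32, 64, 64), n_nuclei=10,
                            radius_range=(3, 6), seed=0):
    """Toy generator of a 3-D 'microscopy' volume and its label mask.

    Illustrative only: real synthetic-data pipelines model nucleus shape,
    texture, and microscope blur far more carefully. The point is that the
    image and its ground truth are produced together, so no manual
    annotation is required to train a segmentation network.
    """
    rng = np.random.default_rng(seed)
    zz, yy, xx = np.indices(shape)
    labels = np.zeros(shape, dtype=np.uint8)
    for _ in range(n_nuclei):
        # Random center and per-axis radii for one ellipsoidal "nucleus"
        cz, cy, cx = (rng.integers(0, s) for s in shape)
        rz, ry, rx = rng.integers(radius_range[0], radius_range[1] + 1, size=3)
        inside = (((zz - cz) / rz) ** 2 +
                  ((yy - cy) / ry) ** 2 +
                  ((xx - cx) / rx) ** 2) <= 1.0
        labels[inside] = 1
    # Crude "microscope" appearance: bright foreground plus Gaussian noise
    image = labels.astype(np.float32) * 200.0 + rng.normal(20.0, 5.0, shape)
    return image.astype(np.float32), labels

image, labels = synthetic_nuclei_volume()
```

The `(image, labels)` pair could then serve as one training sample for a 3-D segmentation network; generating many such volumes with varied shapes and noise levels stands in for the manual annotation step the abstract describes as arduous.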

Original language: English (US)
Article number: 18295
Journal: Scientific Reports
Volume: 9
Issue number: 1
DOIs
State: Published - Dec 1 2019

ASJC Scopus subject areas

  • General

