One of the most challenging problems in image analysis is the automated analysis of dynamic imaging data, in particular the tracking and quantification of multiple objects in dense environments — such as cell or organ cultures monitored over time. Cells migrate, divide and die, at times overlapping or staying in contact with each other, thus requiring time-consuming manual curation of large datasets.
The challenges in such an analysis are:
— the variability of the objects of interest, both within an image and across different images and samples, as well as their evolution in time, and hence of the characteristic properties the algorithms should look for when tracking or quantifying them.
— the significant overlap and similarity between adjacent objects in dense two-dimensional or three-dimensional cultures and hence the difficulty in delineating individual ones.
— the occurrence of cell division and cell death events, which require the classification of events and the generation of cell lineages, and may confound the tracking of the original objects.
— the computational burden when dealing with multi-dimensional datasets (e.g., time, 3D coordinates and spectroscopic information) that can reach several terabytes of information per experiment.
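The scale of the last challenge is easy to make concrete with a back-of-the-envelope calculation. The sketch below estimates the raw size of an uncompressed multi-dimensional acquisition; all parameter values (frame count, stack dimensions, channels, bit depth) are illustrative, not taken from a specific experiment.

```python
def dataset_size_bytes(frames, z, y, x, channels, bytes_per_voxel=2):
    """Raw size of an uncompressed multi-dimensional acquisition
    (time x z x y x x x channels, default 16-bit voxels)."""
    return frames * z * y * x * channels * bytes_per_voxel

# Hypothetical light-sheet time-lapse: 1000 time points, 200 z-slices,
# 2048 x 2048 pixels per slice, 2 channels, 16-bit depth.
size = dataset_size_bytes(frames=1000, z=200, y=2048, x=2048, channels=2)
print(f"{size / 1e12:.2f} TB")  # ~3.36 TB for a single experiment
```

Even these modest acquisition settings already land in the multi-terabyte regime, which is why algorithmic efficiency is listed as a challenge in its own right.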
Dynamic image analysis, in particular the automated segmentation, tracking and annotation of cells and cellular events, is currently an active topic in imaging research. Many groups from different backgrounds (mathematics, engineering, computational biology, computer science, …) are pursuing it with promising results; see for instance [Neumann et al. 2010, Amat et al. 2015]. However, the accuracy and reliability of such techniques must be improved, and their applicability must be extended to images of biological samples lacking specific stains for tracking and to large multi-modality images. This effort is necessary for effective application to biomedical problems.
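To fix ideas about what "tracking" means at its most basic, the following sketch links cell centroids between two consecutive frames by greedy nearest-neighbour matching. This is a standard baseline, not the method of the cited works or of this project; the coordinates and distance threshold are made up for illustration.

```python
import math

def link_nearest(prev, curr, max_dist=20.0):
    """Greedily link each centroid in frame t to its nearest unclaimed
    centroid in frame t+1, within max_dist. Returns (prev_idx, curr_idx)
    pairs. A deliberately simple baseline: it cannot represent the
    divisions, deaths and dense overlaps discussed above."""
    links, used = [], set()
    for i, (px, py) in enumerate(prev):
        best, best_d = None, max_dist
        for j, (cx, cy) in enumerate(curr):
            if j in used:
                continue
            d = math.hypot(cx - px, cy - py)
            if d < best_d:
                best, best_d = j, d
        if best is not None:
            links.append((i, best))
            used.add(best)
    return links

# Usage: two frames, one cell moved slightly, one left the threshold
prev = [(10.0, 10.0), (50.0, 50.0)]
curr = [(12.0, 11.0), (80.0, 80.0)]
print(link_nearest(prev, curr))  # [(0, 0)]
```

The failure mode is visible even in this toy example: the second cell is simply dropped, which is exactly where event classification and physical priors are needed.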
In this project we will develop mathematical analysis combined with machine learning techniques that are capable of an automated, reliable and computationally feasible analysis of dynamic microscopy imaging data. In particular, we will investigate how supervised learning from manually contoured and tracked cells can be combined with informed physical models that encode biological models of cell dynamics, such as the elasticity of cell walls, mutual repulsion forces of neighbouring cells, migratory behaviours and cell fate choices, in order to improve the analysis of complex biomedical images.
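As a minimal illustration of the kind of physical prior mentioned above, the sketch below performs one explicit Euler step of a toy soft-repulsion model in which overlapping cells push each other apart. It is an assumption-laden caricature (point cells, a linear overlap force, arbitrary radius and strength), not the model this project will develop.

```python
import math

def repulsion_step(positions, radius=10.0, strength=0.5, dt=1.0):
    """One explicit Euler step of a toy mutual-repulsion model:
    cells closer than 2*radius exert equal and opposite forces
    proportional to their overlap. All parameters are illustrative."""
    forces = [[0.0, 0.0] for _ in positions]
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            dx = positions[i][0] - positions[j][0]
            dy = positions[i][1] - positions[j][1]
            d = math.hypot(dx, dy) or 1e-9  # avoid division by zero
            overlap = 2 * radius - d
            if overlap > 0:
                fx = strength * overlap * dx / d
                fy = strength * overlap * dy / d
                forces[i][0] += fx; forces[i][1] += fy
                forces[j][0] -= fx; forces[j][1] -= fy
    return [(x + dt * fx, y + dt * fy)
            for (x, y), (fx, fy) in zip(positions, forces)]

# Usage: two overlapping cells (10 units apart, diameter 20) are pushed
# apart until they no longer overlap.
cells = [(0.0, 0.0), (10.0, 0.0)]
print(repulsion_step(cells))  # [(-5.0, 0.0), (15.0, 0.0)]
```

A prior of this form could, for example, penalise tracking hypotheses that place segmented cells in physically implausible overlapping configurations; how such terms are best coupled to the learned components is precisely the research question.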
[Neumann et al. 2010] Neumann, Beate, et al. “Phenotypic profiling of the human genome by time-lapse microscopy reveals cell division genes.” Nature 464.7289 (2010): 721-727.
[Amat et al. 2015] Amat, Fernando, et al. “Efficient processing and analysis of large-scale light-sheet microscopy data.” Nature protocols 10.11 (2015): 1679-1696.