We report an integrative unsupervised deep learning approach that translates the complex morphological information of cells into interpretable representations that generalize to downstream single-cell analytics. The method, which integrates the respective advantages of a generative adversarial network (GAN) and a variational autoencoder (VAE), enables faithful prediction of biological processes from cell morphology read out across different imaging modalities. We demonstrate the generalizability and scalability of this method in a diverse range of applications, including cellular responses to SARS-CoV-2 infection, cell-cycle progression imaged by high-throughput quantitative phase imaging (QPI), and cellular changes during the epithelial-to-mesenchymal transition (EMT) captured by fluorescence imaging.
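To make the GAN-VAE integration concrete, the sketch below shows one common way the two objectives are coupled in a VAE-GAN: the VAE contributes a reconstruction term and a KL regularizer on the latent space, while the GAN's discriminator contributes an adversarial term. This is a schematic illustration with toy NumPy arrays, not the authors' implementation; all function names and the specific loss weighting are assumptions.

```python
import numpy as np

def kl_divergence(mu, log_var):
    """KL(q(z|x) || N(0, I)) for a diagonal-Gaussian encoder (standard VAE term)."""
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)

def vae_gan_losses(x, x_recon, mu, log_var, d_real, d_fake):
    """Combine the three terms typically trained jointly in a VAE-GAN.

    d_real / d_fake are discriminator probabilities in (0, 1) for the
    input image and its reconstruction, respectively.  (Hypothetical
    helper, for illustration only.)
    """
    recon = np.mean((x - x_recon) ** 2)           # reconstruction fidelity
    kl = kl_divergence(mu, log_var)               # regularizes the latent space
    adv = -np.log(d_fake + 1e-12)                 # generator tries to fool the discriminator
    disc = -(np.log(d_real + 1e-12) + np.log(1.0 - d_fake + 1e-12))
    return recon + kl + adv, disc                 # (generator loss, discriminator loss)

# Toy example: random arrays stand in for an image and its reconstruction.
rng = np.random.default_rng(0)
x = rng.normal(size=(64, 64))
gen_loss, disc_loss = vae_gan_losses(
    x, x + 0.1 * rng.normal(size=(64, 64)),
    mu=np.zeros(16), log_var=np.zeros(16),
    d_real=0.9, d_fake=0.2,
)
```

In this framing, the encoder's latent vector serves as the interpretable single-cell representation, while the adversarial term sharpens reconstructions beyond what a pixel-wise loss alone provides.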