Matches in SemOpenAlex for { <https://semopenalex.org/work/W3037924247> ?p ?o ?g. }
- W3037924247 abstract "ABSTRACT Cell motility is a crucial biological function for many cell types, including the immune cells in our body that act as first responders to foreign agents. In this work we consider the amoeboid motility of human neutrophils, which show complex and continuous morphological changes during locomotion. We imaged live neutrophils migrating on a 2D plane and extracted unbiased shape representations using cell contours and binary masks. We were able to decompose these complex shapes into low-dimensional encodings with both principal component analysis (PCA) and an unsupervised deep learning technique using variational autoencoders (VAE), enhanced with generative adversarial networks (GANs). We found that the neural network architecture, the VAE-GAN, was able to encode complex cell shapes into a low-dimensional latent space that encodes the same shape variation information as PCA, but much more efficiently. Contrary to the conventional viewpoint that the latent space is a “black box”, we demonstrated that the information learned and encoded within the latent space is consistent with PCA and is reproducible across independent training runs. Furthermore, by including cell speed into the training of the VAE-GAN, we were able to incorporate cell shape and speed into the same latent space. Our work provides a quantitative framework that connects biological form, through cell shape, to a biological function, cell movement. We believe that our quantitative approach to calculating a compact representation of cell shape using the VAE-GAN provides an important avenue that will support further mechanistic dissection of cell motility. AUTHOR SUMMARY Deep convolutional neural networks have recently enjoyed a surge in popularity, and have found useful applications in many fields, including biology. Supervised deep learning, which involves the training of neural networks using existing labeled data, has been especially popular in solving image classification problems. However, biological data is often highly complex and continuous in nature, where prior labeling is impractical, if not impossible. Unsupervised deep learning promises to discover trends in the data by reducing its complexity while retaining the most relevant information. At present, challenges in the extraction of meaningful human-interpretable information from the neural network’s nonlinear discovery process have earned it a reputation of being a “black box” that can perform impressively well at prediction but cannot be used to shed any meaningful insight on underlying mechanisms of variation in biological data sets. Our goal in this paper is to establish unsupervised deep learning as a practical tool to gain scientific insight into biological data by first establishing the interpretability of our particular data set (images of the shapes of motile neutrophils) using more traditional techniques. Using the insight gained from this as a guide allows us to shine light into the “black box” of unsupervised deep learning." @default.
- W3037924247 created "2020-07-02" @default.
- W3037924247 creator A5025484390 @default.
- W3037924247 creator A5053975684 @default.
- W3037924247 creator A5063689593 @default.
- W3037924247 creator A5078436996 @default.
- W3037924247 date "2020-06-27" @default.
- W3037924247 modified "2023-10-17" @default.
- W3037924247 title "Quantitative comparison of principal component analysis and unsupervised deep learning using variational autoencoders for shape analysis of motile cells" @default.
- W3037924247 cites W1773256867 @default.
- W3037924247 cites W1873247595 @default.
- W3037924247 cites W1966746696 @default.
- W3037924247 cites W1978377738 @default.
- W3037924247 cites W1989571703 @default.
- W3037924247 cites W1995824548 @default.
- W3037924247 cites W2008063578 @default.
- W3037924247 cites W2012177884 @default.
- W3037924247 cites W2035538295 @default.
- W3037924247 cites W2047903964 @default.
- W3037924247 cites W2052842431 @default.
- W3037924247 cites W2075258506 @default.
- W3037924247 cites W2079830319 @default.
- W3037924247 cites W2089782191 @default.
- W3037924247 cites W2100495367 @default.
- W3037924247 cites W2114606812 @default.
- W3037924247 cites W2128350223 @default.
- W3037924247 cites W2132661331 @default.
- W3037924247 cites W2142254043 @default.
- W3037924247 cites W2145123926 @default.
- W3037924247 cites W2154723612 @default.
- W3037924247 cites W2362300820 @default.
- W3037924247 cites W2432567885 @default.
- W3037924247 cites W2548342201 @default.
- W3037924247 cites W2621150162 @default.
- W3037924247 cites W2750796620 @default.
- W3037924247 cites W2751625431 @default.
- W3037924247 cites W2797749376 @default.
- W3037924247 cites W2801396275 @default.
- W3037924247 cites W2901164501 @default.
- W3037924247 cites W2903685163 @default.
- W3037924247 cites W2921173483 @default.
- W3037924247 cites W2942285358 @default.
- W3037924247 cites W2946901414 @default.
- W3037924247 cites W2949493305 @default.
- W3037924247 cites W2950501364 @default.
- W3037924247 cites W2955305484 @default.
- W3037924247 cites W2978725006 @default.
- W3037924247 cites W3008914434 @default.
- W3037924247 cites W4206339135 @default.
- W3037924247 cites W3149760511 @default.
- W3037924247 doi "https://doi.org/10.1101/2020.06.26.174474" @default.
- W3037924247 hasPublicationYear "2020" @default.
- W3037924247 type Work @default.
- W3037924247 sameAs 3037924247 @default.
- W3037924247 citedByCount "11" @default.
- W3037924247 countsByYear W30379242472020 @default.
- W3037924247 countsByYear W30379242472021 @default.
- W3037924247 countsByYear W30379242472022 @default.
- W3037924247 countsByYear W30379242472023 @default.
- W3037924247 crossrefType "posted-content" @default.
- W3037924247 hasAuthorship W3037924247A5025484390 @default.
- W3037924247 hasAuthorship W3037924247A5053975684 @default.
- W3037924247 hasAuthorship W3037924247A5063689593 @default.
- W3037924247 hasAuthorship W3037924247A5078436996 @default.
- W3037924247 hasBestOaLocation W30379242471 @default.
- W3037924247 hasConcept C108583219 @default.
- W3037924247 hasConcept C14036430 @default.
- W3037924247 hasConcept C153180895 @default.
- W3037924247 hasConcept C154945302 @default.
- W3037924247 hasConcept C17744445 @default.
- W3037924247 hasConcept C186060115 @default.
- W3037924247 hasConcept C199539241 @default.
- W3037924247 hasConcept C27438332 @default.
- W3037924247 hasConcept C2776359362 @default.
- W3037924247 hasConcept C33923547 @default.
- W3037924247 hasConcept C41008148 @default.
- W3037924247 hasConcept C48372109 @default.
- W3037924247 hasConcept C78458016 @default.
- W3037924247 hasConcept C8038995 @default.
- W3037924247 hasConcept C81363708 @default.
- W3037924247 hasConcept C86803240 @default.
- W3037924247 hasConcept C94375191 @default.
- W3037924247 hasConcept C94625758 @default.
- W3037924247 hasConceptScore W3037924247C108583219 @default.
- W3037924247 hasConceptScore W3037924247C14036430 @default.
- W3037924247 hasConceptScore W3037924247C153180895 @default.
- W3037924247 hasConceptScore W3037924247C154945302 @default.
- W3037924247 hasConceptScore W3037924247C17744445 @default.
- W3037924247 hasConceptScore W3037924247C186060115 @default.
- W3037924247 hasConceptScore W3037924247C199539241 @default.
- W3037924247 hasConceptScore W3037924247C27438332 @default.
- W3037924247 hasConceptScore W3037924247C2776359362 @default.
- W3037924247 hasConceptScore W3037924247C33923547 @default.
- W3037924247 hasConceptScore W3037924247C41008148 @default.
- W3037924247 hasConceptScore W3037924247C48372109 @default.
- W3037924247 hasConceptScore W3037924247C78458016 @default.
- W3037924247 hasConceptScore W3037924247C8038995 @default.
- W3037924247 hasConceptScore W3037924247C81363708 @default.
- W3037924247 hasConceptScore W3037924247C86803240 @default.
- W3037924247 hasConceptScore W3037924247C94375191 @default.
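The quad pattern in the header can be issued as a full SPARQL query to retrieve these matches. A minimal sketch, assuming SemOpenAlex exposes a public SPARQL endpoint (commonly reported at https://semopenalex.org/sparql; treat that URL as an assumption rather than a confirmed detail):

```sparql
# Retrieve every predicate/object pair for this work, together with
# the named graph each quad belongs to (shown here as @default).
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W3037924247> ?p ?o .
  }
}
```

Each result row corresponds to one line of the listing above: `?p` is the predicate (e.g. `title`, `cites`, `hasConcept`), `?o` the object value, and `?g` the graph label.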