Matches in SemOpenAlex for { <https://semopenalex.org/work/W3157651102> ?p ?o ?g. }
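The header above uses a quad-style graph pattern (subject, predicate, object, graph). A minimal sketch of how the bindings listed below could be retrieved programmatically follows; the public endpoint URL (https://semopenalex.org/sparql), the use of GRAPH to bind ?g, and the result handling are illustrative assumptions, not part of this listing.

```python
# Minimal sketch: fetch ?p ?o ?g for the work from the (assumed) SemOpenAlex SPARQL endpoint.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint
WORK = "https://semopenalex.org/work/W3157651102"

sparql = SPARQLWrapper(ENDPOINT)
# The quad pattern "<work> ?p ?o ?g" is expressed here as a standard SPARQL
# GRAPH clause; how the store exposes its default graph may differ.
sparql.setQuery(f"""
SELECT ?p ?o ?g WHERE {{
  GRAPH ?g {{ <{WORK}> ?p ?o . }}
}}
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for b in results["results"]["bindings"]:
    print(b["p"]["value"], b["o"]["value"], b["g"]["value"])
```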
- W3157651102 abstract "Introduction: Research related to the automatic detection of Alzheimer's disease (AD) is important, given the high prevalence of AD and the high cost of traditional diagnostic methods. Since AD significantly affects the content and acoustics of spontaneous speech, natural language processing, and machine learning provide promising techniques for reliably detecting AD. There has been a recent proliferation of classification models for AD, but these vary in the datasets used, model types and training and testing paradigms. In this study, we compare and contrast the performance of two common approaches for automatic AD detection from speech on the same, well-matched dataset, to determine the advantages of using domain knowledge vs. pre-trained transfer models. Methods: Audio recordings and corresponding manually-transcribed speech transcripts of a picture description task administered to 156 demographically matched older adults, 78 with Alzheimer's Disease (AD) and 78 cognitively intact (healthy) were classified using machine learning and natural language processing as AD or non-AD. The audio was acoustically-enhanced, and post-processed to improve quality of the speech recording as well control for variation caused by recording conditions. Two approaches were used for classification of these speech samples: (1) using domain knowledge: extracting an extensive set of clinically relevant linguistic and acoustic features derived from speech and transcripts based on prior literature, and (2) using transfer-learning and leveraging large pre-trained machine learning models: using transcript-representations that are automatically derived from state-of-the-art pre-trained language models, by fine-tuning Bidirectional Encoder Representations from Transformer (BERT)-based sequence classification models. Results: We compared the utility of speech transcript representations obtained from recent natural language processing models (i.e., BERT) to more clinically-interpretable language feature-based methods. Both the feature-based approaches and fine-tuned BERT models significantly outperformed the baseline linguistic model using a small set of linguistic features, demonstrating the importance of extensive linguistic information for detecting cognitive impairments relating to AD. We observed that fine-tuned BERT models numerically outperformed feature-based approaches on the AD detection task, but the difference was not statistically significant. Our main contribution is the observation that when tested on the same, demographically balanced dataset and tested on independent, unseen data, both domain knowledge and pretrained linguistic models have good predictive performance for detecting AD based on speech. It is notable that linguistic information alone is capable of achieving comparable, and even numerically better, performance than models including both acoustic and linguistic features here. We also try to shed light on the inner workings of the more black-box natural language processing model by performing an interpretability analysis, and find that attention weights reveal interesting patterns such as higher attribution to more important information content units in the picture description task, as well as pauses and filler words. 
Conclusion: This approach supports the value of well-performing machine learning and linguistically-focussed processing techniques to detect AD from speech and highlights the need to compare model performance on carefully balanced datasets, using consistent same training parameters and independent test datasets in order to determine the best performing predictive model." @default.
- W3157651102 created "2021-05-10" @default.
- W3157651102 creator A5010008842 @default.
- W3157651102 creator A5050825493 @default.
- W3157651102 creator A5055053084 @default.
- W3157651102 creator A5056256317 @default.
- W3157651102 creator A5079735833 @default.
- W3157651102 date "2021-04-27" @default.
- W3157651102 modified "2023-10-06" @default.
- W3157651102 title "Comparing Pre-trained and Feature-Based Models for Prediction of Alzheimer's Disease Based on Speech" @default.
- W3157651102 cites W1853705225 @default.
- W3157651102 cites W1974288165 @default.
- W3157651102 cites W1980794267 @default.
- W3157651102 cites W2016690753 @default.
- W3157651102 cites W2023736093 @default.
- W3157651102 cites W2053069220 @default.
- W3157651102 cites W2056323522 @default.
- W3157651102 cites W2073542686 @default.
- W3157651102 cites W2074037951 @default.
- W3157651102 cites W2089109585 @default.
- W3157651102 cites W2091298722 @default.
- W3157651102 cites W2112224561 @default.
- W3157651102 cites W2132559335 @default.
- W3157651102 cites W2137531761 @default.
- W3157651102 cites W2158516192 @default.
- W3157651102 cites W2251818548 @default.
- W3157651102 cites W2594894191 @default.
- W3157651102 cites W2792449839 @default.
- W3157651102 cites W2808361131 @default.
- W3157651102 cites W2884001105 @default.
- W3157651102 cites W2886871786 @default.
- W3157651102 cites W2887590219 @default.
- W3157651102 cites W2890815404 @default.
- W3157651102 cites W2948947170 @default.
- W3157651102 cites W2950784811 @default.
- W3157651102 cites W2999941287 @default.
- W3157651102 cites W3046205455 @default.
- W3157651102 cites W3087538850 @default.
- W3157651102 cites W3096912371 @default.
- W3157651102 cites W3097109903 @default.
- W3157651102 cites W3097533615 @default.
- W3157651102 cites W3098960064 @default.
- W3157651102 cites W3101650868 @default.
- W3157651102 cites W3118485687 @default.
- W3157651102 cites W3211132331 @default.
- W3157651102 cites W4240149198 @default.
- W3157651102 cites W4285719527 @default.
- W3157651102 cites W4293569649 @default.
- W3157651102 doi "https://doi.org/10.3389/fnagi.2021.635945" @default.
- W3157651102 hasPubMedCentralId "https://www.ncbi.nlm.nih.gov/pmc/articles/8110916" @default.
- W3157651102 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/33986655" @default.
- W3157651102 hasPublicationYear "2021" @default.
- W3157651102 type Work @default.
- W3157651102 sameAs 3157651102 @default.
- W3157651102 citedByCount "30" @default.
- W3157651102 countsByYear W31576511022021 @default.
- W3157651102 countsByYear W31576511022022 @default.
- W3157651102 countsByYear W31576511022023 @default.
- W3157651102 crossrefType "journal-article" @default.
- W3157651102 hasAuthorship W3157651102A5010008842 @default.
- W3157651102 hasAuthorship W3157651102A5050825493 @default.
- W3157651102 hasAuthorship W3157651102A5055053084 @default.
- W3157651102 hasAuthorship W3157651102A5056256317 @default.
- W3157651102 hasAuthorship W3157651102A5079735833 @default.
- W3157651102 hasBestOaLocation W31576511021 @default.
- W3157651102 hasConcept C111919701 @default.
- W3157651102 hasConcept C118505674 @default.
- W3157651102 hasConcept C119857082 @default.
- W3157651102 hasConcept C137293760 @default.
- W3157651102 hasConcept C150899416 @default.
- W3157651102 hasConcept C154945302 @default.
- W3157651102 hasConcept C204321447 @default.
- W3157651102 hasConcept C28490314 @default.
- W3157651102 hasConcept C41008148 @default.
- W3157651102 hasConcept C61328038 @default.
- W3157651102 hasConceptScore W3157651102C111919701 @default.
- W3157651102 hasConceptScore W3157651102C118505674 @default.
- W3157651102 hasConceptScore W3157651102C119857082 @default.
- W3157651102 hasConceptScore W3157651102C137293760 @default.
- W3157651102 hasConceptScore W3157651102C150899416 @default.
- W3157651102 hasConceptScore W3157651102C154945302 @default.
- W3157651102 hasConceptScore W3157651102C204321447 @default.
- W3157651102 hasConceptScore W3157651102C28490314 @default.
- W3157651102 hasConceptScore W3157651102C41008148 @default.
- W3157651102 hasConceptScore W3157651102C61328038 @default.
- W3157651102 hasLocation W31576511021 @default.
- W3157651102 hasLocation W31576511022 @default.
- W3157651102 hasLocation W31576511023 @default.
- W3157651102 hasLocation W31576511024 @default.
- W3157651102 hasOpenAccess W3157651102 @default.
- W3157651102 hasPrimaryLocation W31576511021 @default.
- W3157651102 hasRelatedWork W2359001871 @default.
- W3157651102 hasRelatedWork W2547835662 @default.
- W3157651102 hasRelatedWork W2960456850 @default.
- W3157651102 hasRelatedWork W3021430260 @default.
- W3157651102 hasRelatedWork W4281645081 @default.
- W3157651102 hasRelatedWork W4308262314 @default.
- W3157651102 hasRelatedWork W4312200629 @default.
- W3157651102 hasRelatedWork W4317565044 @default.
- W3157651102 hasRelatedWork W4382286161 @default.
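The abstract listed above describes two classification approaches, one of which fine-tunes BERT-based sequence classification models on picture-description transcripts. The sketch below is a minimal, hypothetical illustration of that kind of fine-tuning using the Hugging Face transformers library; the model checkpoint, hyperparameters, and placeholder transcripts are assumptions, not details reported in the paper.

```python
# Hypothetical sketch: fine-tune a BERT sequence classifier to label
# picture-description transcripts as AD (1) vs. healthy (0).
import torch
from torch.utils.data import Dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          TrainingArguments, Trainer)

class TranscriptDataset(Dataset):
    """Wraps raw transcripts and binary labels for the Trainer."""
    def __init__(self, texts, labels, tokenizer, max_len=256):
        self.enc = tokenizer(texts, truncation=True, padding="max_length",
                             max_length=max_len, return_tensors="pt")
        self.labels = torch.tensor(labels)

    def __len__(self):
        return len(self.labels)

    def __getitem__(self, i):
        item = {k: v[i] for k, v in self.enc.items()}
        item["labels"] = self.labels[i]
        return item

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # assumed checkpoint
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased",
                                                           num_labels=2)

# Placeholder transcripts; in practice these would be the manually transcribed
# picture-description samples from the matched AD / healthy groups.
train_ds = TranscriptDataset(
    ["the boy is reaching for the cookie jar ...",
     "there is a woman drying dishes at the sink ..."],
    [1, 0], tokenizer)

args = TrainingArguments(output_dir="bert_ad_sketch", num_train_epochs=3,
                         per_device_train_batch_size=8, learning_rate=2e-5)
Trainer(model=model, args=args, train_dataset=train_ds).train()
```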