Matches in SemOpenAlex for { <https://semopenalex.org/work/W3207640269> ?p ?o ?g. }
- W3207640269 abstract "Abstract Background Changes in speech, language, and episodic and semantic memory are documented in Alzheimer’s disease (AD) years before routine diagnosis. Aims Develop an Artificial Intelligence (AI) system detecting amyloid-confirmed prodromal and preclinical AD from speech collected remotely via participants’ smartphones. Method A convenience sample of 133 participants with established amyloid beta and clinical diagnostic status (66 Aβ+, 67 Aβ−; 71 cognitively unimpaired (CU), 62 with mild cognitive impairment (MCI) or mild AD) completed clinical assessments for the AMYPRED study (NCT04828122). Participants completed optional remote assessments daily for 7-8 days, including the Automatic Story Recall Task (ASRT), a story recall paradigm with short and long variants, and immediate and delayed recall phases. Vector-based representations from each story source and transcribed retelling were produced using ParaBLEU, a paraphrase evaluation model. Representations were fed into logistic regression models trained with tournament leave-pair-out cross-validation analysis, predicting Aβ status and MCI/mild AD within the full sample and Aβ status in clinical diagnostic subsamples. Findings At least one full remote ASRT assessment was completed by 115 participants (mean age=69.6 (range 54-80); 63 female/52 male; 66 CU and 49 MCI/mild AD, 56 Aβ+ and 59 Aβ−). Using an average of 2.7 minutes of automatically transcribed speech from immediate recall of short stories, the AI system predicted MCI/mild AD in the full sample (AUC=0.85 ± 0.08), and amyloid in MCI/mild AD (AUC=0.73 ± 0.14) and CU subsamples (AUC=0.71 ± 0.13). Amyloid classification within the full sample was no better than chance (AUC=0.57 ± 0.11). Broadly similar results were reported for manually transcribed data, long ASRTs and delayed recall.
Interpretation Combined with advanced AI language models, brief, remote speech-based testing offers simple, accessible and cost-effective screening for early stage AD. Funding Novoic. Research in context Evidence before this study Recent systematic reviews have examined the use of speech data to detect vocal and linguistic changes taking place in Alzheimer’s dementia. Most of this research has been completed in the DementiaBank cohort, where subjects are usually in the (more progressed) dementia stages and without biomarker confirmation of Alzheimer’s disease (AD). Whether speech assessment can be used in a biomarker-confirmed, early stage (preclinical and prodromal) AD population has not yet been tested. Most prior work has relied on extracting manually defined “features”, e.g. the noun rate, which has too low a predictive value to offer clinical utility in an early stage AD population. In recent years, audio- and text-based machine learning models have improved significantly and a few studies have used such models in the context of classifying AD dementia. These approaches could offer greater sensitivity, but it remains to be seen how well they work in a biomarker-confirmed, early stage AD population. Most studies have relied on controlled research settings and on manually transcribing speech before analysis, both of which limit broader applicability and use in clinical practice. Added value of this study This study tests the feasibility of advanced speech analysis for clinical testing of early stage AD. We present the results from a cross-sectional sample in the UK examining the predictive ability of fully automated speech-based testing in biomarker-confirmed early stage Alzheimer’s disease. We use a novel artificial intelligence (AI) system, which delivers sensitive indicators of AD risk and subtle cognitive impairment.
The AI system differentiates amyloid beta positive and amyloid beta negative subjects, and subjects with mild cognitive impairment (MCI) or mild AD from cognitively healthy subjects. Importantly, the system is fully remote and self-contained: participants’ own devices are used for test administration and speech capture. Transcription and analyses are automated, with limited signal loss. Overall, the results support the real-world applicability of speech-based assessment to detect early stage Alzheimer’s disease. While a number of medical devices have recently been approved using image-based AI algorithms, the present research is the first to demonstrate the use case and promise of speech-based AI systems for clinical practice. Implications of all the available evidence Prior research has shown compelling evidence of speech- and language-based changes occurring in more progressed stages of Alzheimer’s disease. Our study builds on this early work to show the clinical utility and feasibility of speech-based AI systems for the detection of Alzheimer’s disease in its earliest stages. Our work, using advanced AI systems, shows sensitivity to a biomarker-confirmed early stage AD population. Speech data can be collected with self-administered assessments completed in a real-world setting, and analysed automatically. With the first treatment for AD entering the market, there is an urgent need for scalable, affordable, convenient and accessible testing to screen at-risk subject candidates for biomarker assessment and early cognitive impairment. Sensitive speech-based biomarkers may help to fulfil this unmet need." @default.
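The abstract describes feeding fixed-length text representations into logistic regression models evaluated with leave-pair-out cross-validation, where AUC is estimated as the fraction of held-out (positive, negative) pairs ranked correctly. A minimal sketch of that evaluation scheme, using synthetic stand-in embeddings rather than the study's actual ParaBLEU representations or data:

```python
# Hedged sketch of leave-pair-out cross-validated AUC estimation for a
# logistic regression classifier. The data below are synthetic
# illustrations, not the study's pipeline or results.
from itertools import product

import numpy as np
from sklearn.linear_model import LogisticRegression


def leave_pair_out_auc(X, y):
    """Hold out every (positive, negative) pair, train on the rest,
    and count the fraction of pairs where the positive case scores
    higher -- an unbiased AUC estimator for small samples."""
    pos = np.where(y == 1)[0]
    neg = np.where(y == 0)[0]
    wins = 0.0
    for i, j in product(pos, neg):
        train = np.setdiff1d(np.arange(len(y)), [i, j])
        clf = LogisticRegression(max_iter=1000).fit(X[train], y[train])
        s_pos, s_neg = clf.decision_function(X[[i, j]])
        wins += 1.0 if s_pos > s_neg else (0.5 if s_pos == s_neg else 0.0)
    return wins / (len(pos) * len(neg))


rng = np.random.default_rng(0)
# Toy stand-in for embedding vectors: two mildly separated Gaussians.
X = np.vstack([rng.normal(0.0, 1.0, (15, 8)), rng.normal(1.0, 1.0, (15, 8))])
y = np.array([0] * 15 + [1] * 15)
auc = leave_pair_out_auc(X, y)
```

The pairwise hold-out keeps the test pair entirely out of training, which matters at sample sizes like the study's n=115, where conventional k-fold AUC estimates can be noisy or biased.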
- W3207640269 created "2021-10-25" @default.
- W3207640269 creator A5003046428 @default.
- W3207640269 creator A5003904755 @default.
- W3207640269 creator A5020179909 @default.
- W3207640269 creator A5038524461 @default.
- W3207640269 creator A5051934606 @default.
- W3207640269 creator A5062112119 @default.
- W3207640269 creator A5081571633 @default.
- W3207640269 creator A5082228880 @default.
- W3207640269 date "2021-10-20" @default.
- W3207640269 modified "2023-10-17" @default.
- W3207640269 title "Evaluation of a speech-based AI system for early detection of Alzheimer’s disease remotely via smartphones" @default.
- W3207640269 cites W1847168837 @default.
- W3207640269 cites W1991952617 @default.
- W3207640269 cites W2005918389 @default.
- W3207640269 cites W2042571564 @default.
- W3207640269 cites W2058269934 @default.
- W3207640269 cites W2134711807 @default.
- W3207640269 cites W2136052751 @default.
- W3207640269 cites W2136914353 @default.
- W3207640269 cites W2168357345 @default.
- W3207640269 cites W2317946538 @default.
- W3207640269 cites W2408352576 @default.
- W3207640269 cites W2533173333 @default.
- W3207640269 cites W2551096483 @default.
- W3207640269 cites W2582524520 @default.
- W3207640269 cites W2767453481 @default.
- W3207640269 cites W2780732598 @default.
- W3207640269 cites W3046227178 @default.
- W3207640269 cites W3046275966 @default.
- W3207640269 cites W3047566951 @default.
- W3207640269 cites W3101650868 @default.
- W3207640269 cites W3106991714 @default.
- W3207640269 cites W3127106548 @default.
- W3207640269 cites W3137358529 @default.
- W3207640269 cites W3159530258 @default.
- W3207640269 cites W3185554154 @default.
- W3207640269 cites W3206396985 @default.
- W3207640269 doi "https://doi.org/10.1101/2021.10.19.21264878" @default.
- W3207640269 hasPublicationYear "2021" @default.
- W3207640269 type Work @default.
- W3207640269 sameAs 3207640269 @default.
- W3207640269 citedByCount "2" @default.
- W3207640269 countsByYear W32076402692022 @default.
- W3207640269 countsByYear W32076402692023 @default.
- W3207640269 crossrefType "posted-content" @default.
- W3207640269 hasAuthorship W3207640269A5003046428 @default.
- W3207640269 hasAuthorship W3207640269A5003904755 @default.
- W3207640269 hasAuthorship W3207640269A5020179909 @default.
- W3207640269 hasAuthorship W3207640269A5038524461 @default.
- W3207640269 hasAuthorship W3207640269A5051934606 @default.
- W3207640269 hasAuthorship W3207640269A5062112119 @default.
- W3207640269 hasAuthorship W3207640269A5081571633 @default.
- W3207640269 hasAuthorship W3207640269A5082228880 @default.
- W3207640269 hasBestOaLocation W32076402691 @default.
- W3207640269 hasConcept C100660578 @default.
- W3207640269 hasConcept C118552586 @default.
- W3207640269 hasConcept C126322002 @default.
- W3207640269 hasConcept C151956035 @default.
- W3207640269 hasConcept C15744967 @default.
- W3207640269 hasConcept C162324750 @default.
- W3207640269 hasConcept C169900460 @default.
- W3207640269 hasConcept C180747234 @default.
- W3207640269 hasConcept C187736073 @default.
- W3207640269 hasConcept C2779134260 @default.
- W3207640269 hasConcept C2780451532 @default.
- W3207640269 hasConcept C2984915365 @default.
- W3207640269 hasConcept C548259974 @default.
- W3207640269 hasConcept C71924100 @default.
- W3207640269 hasConcept C88576662 @default.
- W3207640269 hasConceptScore W3207640269C100660578 @default.
- W3207640269 hasConceptScore W3207640269C118552586 @default.
- W3207640269 hasConceptScore W3207640269C126322002 @default.
- W3207640269 hasConceptScore W3207640269C151956035 @default.
- W3207640269 hasConceptScore W3207640269C15744967 @default.
- W3207640269 hasConceptScore W3207640269C162324750 @default.
- W3207640269 hasConceptScore W3207640269C169900460 @default.
- W3207640269 hasConceptScore W3207640269C180747234 @default.
- W3207640269 hasConceptScore W3207640269C187736073 @default.
- W3207640269 hasConceptScore W3207640269C2779134260 @default.
- W3207640269 hasConceptScore W3207640269C2780451532 @default.
- W3207640269 hasConceptScore W3207640269C2984915365 @default.
- W3207640269 hasConceptScore W3207640269C548259974 @default.
- W3207640269 hasConceptScore W3207640269C71924100 @default.
- W3207640269 hasConceptScore W3207640269C88576662 @default.
- W3207640269 hasLocation W32076402691 @default.
- W3207640269 hasOpenAccess W3207640269 @default.
- W3207640269 hasPrimaryLocation W32076402691 @default.
- W3207640269 hasRelatedWork W1972619773 @default.
- W3207640269 hasRelatedWork W1989444360 @default.
- W3207640269 hasRelatedWork W2040490650 @default.
- W3207640269 hasRelatedWork W2100720428 @default.
- W3207640269 hasRelatedWork W2139179616 @default.
- W3207640269 hasRelatedWork W2996163655 @default.
- W3207640269 hasRelatedWork W2998807021 @default.
- W3207640269 hasRelatedWork W3132635712 @default.
- W3207640269 hasRelatedWork W4307992689 @default.
- W3207640269 hasRelatedWork W2291579000 @default.
- W3207640269 isParatext "false" @default.
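The triple pattern at the top of this listing can be issued as a SPARQL query against the SemOpenAlex service. A minimal sketch using only the standard library to build the request URL; the endpoint address and `format` parameter are assumptions based on the SemOpenAlex project and should be verified before use:

```python
# Build a SPARQL GET request URL for all triples of this work.
# The endpoint URL is an assumption; check the SemOpenAlex docs.
import urllib.parse

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint
WORK = "https://semopenalex.org/work/W3207640269"

query = f"""
SELECT ?p ?o WHERE {{
  <{WORK}> ?p ?o .
}}
"""

# urlencode percent-escapes the query text for use as a GET parameter;
# the same query can also be POSTed as application/sparql-query.
url = ENDPOINT + "?" + urllib.parse.urlencode(
    {"query": query, "format": "json"}
)
```

Fetching `url` (e.g. with `urllib.request.urlopen`) would return the predicate/object pairs shown above in SPARQL JSON results form.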