Matches in SemOpenAlex for { <https://semopenalex.org/work/W4285035180> ?p ?o ?g. }
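For context, the triple pattern above can be re-run against the public SemOpenAlex SPARQL endpoint. The sketch below is a minimal, assumed reproduction: the endpoint URL, the use of SPARQLWrapper, and dropping the graph term (?g) to a plain ?p ?o pattern are my assumptions, not part of this listing.

```python
# Minimal sketch: fetch all predicate/object pairs for the work below
# from the (assumed) SemOpenAlex SPARQL endpoint.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint

QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W4285035180> ?p ?o .
}
LIMIT 100
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(QUERY)
sparql.setReturnFormat(JSON)
results = sparql.query().convert()

# Print each predicate/object pair, mirroring the listing below
for binding in results["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```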
Showing items 1 to 89 of 89, with 100 items per page.
- W4285035180 endingPage "215" @default.
- W4285035180 startingPage "209" @default.
- W4285035180 abstract "Uroflowmetry remains an important tool for the assessment of patients with lower urinary tract symptoms (LUTS), but accuracy can be limited by within-subject variation of urinary flow rates. Voiding acoustics appear to correlate well with conventional uroflowmetry and show promise as a convenient home-based alternative for the monitoring of urinary flows. To evaluate the ability of a sound-based deep learning algorithm (Audioflow) to predict uroflowmetry parameters and identify abnormal urinary flow patterns. In this prospective open-label study, 534 male participants recruited at Singapore General Hospital between December 1, 2017 and July 1, 2019 voided into a uroflowmetry machine, and voiding acoustics were recorded using a smartphone in close proximity. The Audioflow algorithm consisted of two models: the first model, for the prediction of flow parameters including maximum flow rate (Qmax), average flow rate (Qave), and voided volume (VV), was trained and validated using leave-one-out cross-validation procedures; the second model, for discrimination of normal and abnormal urinary flows, was trained on a reference standard created by three senior urologists. Lin's correlation coefficient was used to evaluate the agreement between Audioflow predictions and conventional uroflowmetry for Qmax, Qave, and VV. Accuracy of the Audioflow algorithm in the identification of abnormal urinary flows was assessed with sensitivity analyses and the area under the receiver operating characteristic curve (AUC); the algorithm was compared with an external panel of graders comprising six urology residents/general practitioners who separately graded flow patterns in the validation dataset. A total of 331 patients were included for analysis. Agreement between Audioflow and conventional uroflowmetry for Qmax, Qave, and VV was 0.77 (95% confidence interval [CI], 0.72-0.80), 0.85 (95% CI, 0.82-0.88), and 0.84 (95% CI, 0.80-0.87), respectively. For the identification of abnormal flows, Audioflow achieved a high rate of agreement of 83.8% (95% CI, 77.5-90.1%) with the reference standard, and was comparable with the external panel of six residents/general practitioners. AUC was 0.892 (95% CI, 0.834-0.951), with high sensitivity of 87.3% (95% CI, 76.8-93.7%) and specificity of 77.5% (95% CI, 61.1-88.6%). The results of this study suggest that a deep learning algorithm can predict uroflowmetry parameters and identify abnormal urinary voids based on voiding sounds, and shows promise as a simple home-based alternative to uroflowmetry in the management of patients with LUTS. In this study, we trained a deep learning-based algorithm to measure urinary flow rates and identify abnormal flow patterns based on voiding sounds. This may provide a convenient, home-based alternative to conventional uroflowmetry for the assessment and monitoring of patients with lower urinary tract symptoms." @default.
- W4285035180 created "2022-07-12" @default.
- W4285035180 creator A5000421149 @default.
- W4285035180 creator A5006273825 @default.
- W4285035180 creator A5008871181 @default.
- W4285035180 creator A5043042759 @default.
- W4285035180 creator A5047780053 @default.
- W4285035180 creator A5056685209 @default.
- W4285035180 creator A5061649813 @default.
- W4285035180 creator A5061973743 @default.
- W4285035180 creator A5074019528 @default.
- W4285035180 creator A5067117872 @default.
- W4285035180 date "2023-01-01" @default.
- W4285035180 modified "2023-10-09" @default.
- W4285035180 title "Development and Validation of a Deep Learning System for Sound-based Prediction of Urinary Flow" @default.
- W4285035180 cites W1497931572 @default.
- W4285035180 cites W1575778540 @default.
- W4285035180 cites W1737412359 @default.
- W4285035180 cites W1967312126 @default.
- W4285035180 cites W2014136566 @default.
- W4285035180 cites W2033211931 @default.
- W4285035180 cites W2051893541 @default.
- W4285035180 cites W2092025896 @default.
- W4285035180 cites W2313339984 @default.
- W4285035180 cites W2333565139 @default.
- W4285035180 cites W2411227207 @default.
- W4285035180 cites W2421274932 @default.
- W4285035180 cites W2559758118 @default.
- W4285035180 cites W3043453246 @default.
- W4285035180 cites W3106691654 @default.
- W4285035180 cites W3120013101 @default.
- W4285035180 cites W3193766449 @default.
- W4285035180 doi "https://doi.org/10.1016/j.euf.2022.06.011" @default.
- W4285035180 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/35835694" @default.
- W4285035180 hasPublicationYear "2023" @default.
- W4285035180 type Work @default.
- W4285035180 citedByCount "0" @default.
- W4285035180 crossrefType "journal-article" @default.
- W4285035180 hasAuthorship W4285035180A5000421149 @default.
- W4285035180 hasAuthorship W4285035180A5006273825 @default.
- W4285035180 hasAuthorship W4285035180A5008871181 @default.
- W4285035180 hasAuthorship W4285035180A5043042759 @default.
- W4285035180 hasAuthorship W4285035180A5047780053 @default.
- W4285035180 hasAuthorship W4285035180A5056685209 @default.
- W4285035180 hasAuthorship W4285035180A5061649813 @default.
- W4285035180 hasAuthorship W4285035180A5061973743 @default.
- W4285035180 hasAuthorship W4285035180A5067117872 @default.
- W4285035180 hasAuthorship W4285035180A5074019528 @default.
- W4285035180 hasConcept C121608353 @default.
- W4285035180 hasConcept C126322002 @default.
- W4285035180 hasConcept C141071460 @default.
- W4285035180 hasConcept C188816634 @default.
- W4285035180 hasConcept C2776235491 @default.
- W4285035180 hasConcept C2910170020 @default.
- W4285035180 hasConcept C58471807 @default.
- W4285035180 hasConcept C71924100 @default.
- W4285035180 hasConcept C76318530 @default.
- W4285035180 hasConcept C77411442 @default.
- W4285035180 hasConceptScore W4285035180C121608353 @default.
- W4285035180 hasConceptScore W4285035180C126322002 @default.
- W4285035180 hasConceptScore W4285035180C141071460 @default.
- W4285035180 hasConceptScore W4285035180C188816634 @default.
- W4285035180 hasConceptScore W4285035180C2776235491 @default.
- W4285035180 hasConceptScore W4285035180C2910170020 @default.
- W4285035180 hasConceptScore W4285035180C58471807 @default.
- W4285035180 hasConceptScore W4285035180C71924100 @default.
- W4285035180 hasConceptScore W4285035180C76318530 @default.
- W4285035180 hasConceptScore W4285035180C77411442 @default.
- W4285035180 hasIssue "1" @default.
- W4285035180 hasLocation W42850351801 @default.
- W4285035180 hasLocation W42850351802 @default.
- W4285035180 hasOpenAccess W4285035180 @default.
- W4285035180 hasPrimaryLocation W42850351801 @default.
- W4285035180 hasRelatedWork W2008911041 @default.
- W4285035180 hasRelatedWork W2051966709 @default.
- W4285035180 hasRelatedWork W2373132269 @default.
- W4285035180 hasRelatedWork W2390926424 @default.
- W4285035180 hasRelatedWork W2392162854 @default.
- W4285035180 hasRelatedWork W2565613466 @default.
- W4285035180 hasRelatedWork W2744023513 @default.
- W4285035180 hasRelatedWork W2949466702 @default.
- W4285035180 hasRelatedWork W4237394904 @default.
- W4285035180 hasRelatedWork W3030915957 @default.
- W4285035180 hasVolume "9" @default.
- W4285035180 isParatext "false" @default.
- W4285035180 isRetracted "false" @default.
- W4285035180 workType "article" @default.
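The abstract in the listing above reports agreement between Audioflow predictions and conventional uroflowmetry using Lin's correlation coefficient. As a minimal sketch of that concordance metric (not the study's actual analysis code), the following computes Lin's concordance correlation coefficient on hypothetical placeholder arrays; the function name and example values are illustrative assumptions.

```python
# Minimal sketch of Lin's concordance correlation coefficient, the
# agreement metric named in the abstract. Placeholder data only.
import numpy as np

def lins_ccc(x: np.ndarray, y: np.ndarray) -> float:
    """Concordance correlation coefficient (Lin, 1989)."""
    x_mean, y_mean = x.mean(), y.mean()
    covariance = np.mean((x - x_mean) * (y - y_mean))  # population covariance
    return 2 * covariance / (x.var() + y.var() + (x_mean - y_mean) ** 2)

# Hypothetical example: predicted vs. measured Qmax (mL/s)
predicted_qmax = np.array([12.1, 18.4, 9.7, 22.0, 15.3])
measured_qmax = np.array([11.5, 19.2, 10.4, 20.8, 14.9])
print(round(lins_ccc(predicted_qmax, measured_qmax), 3))
```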