Matches in SemOpenAlex for { <https://semopenalex.org/work/W4200183793> ?p ?o ?g. }
- W4200183793 endingPage "3559" @default.
- W4200183793 startingPage "3550" @default.
- W4200183793 abstract "Abstract Aims The purpose of this study was to construct a model for oral assessment using deep learning image recognition technology and to verify its accuracy. Background The effects of oral care on older people are significant, and the Oral Assessment Guide has been used internationally as an effective oral assessment tool in clinical practice. However, additional training, education, development of user manuals and continuous support from a dental hygienist are needed to improve the inter‐rater reliability of the Oral Assessment Guide. Design A retrospective observational study. Methods A total of 3,201 oral images of 114 older people aged >65 years were collected from five dental‐related facilities. These images were divided into six categories (lips, tongue, saliva, mucosa, gingiva, and teeth or dentures) that were evaluated by images, out of the total eight items that comprise components of the Oral Assessment Guide. Each item was classified into a rating of 1, 2 or 3. A convolutional neural network, which is a deep learning method used for image recognition, was used to construct the image recognition model. The study methods comply with the STROBE checklist. Results We constructed models with a classification accuracy of 98.8% for lips, 94.3% for tongue, 92.8% for saliva, 78.6% for mucous membranes, 93.0% for gingiva and 93.6% for teeth or dentures. Conclusions Highly accurate diagnostic imaging models using convolutional neural networks were constructed for six items of the Oral Assessment Guide and validated. In particular, for the five items of lips, tongue, saliva, gingiva, and teeth or dentures, models with a high accuracy of over 90% were obtained. Relevance to Clinical Practice The model built in this study has the potential to contribute to obtain reproducibility and reliability of the ratings, to shorten the time for assessment, to collaborate with dental professionals and to be used as an educational tool." @default.
- W4200183793 created "2021-12-31" @default.
- W4200183793 creator A5000283915 @default.
- W4200183793 creator A5003937972 @default.
- W4200183793 creator A5034710165 @default.
- W4200183793 creator A5037282127 @default.
- W4200183793 creator A5040880315 @default.
- W4200183793 creator A5045145715 @default.
- W4200183793 creator A5053778755 @default.
- W4200183793 creator A5055228579 @default.
- W4200183793 creator A5071257265 @default.
- W4200183793 date "2021-12-21" @default.
- W4200183793 modified "2023-09-26" @default.
- W4200183793 title "Image diagnosis models for the oral assessment of older people using convolutional neural networks: A retrospective observational study" @default.
- W4200183793 cites W1738401185 @default.
- W4200183793 cites W1912327761 @default.
- W4200183793 cites W1967921770 @default.
- W4200183793 cites W2047940724 @default.
- W4200183793 cites W2085115066 @default.
- W4200183793 cites W2101049134 @default.
- W4200183793 cites W2101294968 @default.
- W4200183793 cites W2145749429 @default.
- W4200183793 cites W2165225901 @default.
- W4200183793 cites W2581082771 @default.
- W4200183793 cites W2618530766 @default.
- W4200183793 cites W2806853752 @default.
- W4200183793 cites W2883741661 @default.
- W4200183793 cites W2887051827 @default.
- W4200183793 cites W2895755025 @default.
- W4200183793 cites W2901330522 @default.
- W4200183793 cites W2919115771 @default.
- W4200183793 cites W2934073222 @default.
- W4200183793 cites W2990243431 @default.
- W4200183793 cites W3011769292 @default.
- W4200183793 cites W4200183793 @default.
- W4200183793 doi "https://doi.org/10.1111/jocn.16182" @default.
- W4200183793 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/34935230" @default.
- W4200183793 hasPublicationYear "2021" @default.
- W4200183793 type Work @default.
- W4200183793 citedByCount "2" @default.
- W4200183793 countsByYear W42001837932021 @default.
- W4200183793 countsByYear W42001837932023 @default.
- W4200183793 crossrefType "journal-article" @default.
- W4200183793 hasAuthorship W4200183793A5000283915 @default.
- W4200183793 hasAuthorship W4200183793A5003937972 @default.
- W4200183793 hasAuthorship W4200183793A5034710165 @default.
- W4200183793 hasAuthorship W4200183793A5037282127 @default.
- W4200183793 hasAuthorship W4200183793A5040880315 @default.
- W4200183793 hasAuthorship W4200183793A5045145715 @default.
- W4200183793 hasAuthorship W4200183793A5053778755 @default.
- W4200183793 hasAuthorship W4200183793A5055228579 @default.
- W4200183793 hasAuthorship W4200183793A5071257265 @default.
- W4200183793 hasBestOaLocation W42001837931 @default.
- W4200183793 hasConcept C108583219 @default.
- W4200183793 hasConcept C121332964 @default.
- W4200183793 hasConcept C142724271 @default.
- W4200183793 hasConcept C154945302 @default.
- W4200183793 hasConcept C15744967 @default.
- W4200183793 hasConcept C163258240 @default.
- W4200183793 hasConcept C180747234 @default.
- W4200183793 hasConcept C199343813 @default.
- W4200183793 hasConcept C23131810 @default.
- W4200183793 hasConcept C2779356329 @default.
- W4200183793 hasConcept C2779744641 @default.
- W4200183793 hasConcept C2780062004 @default.
- W4200183793 hasConcept C29694066 @default.
- W4200183793 hasConcept C41008148 @default.
- W4200183793 hasConcept C43214815 @default.
- W4200183793 hasConcept C62520636 @default.
- W4200183793 hasConcept C71924100 @default.
- W4200183793 hasConcept C81363708 @default.
- W4200183793 hasConceptScore W4200183793C108583219 @default.
- W4200183793 hasConceptScore W4200183793C121332964 @default.
- W4200183793 hasConceptScore W4200183793C142724271 @default.
- W4200183793 hasConceptScore W4200183793C154945302 @default.
- W4200183793 hasConceptScore W4200183793C15744967 @default.
- W4200183793 hasConceptScore W4200183793C163258240 @default.
- W4200183793 hasConceptScore W4200183793C180747234 @default.
- W4200183793 hasConceptScore W4200183793C199343813 @default.
- W4200183793 hasConceptScore W4200183793C23131810 @default.
- W4200183793 hasConceptScore W4200183793C2779356329 @default.
- W4200183793 hasConceptScore W4200183793C2779744641 @default.
- W4200183793 hasConceptScore W4200183793C2780062004 @default.
- W4200183793 hasConceptScore W4200183793C29694066 @default.
- W4200183793 hasConceptScore W4200183793C41008148 @default.
- W4200183793 hasConceptScore W4200183793C43214815 @default.
- W4200183793 hasConceptScore W4200183793C62520636 @default.
- W4200183793 hasConceptScore W4200183793C71924100 @default.
- W4200183793 hasConceptScore W4200183793C81363708 @default.
- W4200183793 hasIssue "23-24" @default.
- W4200183793 hasLocation W42001837931 @default.
- W4200183793 hasLocation W42001837932 @default.
- W4200183793 hasLocation W42001837933 @default.
- W4200183793 hasOpenAccess W4200183793 @default.
- W4200183793 hasPrimaryLocation W42001837931 @default.
- W4200183793 hasRelatedWork W2731899572 @default.
- W4200183793 hasRelatedWork W2999805992 @default.
- W4200183793 hasRelatedWork W3011074480 @default.
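The lines above are the predicate/object matches for the work <https://semopenalex.org/work/W4200183793>; the trailing "@default." marks that each match sits in the default graph. A minimal sketch of retrieving the same matches programmatically is shown below. It is not part of the listing: the endpoint URL https://semopenalex.org/sparql and the use of the SPARQLWrapper package are assumptions, and the graph variable ?g from the pattern at the top is dropped because every match here is in the default graph.

```python
# Minimal sketch (assumptions: SemOpenAlex SPARQL endpoint at
# https://semopenalex.org/sparql; SPARQLWrapper installed via `pip install sparqlwrapper`).
from SPARQLWrapper import SPARQLWrapper, JSON

endpoint = SPARQLWrapper("https://semopenalex.org/sparql")  # assumed endpoint URL
endpoint.setQuery("""
    SELECT ?p ?o
    WHERE {
        <https://semopenalex.org/work/W4200183793> ?p ?o .
    }
""")
endpoint.setReturnFormat(JSON)

results = endpoint.query().convert()
for binding in results["results"]["bindings"]:
    # Each binding corresponds to one "- W4200183793 <predicate> <object>" line in the listing.
    print(binding["p"]["value"], binding["o"]["value"])
```

Printed rows should correspond one-to-one with the "- W4200183793 ..." lines above (e.g. the title, doi, cites and hasAuthorship entries), with predicates given as full IRIs rather than the abbreviated names used in the listing.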