Matches in SemOpenAlex for { <https://semopenalex.org/work/W2973140425> ?p ?o ?g. }
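The quad pattern above can be run against SemOpenAlex's public SPARQL service to retrieve these matches. The sketch below is a minimal, hedged example: the endpoint URL and JSON result shape are assumptions based on SemOpenAlex's documented SPARQL interface, not taken from this listing.

```python
# Sketch: fetch all quads { <work> ?p ?o ?g } for a SemOpenAlex work.
# The endpoint URL and JSON result layout are assumptions.
import json
import urllib.parse
import urllib.request

SPARQL_ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint


def build_query(work_uri: str) -> str:
    """Return a SPARQL SELECT matching the quad pattern { <work> ?p ?o ?g }."""
    return (
        "SELECT ?p ?o ?g WHERE { "
        f"GRAPH ?g {{ <{work_uri}> ?p ?o }} "
        "}"
    )


def fetch_matches(work_uri: str):
    """Send the query and return parsed JSON bindings (requires network)."""
    query = build_query(work_uri)
    url = SPARQL_ENDPOINT + "?" + urllib.parse.urlencode(
        {"query": query, "format": "json"}
    )
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["results"]["bindings"]


# Build (but do not send) the query for the work shown in this listing.
print(build_query("https://semopenalex.org/work/W2973140425"))
```

Each binding in the result would correspond to one `predicate object` line of the listing below.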
- W2973140425 endingPage "1881.e2" @default.
- W2973140425 startingPage "1874" @default.
- W2973140425 abstract "Background & Aims: Precise optical diagnosis of colorectal polyps could improve the cost-effectiveness of colonoscopy and reduce polypectomy-related complications. However, it is difficult for community-based non-experts to achieve sufficient diagnostic performance. Artificial intelligence-based systems have been developed to analyze endoscopic images; they identify neoplasms with high accuracy and low interobserver variation. We performed a multi-center study to determine the diagnostic accuracy of EndoBRAIN, an artificial intelligence-based system that analyzes cell nuclei, crypt structure, and microvessels in endoscopic images, in identification of colon neoplasms. Methods: The EndoBRAIN system was initially trained using 69,142 endocytoscopic images, taken at 520-fold magnification, from patients with colorectal polyps who underwent endoscopy at 5 academic centers in Japan from October 2017 through March 2018. We performed a retrospective comparative analysis of the diagnostic performance of EndoBRAIN vs that of 30 endoscopists (20 trainees and 10 experts); the endoscopists assessed images from 100 cases produced via white-light microscopy, endocytoscopy with methylene blue staining, and endocytoscopy with narrow-band imaging. EndoBRAIN was used to assess endocytoscopic, but not white-light, images. The primary outcome was the accuracy of EndoBRAIN in distinguishing neoplasms from non-neoplasms, compared with that of the endoscopists, using findings from pathology analysis as the reference standard. Results: In analysis of stained endocytoscopic images, EndoBRAIN identified colon lesions with 96.9% sensitivity (95% CI, 95.8%–97.8%), 100% specificity (95% CI, 99.6%–100%), 98% accuracy (95% CI, 97.3%–98.6%), a 100% positive-predictive value (95% CI, 99.8%–100%), and a 94.6% negative-predictive value (95% CI, 92.7%–96.1%); these values were all significantly greater than those of the endoscopy trainees and experts. In analysis of narrow-band images, EndoBRAIN distinguished neoplastic from non-neoplastic lesions with 96.9% sensitivity (95% CI, 95.8%–97.8%), 94.3% specificity (95% CI, 92.3%–95.9%), 96.0% accuracy (95% CI, 95.1%–96.8%), a 96.9% positive-predictive value (95% CI, 95.8%–97.8%), and a 94.3% negative-predictive value (95% CI, 92.3%–95.9%); these values were all significantly higher than those of the endoscopy trainees, and sensitivity and negative-predictive value were significantly higher than those of the experts, whereas the other values were comparable. Conclusions: EndoBRAIN accurately differentiated neoplastic from non-neoplastic lesions in stained endocytoscopic images and endocytoscopic narrow-band images, using pathology findings as the reference standard. This technology has been authorized for clinical use by the Japanese regulatory agency and should be used in endoscopic evaluation of small polyps in more widespread clinical settings. UMIN clinical trial no: UMIN000028843." @default.
- W2973140425 created "2019-09-19" @default.
- W2973140425 creator A5004080101 @default.
- W2973140425 creator A5006451525 @default.
- W2973140425 creator A5012400291 @default.
- W2973140425 creator A5014615513 @default.
- W2973140425 creator A5027193018 @default.
- W2973140425 creator A5032527419 @default.
- W2973140425 creator A5042118446 @default.
- W2973140425 creator A5043651016 @default.
- W2973140425 creator A5045380541 @default.
- W2973140425 creator A5050675673 @default.
- W2973140425 creator A5054095765 @default.
- W2973140425 creator A5063329475 @default.
- W2973140425 creator A5064388156 @default.
- W2973140425 creator A5067306432 @default.
- W2973140425 creator A5067713214 @default.
- W2973140425 creator A5072538244 @default.
- W2973140425 creator A5072568857 @default.
- W2973140425 creator A5074920808 @default.
- W2973140425 creator A5083917443 @default.
- W2973140425 creator A5085353218 @default.
- W2973140425 creator A5086595860 @default.
- W2973140425 date "2020-07-01" @default.
- W2973140425 modified "2023-10-16" @default.
- W2973140425 title "Artificial Intelligence-assisted System Improves Endoscopic Identification of Colorectal Neoplasms" @default.
- W2973140425 cites W1263185392 @default.
- W2973140425 cites W1548813211 @default.
- W2973140425 cites W1777999736 @default.
- W2973140425 cites W1966210936 @default.
- W2973140425 cites W1979185641 @default.
- W2973140425 cites W1995231043 @default.
- W2973140425 cites W2005464795 @default.
- W2973140425 cites W2017327817 @default.
- W2973140425 cites W2019397787 @default.
- W2973140425 cites W2044465660 @default.
- W2973140425 cites W2047145460 @default.
- W2973140425 cites W2075246611 @default.
- W2973140425 cites W2084846067 @default.
- W2973140425 cites W2100304141 @default.
- W2973140425 cites W2114282318 @default.
- W2973140425 cites W2148401964 @default.
- W2973140425 cites W2150763283 @default.
- W2973140425 cites W2154469952 @default.
- W2973140425 cites W2162106299 @default.
- W2973140425 cites W2323033112 @default.
- W2973140425 cites W2326063701 @default.
- W2973140425 cites W2338997131 @default.
- W2973140425 cites W2539426315 @default.
- W2973140425 cites W2593790801 @default.
- W2973140425 cites W2610398104 @default.
- W2973140425 cites W2724276442 @default.
- W2973140425 cites W2772246530 @default.
- W2973140425 cites W2804428553 @default.
- W2973140425 cites W2808622206 @default.
- W2973140425 cites W2887719255 @default.
- W2973140425 cites W4239510810 @default.
- W2973140425 cites W4248654028 @default.
- W2973140425 doi "https://doi.org/10.1016/j.cgh.2019.09.009" @default.
- W2973140425 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/31525512" @default.
- W2973140425 hasPublicationYear "2020" @default.
- W2973140425 type Work @default.
- W2973140425 sameAs 2973140425 @default.
- W2973140425 citedByCount "138" @default.
- W2973140425 countsByYear W29731404252020 @default.
- W2973140425 countsByYear W29731404252021 @default.
- W2973140425 countsByYear W29731404252022 @default.
- W2973140425 countsByYear W29731404252023 @default.
- W2973140425 crossrefType "journal-article" @default.
- W2973140425 hasAuthorship W2973140425A5004080101 @default.
- W2973140425 hasAuthorship W2973140425A5006451525 @default.
- W2973140425 hasAuthorship W2973140425A5012400291 @default.
- W2973140425 hasAuthorship W2973140425A5014615513 @default.
- W2973140425 hasAuthorship W2973140425A5027193018 @default.
- W2973140425 hasAuthorship W2973140425A5032527419 @default.
- W2973140425 hasAuthorship W2973140425A5042118446 @default.
- W2973140425 hasAuthorship W2973140425A5043651016 @default.
- W2973140425 hasAuthorship W2973140425A5045380541 @default.
- W2973140425 hasAuthorship W2973140425A5050675673 @default.
- W2973140425 hasAuthorship W2973140425A5054095765 @default.
- W2973140425 hasAuthorship W2973140425A5063329475 @default.
- W2973140425 hasAuthorship W2973140425A5064388156 @default.
- W2973140425 hasAuthorship W2973140425A5067306432 @default.
- W2973140425 hasAuthorship W2973140425A5067713214 @default.
- W2973140425 hasAuthorship W2973140425A5072538244 @default.
- W2973140425 hasAuthorship W2973140425A5072568857 @default.
- W2973140425 hasAuthorship W2973140425A5074920808 @default.
- W2973140425 hasAuthorship W2973140425A5083917443 @default.
- W2973140425 hasAuthorship W2973140425A5085353218 @default.
- W2973140425 hasAuthorship W2973140425A5086595860 @default.
- W2973140425 hasConcept C121608353 @default.
- W2973140425 hasConcept C126322002 @default.
- W2973140425 hasConcept C126838900 @default.
- W2973140425 hasConcept C142724271 @default.
- W2973140425 hasConcept C154945302 @default.
- W2973140425 hasConcept C198433322 @default.
- W2973140425 hasConcept C2778435480 @default.
- W2973140425 hasConcept C2778451229 @default.