Matches in SemOpenAlex for { <https://semopenalex.org/work/W4285390037> ?p ?o ?g. }
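For context, the quad pattern in the header can be issued against a SPARQL endpoint to reproduce this listing. A minimal sketch is shown below, assuming the public SemOpenAlex endpoint at https://semopenalex.org/sparql; the endpoint URL, the LIMIT, and the commented GRAPH variant are assumptions for illustration, not part of the listing itself (the `@default` marker on each row indicates results drawn from the store's default graph).

```sparql
# All predicate/object pairs for this work, i.e., the rows listed below.
SELECT ?p ?o
WHERE {
  <https://semopenalex.org/work/W4285390037> ?p ?o .
}
LIMIT 500

# Variant that also binds the graph, mirroring the ?g in the header pattern:
# SELECT ?p ?o ?g WHERE { GRAPH ?g { <https://semopenalex.org/work/W4285390037> ?p ?o . } }
```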
- W4285390037 endingPage "7149" @default.
- W4285390037 startingPage "7118" @default.
- W4285390037 abstract "Abstract Background Automatic segmentation of 3D objects in computed tomography (CT) is challenging. Current methods, based mainly on artificial intelligence (AI) and end‐to‐end deep learning (DL) networks, are weak in garnering high‐level anatomic information, which leads to compromised efficiency and robustness. This can be overcome by incorporating natural intelligence (NI) into AI methods via computational models of human anatomic knowledge. Purpose We formulate a hybrid intelligence (HI) approach that integrates the complementary strengths of NI and AI for organ segmentation in CT images and illustrate performance in the application of radiation therapy (RT) planning via multisite clinical evaluation. Methods The system employs five modules: (i) body region recognition, which automatically trims a given image to a precisely defined target body region; (ii) NI‐based automatic anatomy recognition object recognition (AAR‐R), which performs object recognition in the trimmed image without DL and outputs a localized fuzzy model for each object; (iii) DL‐based recognition (DL‐R), which refines the coarse recognition results of AAR‐R and outputs a stack of 2D bounding boxes (BBs) for each object; (iv) model morphing (MM), which deforms the AAR‐R fuzzy model of each object guided by the BBs output by DL‐R; and (v) DL‐based delineation (DL‐D), which employs the object containment information provided by MM to delineate each object. NI from (ii), AI from (i), (iii), and (v), and their combination from (iv) facilitate the HI system. Results The HI system was tested on 26 organs in neck and thorax body regions on CT images obtained prospectively from 464 patients in a study involving four RT centers. Data sets from one separate independent institution involving 125 patients were employed in training/model building for each of the two body regions, whereas 104 and 110 data sets from the 4 RT centers were utilized for testing on neck and thorax, respectively. In the testing data sets, 83% of the images had limitations such as streak artifacts, poor contrast, shape distortion, pathology, or implants. The contours output by the HI system were compared to contours drawn in clinical practice at the four RT centers by utilizing an independently established ground‐truth set of contours as reference. Three sets of measures were employed: accuracy via Dice coefficient (DC) and Hausdorff boundary distance (HD), subjective clinical acceptability via a blinded reader study, and efficiency by measuring human time saved in contouring by the HI system. Overall, the HI system achieved a mean DC of 0.78 and 0.87 and a mean HD of 2.22 and 4.53 mm for neck and thorax, respectively. It significantly outperformed clinical contouring in accuracy and saved overall 70% of human time over clinical contouring time, whereas acceptability scores varied significantly from site to site for both auto‐contours and clinically drawn contours. Conclusions The HI system is observed to behave like an expert human in robustness in the contouring task but vastly more efficiently. It seems to use NI help where image information alone will not suffice to decide, first for the correct localization of the object and then for the precise delineation of the boundary." @default.
- W4285390037 created "2022-07-14" @default.
- W4285390037 creator A5001336339 @default.
- W4285390037 creator A5009676183 @default.
- W4285390037 creator A5009745981 @default.
- W4285390037 creator A5009839688 @default.
- W4285390037 creator A5011837566 @default.
- W4285390037 creator A5015838477 @default.
- W4285390037 creator A5026804587 @default.
- W4285390037 creator A5032112612 @default.
- W4285390037 creator A5033308936 @default.
- W4285390037 creator A5035059310 @default.
- W4285390037 creator A5040756589 @default.
- W4285390037 creator A5050896415 @default.
- W4285390037 creator A5056584558 @default.
- W4285390037 creator A5057437696 @default.
- W4285390037 creator A5061994382 @default.
- W4285390037 creator A5063109122 @default.
- W4285390037 creator A5065713856 @default.
- W4285390037 creator A5067536094 @default.
- W4285390037 creator A5068836939 @default.
- W4285390037 creator A5069565044 @default.
- W4285390037 creator A5069851642 @default.
- W4285390037 creator A5072448404 @default.
- W4285390037 creator A5073343147 @default.
- W4285390037 creator A5074344378 @default.
- W4285390037 creator A5075757580 @default.
- W4285390037 creator A5078429603 @default.
- W4285390037 creator A5079431908 @default.
- W4285390037 creator A5080030813 @default.
- W4285390037 creator A5082385497 @default.
- W4285390037 creator A5083618993 @default.
- W4285390037 date "2022-07-27" @default.
- W4285390037 modified "2023-10-03" @default.
- W4285390037 title "Combining natural and artificial intelligence for robust automatic anatomy segmentation: Application in neck and thorax auto‐contouring" @default.
- W4285390037 cites W1563961677 @default.
- W4285390037 cites W1964162596 @default.
- W4285390037 cites W1975111110 @default.
- W4285390037 cites W1993850931 @default.
- W4285390037 cites W2013284520 @default.
- W4285390037 cites W2015513598 @default.
- W4285390037 cites W2025564919 @default.
- W4285390037 cites W2038952578 @default.
- W4285390037 cites W2044776246 @default.
- W4285390037 cites W2046622279 @default.
- W4285390037 cites W2059894202 @default.
- W4285390037 cites W2065698093 @default.
- W4285390037 cites W2069456254 @default.
- W4285390037 cites W2075358081 @default.
- W4285390037 cites W2083099567 @default.
- W4285390037 cites W2085091083 @default.
- W4285390037 cites W2097137218 @default.
- W4285390037 cites W2104095591 @default.
- W4285390037 cites W2107770500 @default.
- W4285390037 cites W2110161289 @default.
- W4285390037 cites W2112884386 @default.
- W4285390037 cites W2113622874 @default.
- W4285390037 cites W2114487471 @default.
- W4285390037 cites W2115235609 @default.
- W4285390037 cites W2118386984 @default.
- W4285390037 cites W2118924626 @default.
- W4285390037 cites W2132603077 @default.
- W4285390037 cites W2143516773 @default.
- W4285390037 cites W2145803225 @default.
- W4285390037 cites W2148107745 @default.
- W4285390037 cites W2149184914 @default.
- W4285390037 cites W2153798798 @default.
- W4285390037 cites W2194775991 @default.
- W4285390037 cites W2225435795 @default.
- W4285390037 cites W2282928421 @default.
- W4285390037 cites W2333768478 @default.
- W4285390037 cites W2344858100 @default.
- W4285390037 cites W2585890928 @default.
- W4285390037 cites W2588062088 @default.
- W4285390037 cites W2589644515 @default.
- W4285390037 cites W2590099728 @default.
- W4285390037 cites W2592605422 @default.
- W4285390037 cites W2593013519 @default.
- W4285390037 cites W2732931556 @default.
- W4285390037 cites W2755147576 @default.
- W4285390037 cites W2763160469 @default.
- W4285390037 cites W2770706469 @default.
- W4285390037 cites W2771252144 @default.
- W4285390037 cites W2773960327 @default.
- W4285390037 cites W2790270334 @default.
- W4285390037 cites W2883514733 @default.
- W4285390037 cites W2884561390 @default.
- W4285390037 cites W2891511539 @default.
- W4285390037 cites W2900677237 @default.
- W4285390037 cites W2901559346 @default.
- W4285390037 cites W2909481502 @default.
- W4285390037 cites W2912564039 @default.
- W4285390037 cites W2914733968 @default.
- W4285390037 cites W2946637494 @default.
- W4285390037 cites W2955058313 @default.
- W4285390037 cites W2962818306 @default.
- W4285390037 cites W2962900979 @default.
- W4285390037 cites W2963535787 @default.
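The abstract above reports accuracy via the Dice coefficient (DC) and Hausdorff boundary distance (HD). For reference, a standard formulation of these two measures is sketched below; the paper may use a particular variant (for example, a percentile HD), so treat this as the common textbook definition rather than the authors' exact implementation.

```latex
% Dice coefficient between the auto-contour voxel set A and the reference set G:
\mathrm{DC}(A,G) = \frac{2\,\lvert A \cap G \rvert}{\lvert A \rvert + \lvert G \rvert}

% Hausdorff distance between the boundary point sets \partial A and \partial G,
% with d(\cdot,\cdot) the Euclidean distance (reported in mm in the abstract):
\mathrm{HD}(A,G) = \max\!\left\{ \sup_{a \in \partial A} \inf_{g \in \partial G} d(a,g),\;
                                 \sup_{g \in \partial G} \inf_{a \in \partial A} d(a,g) \right\}
```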