Matches in SemOpenAlex for { <https://semopenalex.org/work/W3210283953> ?p ?o ?g. }
Showing items 1 to 93 of 93, with 100 items per page.
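The listing header shows the triple pattern behind this page. A minimal sketch of building the same query programmatically (the endpoint URL `https://semopenalex.org/sparql` and the `GRAPH`-based reading of the `?g` variable are assumptions, not confirmed by this listing):

```python
# Build the quad-pattern query from the listing header and prepare an
# HTTP GET against the (assumed) public SemOpenAlex SPARQL endpoint.
import urllib.parse

WORK_IRI = "https://semopenalex.org/work/W3210283953"
ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint


def build_query(work_iri: str) -> str:
    """Return a SELECT query matching the header pattern { <work> ?p ?o ?g. },
    reading ?g as the named graph containing each triple."""
    return f"SELECT ?p ?o ?g WHERE {{ GRAPH ?g {{ <{work_iri}> ?p ?o }} }}"


query = build_query(WORK_IRI)
url = ENDPOINT + "?" + urllib.parse.urlencode(
    {"query": query, "format": "application/sparql-results+json"}
)
```

Sending `url` with any HTTP client would return the 93 property/object pairs listed below as SPARQL JSON results.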
- W3210283953 endingPage "e126" @default.
- W3210283953 startingPage "e126" @default.
- W3210283953 abstract "Purpose/Objective(s): Contouring organs at risk (OARs) is a key step of radiotherapy treatment planning. In this study, a deep learning-based framework was developed to automate the whole OAR contouring process for clinical practice. Materials/Methods: The auto-segmentation framework consisted of a pre-processing module, a deep learning-based segmentation module, and a post-processing module. We focused on auto-segmentation of the head-and-neck and pelvis sites; CT images with manual OAR contours from 67 and 54 patient cases, respectively, were collected from our clinical database for deep learning. Sixteen head-and-neck OARs and 11 pelvis OARs that were contoured in most patient cases were selected for auto-segmentation. In the pre-processing module, the original image data were extracted from the clinical DICOM Image file and then processed for segmentation-network learning. A U-Net-like network was designed in the auto-segmentation module to enforce contour consistency among adjacent image slices and to speed up network convergence. Manual OAR contours were used as the ground truth for network training, and a weighted Dice loss function was used to handle missing organ contours in some patient cases. The segmentation results were converted to a DICOM Structure file in the post-processing module for treatment planning. Results: The accuracy of the current auto-segmentation framework in terms of the Dice similarity coefficient (DSC) is shown in the table. Once clinical DICOM Image data are sent to the framework, a corresponding DICOM Structure file is automatically generated containing the OAR segmentation results. The overall operation takes about 1-2 min per case without the need for additional manual intervention. Conclusion: A deep learning-based framework was developed for the auto-segmentation of head-and-neck and pelvis OARs with promising contouring accuracy and operation time, and is now ready to be applied in our clinical practice. Moreover, this framework is being developed toward a physician-specific framework that complies with the contouring style of each radiation oncologist separately." @default.
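The abstract reports accuracy as the Dice similarity coefficient and trains with a weighted Dice loss that tolerates missing organ contours. A minimal NumPy sketch of both ideas, assuming per-organ binary weights that zero out uncontoured organs (this is an illustration, not the authors' implementation):

```python
import numpy as np


def dice_coefficient(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-7) -> float:
    """Dice similarity coefficient (DSC) between two binary masks."""
    intersection = np.logical_and(pred, truth).sum()
    return (2.0 * intersection + eps) / (pred.sum() + truth.sum() + eps)


def weighted_dice_loss(probs, truths, organ_present, eps=1e-7):
    """Mean soft-Dice loss over organs, skipping organs whose manual
    contour is missing so they do not penalize training.

    probs:         (n_organs, H, W) predicted probabilities in [0, 1]
    truths:        (n_organs, H, W) binary ground-truth masks
    organ_present: (n_organs,) 1 if the organ was contoured, else 0
    """
    losses = []
    for p, t, w in zip(probs, truths, organ_present):
        if w == 0:
            continue  # missing contour: excluded from the loss
        inter = (p * t).sum()
        dice = (2.0 * inter + eps) / (p.sum() + t.sum() + eps)
        losses.append(1.0 - dice)
    return float(np.mean(losses)) if losses else 0.0
```

Zero-weighting uncontoured organs, rather than treating their empty masks as ground truth, avoids wrongly penalizing correct predictions on cases where that organ was simply never delineated.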
- W3210283953 created "2021-11-08" @default.
- W3210283953 creator A5008733898 @default.
- W3210283953 creator A5012581106 @default.
- W3210283953 creator A5023976830 @default.
- W3210283953 creator A5031290100 @default.
- W3210283953 creator A5039412958 @default.
- W3210283953 creator A5047821139 @default.
- W3210283953 creator A5048229838 @default.
- W3210283953 creator A5053147085 @default.
- W3210283953 creator A5053183388 @default.
- W3210283953 creator A5061512571 @default.
- W3210283953 date "2021-11-01" @default.
- W3210283953 modified "2023-09-26" @default.
- W3210283953 title "Development of an Automatic and Physician-Specific OAR Segmentation Framework for Radiotherapy Treatment Planning" @default.
- W3210283953 doi "https://doi.org/10.1016/j.ijrobp.2021.07.551" @default.
- W3210283953 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/34700735" @default.
- W3210283953 hasPublicationYear "2021" @default.
- W3210283953 type Work @default.
- W3210283953 sameAs 3210283953 @default.
- W3210283953 citedByCount "0" @default.
- W3210283953 crossrefType "journal-article" @default.
- W3210283953 hasAuthorship W3210283953A5008733898 @default.
- W3210283953 hasAuthorship W3210283953A5012581106 @default.
- W3210283953 hasAuthorship W3210283953A5023976830 @default.
- W3210283953 hasAuthorship W3210283953A5031290100 @default.
- W3210283953 hasAuthorship W3210283953A5039412958 @default.
- W3210283953 hasAuthorship W3210283953A5047821139 @default.
- W3210283953 hasAuthorship W3210283953A5048229838 @default.
- W3210283953 hasAuthorship W3210283953A5053147085 @default.
- W3210283953 hasAuthorship W3210283953A5053183388 @default.
- W3210283953 hasAuthorship W3210283953A5061512571 @default.
- W3210283953 hasBestOaLocation W32102839531 @default.
- W3210283953 hasConcept C108583219 @default.
- W3210283953 hasConcept C111919701 @default.
- W3210283953 hasConcept C121684516 @default.
- W3210283953 hasConcept C124504099 @default.
- W3210283953 hasConcept C126838900 @default.
- W3210283953 hasConcept C154945302 @default.
- W3210283953 hasConcept C163892561 @default.
- W3210283953 hasConcept C201645570 @default.
- W3210283953 hasConcept C22029948 @default.
- W3210283953 hasConcept C2524010 @default.
- W3210283953 hasConcept C2779104521 @default.
- W3210283953 hasConcept C31972630 @default.
- W3210283953 hasConcept C33923547 @default.
- W3210283953 hasConcept C41008148 @default.
- W3210283953 hasConcept C509974204 @default.
- W3210283953 hasConcept C71924100 @default.
- W3210283953 hasConcept C77331912 @default.
- W3210283953 hasConcept C89600930 @default.
- W3210283953 hasConcept C98045186 @default.
- W3210283953 hasConceptScore W3210283953C108583219 @default.
- W3210283953 hasConceptScore W3210283953C111919701 @default.
- W3210283953 hasConceptScore W3210283953C121684516 @default.
- W3210283953 hasConceptScore W3210283953C124504099 @default.
- W3210283953 hasConceptScore W3210283953C126838900 @default.
- W3210283953 hasConceptScore W3210283953C154945302 @default.
- W3210283953 hasConceptScore W3210283953C163892561 @default.
- W3210283953 hasConceptScore W3210283953C201645570 @default.
- W3210283953 hasConceptScore W3210283953C22029948 @default.
- W3210283953 hasConceptScore W3210283953C2524010 @default.
- W3210283953 hasConceptScore W3210283953C2779104521 @default.
- W3210283953 hasConceptScore W3210283953C31972630 @default.
- W3210283953 hasConceptScore W3210283953C33923547 @default.
- W3210283953 hasConceptScore W3210283953C41008148 @default.
- W3210283953 hasConceptScore W3210283953C509974204 @default.
- W3210283953 hasConceptScore W3210283953C71924100 @default.
- W3210283953 hasConceptScore W3210283953C77331912 @default.
- W3210283953 hasConceptScore W3210283953C89600930 @default.
- W3210283953 hasConceptScore W3210283953C98045186 @default.
- W3210283953 hasIssue "3" @default.
- W3210283953 hasLocation W32102839531 @default.
- W3210283953 hasLocation W32102839532 @default.
- W3210283953 hasOpenAccess W3210283953 @default.
- W3210283953 hasPrimaryLocation W32102839531 @default.
- W3210283953 hasRelatedWork W1669643531 @default.
- W3210283953 hasRelatedWork W2122581818 @default.
- W3210283953 hasRelatedWork W2630229246 @default.
- W3210283953 hasRelatedWork W2948658236 @default.
- W3210283953 hasRelatedWork W2973136608 @default.
- W3210283953 hasRelatedWork W2999580839 @default.
- W3210283953 hasRelatedWork W3135174555 @default.
- W3210283953 hasRelatedWork W3152950745 @default.
- W3210283953 hasRelatedWork W3210283953 @default.
- W3210283953 hasRelatedWork W4243168368 @default.
- W3210283953 hasVolume "111" @default.
- W3210283953 isParatext "false" @default.
- W3210283953 isRetracted "false" @default.
- W3210283953 magId "3210283953" @default.
- W3210283953 workType "article" @default.