Matches in SemOpenAlex for { <https://semopenalex.org/work/W4387402761> ?p ?o ?g. }
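The graph pattern above can be issued as a SPARQL query against SemOpenAlex. A minimal sketch in Python, which only builds the query string, the request URL, and parses a SPARQL-JSON response; the endpoint URL (`https://semopenalex.org/sparql`) and the `format=json` parameter are assumptions to verify against the service's documentation:

```python
# Sketch: query SemOpenAlex for all triples of a given work.
# Assumptions (not confirmed by the listing above): the SPARQL endpoint
# URL and the "format=json" parameter. No network call is made here;
# this only constructs the request and parses a response body.
import json
import urllib.parse

SPARQL_ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint

def build_query(work_iri: str) -> str:
    """SPARQL SELECT equivalent of the quad pattern { <work> ?p ?o ?g. }."""
    return f"SELECT ?p ?o ?g WHERE {{ GRAPH ?g {{ <{work_iri}> ?p ?o }} }}"

def request_url(query: str) -> str:
    """URL for a GET request expecting SPARQL JSON results."""
    return SPARQL_ENDPOINT + "?" + urllib.parse.urlencode(
        {"query": query, "format": "json"})

def bindings(results_json: str) -> list[dict]:
    """Extract predicate/object pairs from a SPARQL JSON response."""
    data = json.loads(results_json)
    return [{"p": b["p"]["value"], "o": b["o"]["value"]}
            for b in data["results"]["bindings"]]
```

Each row returned this way corresponds to one `?p ?o` line in the listing below.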
- W4387402761 abstract "Background Gastric cancer is a highly prevalent and fatal disease. Accurate differentiation between early gastric cancer (EGC) and advanced gastric cancer (AGC) is essential for personalized treatment. Currently, the diagnostic accuracy of computed tomography (CT) for gastric cancer staging is insufficient to meet clinical requirements. Many studies rely on manual marking of lesion areas, which is not suitable for clinical diagnosis. Methods In this study, we retrospectively collected data from 341 patients with gastric cancer at the First Affiliated Hospital of Wenzhou Medical University. The dataset was randomly divided into a training set (n=273) and a validation set (n=68) at an 8:2 ratio. We developed a two-stage deep learning model that enables fully automated EGC screening based on CT images. In the first stage, an unsupervised domain-adaptive segmentation model was employed to automatically segment the stomach on unlabeled portal-phase CT images. Subsequently, based on the stomach segmentation results, the stomach region was cropped from each image and scaled to a uniform size, and the EGC and AGC classification models were then built on these cropped images. The segmentation accuracy of the model was evaluated using the Dice index, while the classification performance was assessed using metrics such as the area under the curve (AUC) of the receiver operating characteristic (ROC), accuracy, sensitivity, specificity, and F1 score. Results The segmentation model achieved an average Dice score of 0.94 on the manually segmented validation set. On the training set, the EGC screening model demonstrated an AUC, accuracy, sensitivity, specificity, and F1 score of 0.98, 0.93, 0.92, 0.92, and 0.93, respectively. On the validation set, these metrics were 0.96, 0.92, 0.90, 0.89, and 0.93, respectively. 
After three rounds of data regrouping, the model consistently achieved an AUC above 0.9 on both the training set and the validation set. Conclusion The results of this study demonstrate that the proposed method can effectively screen for EGC in portal venous phase CT images. Furthermore, the model exhibits stability and holds promise for future clinical applications." @default.
- W4387402761 created "2023-10-07" @default.
- W4387402761 creator A5004515162 @default.
- W4387402761 creator A5007878481 @default.
- W4387402761 creator A5027339105 @default.
- W4387402761 creator A5031804038 @default.
- W4387402761 creator A5034654778 @default.
- W4387402761 creator A5053573702 @default.
- W4387402761 creator A5063141445 @default.
- W4387402761 creator A5074141895 @default.
- W4387402761 creator A5085088788 @default.
- W4387402761 creator A5088316353 @default.
- W4387402761 date "2023-10-06" @default.
- W4387402761 modified "2023-10-07" @default.
- W4387402761 title "Development of a deep learning model for early gastric cancer diagnosis using preoperative computed tomography images" @default.
- W4387402761 cites W2036841150 @default.
- W4387402761 cites W2119939308 @default.
- W4387402761 cites W2464708700 @default.
- W4387402761 cites W2621233965 @default.
- W4387402761 cites W2806676728 @default.
- W4387402761 cites W2889646458 @default.
- W4387402761 cites W2912290257 @default.
- W4387402761 cites W2913054914 @default.
- W4387402761 cites W2962793481 @default.
- W4387402761 cites W2977240018 @default.
- W4387402761 cites W2983817809 @default.
- W4387402761 cites W3033366394 @default.
- W4387402761 cites W3048802680 @default.
- W4387402761 cites W3049757379 @default.
- W4387402761 cites W3091321893 @default.
- W4387402761 cites W3108932871 @default.
- W4387402761 cites W3119005666 @default.
- W4387402761 cites W3121908290 @default.
- W4387402761 cites W3133858468 @default.
- W4387402761 cites W3134475970 @default.
- W4387402761 cites W3175762528 @default.
- W4387402761 cites W3183176920 @default.
- W4387402761 cites W4200111739 @default.
- W4387402761 cites W4206988975 @default.
- W4387402761 cites W4206994254 @default.
- W4387402761 cites W4210352025 @default.
- W4387402761 cites W4210741977 @default.
- W4387402761 cites W4210990885 @default.
- W4387402761 cites W4220667190 @default.
- W4387402761 cites W4220749789 @default.
- W4387402761 cites W4220849944 @default.
- W4387402761 cites W4223644163 @default.
- W4387402761 cites W4280532785 @default.
- W4387402761 cites W4280550458 @default.
- W4387402761 cites W4281698007 @default.
- W4387402761 cites W4290659600 @default.
- W4387402761 cites W4292263747 @default.
- W4387402761 cites W4294168434 @default.
- W4387402761 cites W4298152235 @default.
- W4387402761 cites W4300089522 @default.
- W4387402761 cites W4303044063 @default.
- W4387402761 cites W4309040010 @default.
- W4387402761 cites W4311046065 @default.
- W4387402761 cites W4312749457 @default.
- W4387402761 cites W4362615850 @default.
- W4387402761 cites W4367182093 @default.
- W4387402761 cites W4382654777 @default.
- W4387402761 cites W4386066119 @default.
- W4387402761 doi "https://doi.org/10.3389/fonc.2023.1265366" @default.
- W4387402761 hasPublicationYear "2023" @default.
- W4387402761 type Work @default.
- W4387402761 citedByCount "0" @default.
- W4387402761 crossrefType "journal-article" @default.
- W4387402761 hasAuthorship W4387402761A5004515162 @default.
- W4387402761 hasAuthorship W4387402761A5007878481 @default.
- W4387402761 hasAuthorship W4387402761A5027339105 @default.
- W4387402761 hasAuthorship W4387402761A5031804038 @default.
- W4387402761 hasAuthorship W4387402761A5034654778 @default.
- W4387402761 hasAuthorship W4387402761A5053573702 @default.
- W4387402761 hasAuthorship W4387402761A5063141445 @default.
- W4387402761 hasAuthorship W4387402761A5074141895 @default.
- W4387402761 hasAuthorship W4387402761A5085088788 @default.
- W4387402761 hasAuthorship W4387402761A5088316353 @default.
- W4387402761 hasBestOaLocation W43874027611 @default.
- W4387402761 hasConcept C121608353 @default.
- W4387402761 hasConcept C124504099 @default.
- W4387402761 hasConcept C126322002 @default.
- W4387402761 hasConcept C126838900 @default.
- W4387402761 hasConcept C146357865 @default.
- W4387402761 hasConcept C151730666 @default.
- W4387402761 hasConcept C153180895 @default.
- W4387402761 hasConcept C154945302 @default.
- W4387402761 hasConcept C163892561 @default.
- W4387402761 hasConcept C2779422922 @default.
- W4387402761 hasConcept C2779454504 @default.
- W4387402761 hasConcept C41008148 @default.
- W4387402761 hasConcept C544519230 @default.
- W4387402761 hasConcept C58471807 @default.
- W4387402761 hasConcept C58489278 @default.
- W4387402761 hasConcept C71924100 @default.
- W4387402761 hasConcept C86803240 @default.
- W4387402761 hasConcept C89600930 @default.
- W4387402761 hasConceptScore W4387402761C121608353 @default.
- W4387402761 hasConceptScore W4387402761C124504099 @default.
- W4387402761 hasConceptScore W4387402761C126322002 @default.