Matches in SemOpenAlex for { <https://semopenalex.org/work/W4318832382> ?p ?o ?g. }
- W4318832382 endingPage "184" @default.
- W4318832382 startingPage "184" @default.
- W4318832382 abstract "Diagnostic results can be radically influenced by the quality of 2D ovarian-tumor ultrasound images. However, clinically processed 2D ovarian-tumor ultrasound images contain many artificially added symbols, such as fingers, crosses, dashed lines, and letters, which complicate artificial intelligence (AI)-based image recognition. These symbols are widely distributed within the lesion’s boundary and can interfere with the networks’ extraction of useful features, decreasing the accuracy of lesion classification and segmentation. Image inpainting techniques are used to remove noise and unwanted objects from images. To solve this problem, we examined the MMOTU dataset and built a 2D ovarian-tumor ultrasound image inpainting dataset by finely annotating the various symbols in the images. In this paper, we present a novel framework, the mask-guided generative adversarial network (MGGAN), that removes these symbols from 2D ovarian-tumor ultrasound images. The MGGAN performs well in corrupted regions by using an attention mechanism in the generator to emphasize valid information and ignore symbol information, making lesion boundaries more realistic. Moreover, fast Fourier convolutions (FFCs) and residual networks are used to enlarge the global receptive field, so our model can be applied to high-resolution ultrasound images. The greatest benefit of this algorithm is that it achieves pixel-level inpainting of distorted regions without requiring clean images. Compared with other models, ours achieved better results in a single stage in terms of both objective and subjective evaluations. Our model obtained the best results at both 256 × 256 and 512 × 512 resolutions. At 256 × 256, it achieved an SSIM of 0.9246, an FID of 22.66, and an LPIPS of 0.07806; at 512 × 512, an SSIM of 0.9208, an FID of 25.52, and an LPIPS of 0.08300. Our method can considerably improve the accuracy of computerized ovarian-tumor diagnosis. On the cleaned images, segmentation accuracy improved from 71.51% to 76.06% for the U-Net model and from 61.13% to 66.65% for the PSPNet model." @default.
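The abstract reports SSIM among its inpainting-quality metrics (0.9246 at 256 × 256, 0.9208 at 512 × 512). The paper does not specify its SSIM implementation, so the sketch below is only an illustration of the standard SSIM formula (Wang et al. 2004), computed over a single global window rather than the usual sliding Gaussian window; the sample pixel lists are made-up values, not data from the paper.

```python
def global_ssim(x, y, data_range=255.0):
    """Single-window SSIM over two equal-length grayscale pixel lists.

    Illustrative sketch only: real SSIM pipelines use a sliding
    Gaussian window; the constants C1/C2 follow the standard defaults
    (K1 = 0.01, K2 = 0.03).
    """
    n = len(x)
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mu_x = sum(x) / n
    mu_y = sum(y) / n
    # Sample (n-1) variances and covariance around the two means.
    var_x = sum((v - mu_x) ** 2 for v in x) / (n - 1)
    var_y = sum((v - mu_y) ** 2 for v in y) / (n - 1)
    cov = sum((a - mu_x) * (b - mu_y) for a, b in zip(x, y)) / (n - 1)
    return ((2 * mu_x * mu_y + c1) * (2 * cov + c2)) / (
        (mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2)
    )

# Made-up pixel values for illustration.
clean = [10.0, 50.0, 90.0, 130.0]
noisy = [12.0, 47.0, 95.0, 128.0]
print(global_ssim(clean, list(clean)))  # identical images -> 1.0
print(global_ssim(clean, noisy) < 1.0)  # any distortion lowers SSIM
```

A score of 1.0 means the inpainted image matches the reference exactly, which is why values near 0.92 indicate high structural fidelity in the restored regions.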
- W4318832382 created "2023-02-02" @default.
- W4318832382 creator A5015683050 @default.
- W4318832382 creator A5020090742 @default.
- W4318832382 creator A5028084698 @default.
- W4318832382 creator A5040437713 @default.
- W4318832382 creator A5048820407 @default.
- W4318832382 creator A5052443277 @default.
- W4318832382 creator A5087479926 @default.
- W4318832382 creator A5091772831 @default.
- W4318832382 date "2023-02-01" @default.
- W4318832382 modified "2023-10-01" @default.
- W4318832382 title "Improving the Segmentation Accuracy of Ovarian-Tumor Ultrasound Images Using Image Inpainting" @default.
- W4318832382 cites W1901129140 @default.
- W4318832382 cites W2074977333 @default.
- W4318832382 cites W2133665775 @default.
- W4318832382 cites W2194775991 @default.
- W4318832382 cites W2208471067 @default.
- W4318832382 cites W2331128040 @default.
- W4318832382 cites W2507486309 @default.
- W4318832382 cites W2560023338 @default.
- W4318832382 cites W2738588019 @default.
- W4318832382 cites W2794022343 @default.
- W4318832382 cites W2798365772 @default.
- W4318832382 cites W2807190811 @default.
- W4318832382 cites W2892004104 @default.
- W4318832382 cites W2897598705 @default.
- W4318832382 cites W2955805697 @default.
- W4318832382 cites W2959828872 @default.
- W4318832382 cites W2962785568 @default.
- W4318832382 cites W2963231084 @default.
- W4318832382 cites W2963270367 @default.
- W4318832382 cites W2963420272 @default.
- W4318832382 cites W2963800363 @default.
- W4318832382 cites W2964250774 @default.
- W4318832382 cites W2991372685 @default.
- W4318832382 cites W2991377405 @default.
- W4318832382 cites W2998957378 @default.
- W4318832382 cites W3010467706 @default.
- W4318832382 cites W3016488464 @default.
- W4318832382 cites W3034419329 @default.
- W4318832382 cites W3036319923 @default.
- W4318832382 cites W3036375271 @default.
- W4318832382 cites W3043547428 @default.
- W4318832382 cites W3096898300 @default.
- W4318832382 cites W3111390112 @default.
- W4318832382 cites W3132080514 @default.
- W4318832382 cites W3195025823 @default.
- W4318832382 cites W3196729209 @default.
- W4318832382 cites W3199003182 @default.
- W4318832382 cites W4205942815 @default.
- W4318832382 cites W4283216064 @default.
- W4318832382 cites W4293526483 @default.
- W4318832382 cites W4295936768 @default.
- W4318832382 doi "https://doi.org/10.3390/bioengineering10020184" @default.
- W4318832382 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/36829679" @default.
- W4318832382 hasPublicationYear "2023" @default.
- W4318832382 type Work @default.
- W4318832382 citedByCount "1" @default.
- W4318832382 countsByYear W43188323822023 @default.
- W4318832382 crossrefType "journal-article" @default.
- W4318832382 hasAuthorship W4318832382A5015683050 @default.
- W4318832382 hasAuthorship W4318832382A5020090742 @default.
- W4318832382 hasAuthorship W4318832382A5028084698 @default.
- W4318832382 hasAuthorship W4318832382A5040437713 @default.
- W4318832382 hasAuthorship W4318832382A5048820407 @default.
- W4318832382 hasAuthorship W4318832382A5052443277 @default.
- W4318832382 hasAuthorship W4318832382A5087479926 @default.
- W4318832382 hasAuthorship W4318832382A5091772831 @default.
- W4318832382 hasBestOaLocation W43188323821 @default.
- W4318832382 hasConcept C11413529 @default.
- W4318832382 hasConcept C115961682 @default.
- W4318832382 hasConcept C11727466 @default.
- W4318832382 hasConcept C121608353 @default.
- W4318832382 hasConcept C126322002 @default.
- W4318832382 hasConcept C138885662 @default.
- W4318832382 hasConcept C153180895 @default.
- W4318832382 hasConcept C154945302 @default.
- W4318832382 hasConcept C155512373 @default.
- W4318832382 hasConcept C180940675 @default.
- W4318832382 hasConcept C2776401178 @default.
- W4318832382 hasConcept C2777423100 @default.
- W4318832382 hasConcept C2780472235 @default.
- W4318832382 hasConcept C31972630 @default.
- W4318832382 hasConcept C41008148 @default.
- W4318832382 hasConcept C41895202 @default.
- W4318832382 hasConcept C52622490 @default.
- W4318832382 hasConcept C530470458 @default.
- W4318832382 hasConcept C71924100 @default.
- W4318832382 hasConcept C89600930 @default.
- W4318832382 hasConceptScore W4318832382C11413529 @default.
- W4318832382 hasConceptScore W4318832382C115961682 @default.
- W4318832382 hasConceptScore W4318832382C11727466 @default.
- W4318832382 hasConceptScore W4318832382C121608353 @default.
- W4318832382 hasConceptScore W4318832382C126322002 @default.
- W4318832382 hasConceptScore W4318832382C138885662 @default.
- W4318832382 hasConceptScore W4318832382C153180895 @default.
- W4318832382 hasConceptScore W4318832382C154945302 @default.