Matches in SemOpenAlex for { <https://semopenalex.org/work/W4313194737> ?p ?o ?g. }
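A minimal sketch of how this quad pattern could be retrieved programmatically, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql accepts GET queries and returns SPARQL JSON results (the endpoint URL and result handling are assumptions, not taken from the listing below); the matched quads follow after the sketch.

```python
import requests

# Assumed SemOpenAlex SPARQL endpoint; check the service documentation.
ENDPOINT = "https://semopenalex.org/sparql"

# Quad pattern corresponding to the listing below:
# every predicate ?p, object ?o, and graph ?g attached to the work.
QUERY = """
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W4313194737> ?p ?o .
  }
}
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

# Print one line per matched quad: predicate, object, graph.
for binding in response.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"], binding["g"]["value"])
```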
- W4313194737 endingPage "122301" @default.
- W4313194737 startingPage "122286" @default.
- W4313194737 abstract "Image denoising is a highly challenging yet important task in image processing. Recently, many CNN-based denoising methods have achieved strong performance, but they commonly denoise texture and non-texture regions blindly together. This frequently leads to excessive texture smoothing and detail loss. To address this issue, we propose a novel region-adaptive denoising network that adjusts the denoising strength according to region textureness. The proposed network conducts denoising for texture and non-texture areas independently to improve the visual quality of the resulting image. To this end, we first generate a texture map that separates the image into texture and non-texture regions. Because the difference between texture and non-texture is more evident in the frequency domain than in the spatial domain, the classification is performed through the discrete cosine transform (DCT). Second, guided by the texture map, denoising is performed independently in two subnets corresponding to the texture and non-texture regions. This allows the texture subnet to avoid excessive smoothing of high-frequency details, and the non-texture subnet to maximize noise reduction in flat regions. Finally, a cross fusion that takes into account the intra- and inter-relationships between the two resulting features is proposed. The cross fusion highlights the discriminative features from the two subnets without degradation when combining their outputs, and thus helps enhance the performance of region-adaptive denoising. The superiority of the proposed method is validated on both synthetic and real-world images. We demonstrate that our method outperforms existing methods in both objective scores and subjective image quality, showing particularly outstanding results in the restoration of visually sensitive textures. Furthermore, an ablation study shows that our network can adaptively control the noise-removal strength by manually manipulating the texture map and that the details of the texture region can be further improved. This can also simplify the cumbersome noise-tuning process when deploying deep neural network (DNN) architectures into products." @default.
- W4313194737 created "2023-01-06" @default.
- W4313194737 creator A5024668714 @default.
- W4313194737 creator A5027438574 @default.
- W4313194737 creator A5029962556 @default.
- W4313194737 creator A5058792290 @default.
- W4313194737 creator A5076095055 @default.
- W4313194737 date "2022-01-01" @default.
- W4313194737 modified "2023-09-27" @default.
- W4313194737 title "Deep Region Adaptive Denoising for Texture Enhancement" @default.
- W4313194737 cites W1584663654 @default.
- W4313194737 cites W1978749115 @default.
- W4313194737 cites W1993205988 @default.
- W4313194737 cites W2037642501 @default.
- W4313194737 cites W2047920195 @default.
- W4313194737 cites W2048695508 @default.
- W4313194737 cites W2056370875 @default.
- W4313194737 cites W2121927366 @default.
- W4313194737 cites W2550553598 @default.
- W4313194737 cites W2556872594 @default.
- W4313194737 cites W2576300543 @default.
- W4313194737 cites W2741137940 @default.
- W4313194737 cites W2752782242 @default.
- W4313194737 cites W2914992179 @default.
- W4313194737 cites W2955058313 @default.
- W4313194737 cites W2963393688 @default.
- W4313194737 cites W2963686971 @default.
- W4313194737 cites W2963725279 @default.
- W4313194737 cites W2964046397 @default.
- W4313194737 cites W2964125708 @default.
- W4313194737 cites W2971140156 @default.
- W4313194737 cites W2971719842 @default.
- W4313194737 cites W2974915688 @default.
- W4313194737 cites W2986670728 @default.
- W4313194737 cites W3000775737 @default.
- W4313194737 cites W3002005943 @default.
- W4313194737 cites W3009428327 @default.
- W4313194737 cites W3034504121 @default.
- W4313194737 cites W3034598912 @default.
- W4313194737 cites W3035356612 @default.
- W4313194737 cites W3035588244 @default.
- W4313194737 cites W3104725225 @default.
- W4313194737 cites W3137277224 @default.
- W4313194737 cites W3151130473 @default.
- W4313194737 cites W3167738729 @default.
- W4313194737 cites W3171125843 @default.
- W4313194737 cites W3182000414 @default.
- W4313194737 cites W3194730817 @default.
- W4313194737 cites W3195061344 @default.
- W4313194737 cites W3200670215 @default.
- W4313194737 cites W4205453461 @default.
- W4313194737 cites W4225672218 @default.
- W4313194737 cites W4226290276 @default.
- W4313194737 cites W4242059867 @default.
- W4313194737 cites W4296551003 @default.
- W4313194737 doi "https://doi.org/10.1109/access.2022.3222826" @default.
- W4313194737 hasPublicationYear "2022" @default.
- W4313194737 type Work @default.
- W4313194737 citedByCount "0" @default.
- W4313194737 crossrefType "journal-article" @default.
- W4313194737 hasAuthorship W4313194737A5024668714 @default.
- W4313194737 hasAuthorship W4313194737A5027438574 @default.
- W4313194737 hasAuthorship W4313194737A5029962556 @default.
- W4313194737 hasAuthorship W4313194737A5058792290 @default.
- W4313194737 hasAuthorship W4313194737A5076095055 @default.
- W4313194737 hasBestOaLocation W43131947371 @default.
- W4313194737 hasConcept C115961682 @default.
- W4313194737 hasConcept C144743038 @default.
- W4313194737 hasConcept C153180895 @default.
- W4313194737 hasConcept C154945302 @default.
- W4313194737 hasConcept C163294075 @default.
- W4313194737 hasConcept C2221639 @default.
- W4313194737 hasConcept C2781195486 @default.
- W4313194737 hasConcept C31972630 @default.
- W4313194737 hasConcept C3770464 @default.
- W4313194737 hasConcept C41008148 @default.
- W4313194737 hasConcept C54243161 @default.
- W4313194737 hasConcept C63099799 @default.
- W4313194737 hasConcept C9417928 @default.
- W4313194737 hasConcept C99498987 @default.
- W4313194737 hasConceptScore W4313194737C115961682 @default.
- W4313194737 hasConceptScore W4313194737C144743038 @default.
- W4313194737 hasConceptScore W4313194737C153180895 @default.
- W4313194737 hasConceptScore W4313194737C154945302 @default.
- W4313194737 hasConceptScore W4313194737C163294075 @default.
- W4313194737 hasConceptScore W4313194737C2221639 @default.
- W4313194737 hasConceptScore W4313194737C2781195486 @default.
- W4313194737 hasConceptScore W4313194737C31972630 @default.
- W4313194737 hasConceptScore W4313194737C3770464 @default.
- W4313194737 hasConceptScore W4313194737C41008148 @default.
- W4313194737 hasConceptScore W4313194737C54243161 @default.
- W4313194737 hasConceptScore W4313194737C63099799 @default.
- W4313194737 hasConceptScore W4313194737C9417928 @default.
- W4313194737 hasConceptScore W4313194737C99498987 @default.
- W4313194737 hasFunder F4320321260 @default.
- W4313194737 hasLocation W43131947371 @default.
- W4313194737 hasLocation W43131947372 @default.
- W4313194737 hasOpenAccess W4313194737 @default.
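As a rough illustration of the texture-map step described in the abstract above, the sketch below classifies 8x8 blocks as texture or non-texture by the share of their DCT energy outside the DC coefficient. The block size, threshold, and energy criterion are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from scipy.fft import dctn

def texture_map(image, block=8, threshold=0.05):
    """Classify each block as texture (1) or non-texture (0) from its DCT energy.

    A block is called 'texture' when the fraction of its energy outside the DC
    coefficient exceeds `threshold`. Block size and threshold are illustrative
    choices, not values from the paper.
    """
    h, w = image.shape
    mask = np.zeros((h // block, w // block), dtype=np.uint8)
    for i in range(h // block):
        for j in range(w // block):
            patch = image[i * block:(i + 1) * block, j * block:(j + 1) * block]
            coeffs = dctn(patch, norm="ortho")        # 2-D DCT of the block
            total = np.sum(coeffs ** 2) + 1e-12       # total block energy
            ac = total - coeffs[0, 0] ** 2            # energy excluding the DC term
            mask[i, j] = 1 if ac / total > threshold else 0
    # Upsample the block-level decision back to pixel resolution.
    return np.kron(mask, np.ones((block, block), dtype=np.uint8))


# Usage example: left half is a smooth ramp (non-texture), right half has
# high-frequency stripes (texture); a little noise is added to both.
rng = np.random.default_rng(0)
img = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))
img[:, 32:] += 0.5 * np.sin(np.arange(32) * 2.0)
img += 0.01 * rng.standard_normal(img.shape)
print(texture_map(img).mean())  # fraction of pixels flagged as texture
```

In a region-adaptive pipeline like the one the abstract describes, such a binary (or soft) map would gate which pixels are routed to the texture subnet versus the non-texture subnet, so that smoothing strength can differ between the two regions.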