Matches in SemOpenAlex for { <https://semopenalex.org/work/W4285160408> ?p ?o ?g. }
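The triples listed below can also be fetched programmatically. A minimal sketch, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql and the Python SPARQLWrapper package (both assumptions, not stated in this listing); the `?g` position of the pattern (the named graph) is omitted for simplicity:

```python
# Minimal sketch: retrieve all predicate/object pairs for work W4285160408.
# The endpoint URL is an assumption about the public SemOpenAlex service;
# adjust it if your deployment differs.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint

QUERY = """
SELECT ?p ?o
WHERE {
  <https://semopenalex.org/work/W4285160408> ?p ?o .
}
LIMIT 500
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(QUERY)
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    # Each row corresponds to one "- W4285160408 <predicate> <object>" line below.
    print(binding["p"]["value"], binding["o"]["value"])
```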
- W4285160408 endingPage "55532" @default.
- W4285160408 startingPage "55522" @default.
- W4285160408 abstract "Diabetic Retinopathy (DR) is a disease caused by high glucose levels in the retinal vessels. It puts millions of people around the world at risk of vision loss each year. Since DR is a life-threatening disease, early diagnosis can be an effective step in treatment and in the prevention of vision loss. To automate the early diagnosis process, computer-aided diagnosis methods are useful not only for detecting diabetic signatures but also for providing the diabetic grade, which helps the optometrist determine an appropriate treatment. Several deep classification models have been proposed in the literature for the diabetic retinopathy classification task; however, these methods usually lack an attention mechanism to better encode semantic dependencies and highlight the most important regions, which would boost model performance. To overcome these limitations, we propose to incorporate a style and content recalibration mechanism inside the deep neural network to adaptively scale the informative regions for diabetic retinopathy classification. In our proposed method, the input image passes through an encoder module that encodes both high-level and semantic features. Next, using a content and style separation mechanism, we decompose the representational space into a style representation (e.g., texture features) and a content representation (e.g., semantic and contextual features). The texture attention module takes the style representation and applies a high-pass filter to highlight texture information, while the spatial normalization module uses a convolutional operation to determine the most informative regions inside the retinopathy image for detecting diabetic signs. Once the attention modules have been applied to the representational features, the fusion module combines both features to form a normalized representation for the decoding path. The decoder module in our model performs both diabetic grading and healthy/non-healthy classification. Our experiments on the APTOS Kaggle dataset (accuracy 0.85) demonstrate a significant improvement over previous work in the literature, which shows the applicability of our method in real-world scenarios." @default.
- W4285160408 created "2022-07-14" @default.
- W4285160408 creator A5072138648 @default.
- W4285160408 date "2022-01-01" @default.
- W4285160408 modified "2023-10-17" @default.
- W4285160408 title "Texture Attention Network for Diabetic Retinopathy Classification" @default.
- W4285160408 cites W1903029394 @default.
- W4285160408 cites W1982168774 @default.
- W4285160408 cites W2010099973 @default.
- W4285160408 cites W2097117768 @default.
- W4285160408 cites W2114063873 @default.
- W4285160408 cites W2145491883 @default.
- W4285160408 cites W2157984366 @default.
- W4285160408 cites W2166460195 @default.
- W4285160408 cites W2169961704 @default.
- W4285160408 cites W2183341477 @default.
- W4285160408 cites W2194775991 @default.
- W4285160408 cites W2412782625 @default.
- W4285160408 cites W2531409750 @default.
- W4285160408 cites W2575748305 @default.
- W4285160408 cites W2752747624 @default.
- W4285160408 cites W2755665484 @default.
- W4285160408 cites W2769713325 @default.
- W4285160408 cites W2791447208 @default.
- W4285160408 cites W2806860374 @default.
- W4285160408 cites W2886848602 @default.
- W4285160408 cites W2906424845 @default.
- W4285160408 cites W2906933145 @default.
- W4285160408 cites W2924509510 @default.
- W4285160408 cites W2952436003 @default.
- W4285160408 cites W2953029621 @default.
- W4285160408 cites W2963292306 @default.
- W4285160408 cites W2963881378 @default.
- W4285160408 cites W2964118901 @default.
- W4285160408 cites W2966362431 @default.
- W4285160408 cites W2987039128 @default.
- W4285160408 cites W3004535422 @default.
- W4285160408 cites W3008531067 @default.
- W4285160408 cites W3084156874 @default.
- W4285160408 cites W3090496569 @default.
- W4285160408 cites W3103855452 @default.
- W4285160408 cites W3104610662 @default.
- W4285160408 cites W3120995853 @default.
- W4285160408 cites W3153976901 @default.
- W4285160408 cites W3194982898 @default.
- W4285160408 cites W3202088472 @default.
- W4285160408 cites W3203565236 @default.
- W4285160408 cites W4210538228 @default.
- W4285160408 doi "https://doi.org/10.1109/access.2022.3177651" @default.
- W4285160408 hasPublicationYear "2022" @default.
- W4285160408 type Work @default.
- W4285160408 citedByCount "10" @default.
- W4285160408 countsByYear W42851604082022 @default.
- W4285160408 countsByYear W42851604082023 @default.
- W4285160408 crossrefType "journal-article" @default.
- W4285160408 hasAuthorship W4285160408A5072138648 @default.
- W4285160408 hasBestOaLocation W42851604081 @default.
- W4285160408 hasConcept C115961682 @default.
- W4285160408 hasConcept C134018914 @default.
- W4285160408 hasConcept C136886441 @default.
- W4285160408 hasConcept C144024400 @default.
- W4285160408 hasConcept C153180895 @default.
- W4285160408 hasConcept C154945302 @default.
- W4285160408 hasConcept C19165224 @default.
- W4285160408 hasConcept C2779829184 @default.
- W4285160408 hasConcept C31972630 @default.
- W4285160408 hasConcept C41008148 @default.
- W4285160408 hasConcept C52622490 @default.
- W4285160408 hasConcept C555293320 @default.
- W4285160408 hasConcept C71924100 @default.
- W4285160408 hasConcept C75294576 @default.
- W4285160408 hasConcept C81363708 @default.
- W4285160408 hasConceptScore W4285160408C115961682 @default.
- W4285160408 hasConceptScore W4285160408C134018914 @default.
- W4285160408 hasConceptScore W4285160408C136886441 @default.
- W4285160408 hasConceptScore W4285160408C144024400 @default.
- W4285160408 hasConceptScore W4285160408C153180895 @default.
- W4285160408 hasConceptScore W4285160408C154945302 @default.
- W4285160408 hasConceptScore W4285160408C19165224 @default.
- W4285160408 hasConceptScore W4285160408C2779829184 @default.
- W4285160408 hasConceptScore W4285160408C31972630 @default.
- W4285160408 hasConceptScore W4285160408C41008148 @default.
- W4285160408 hasConceptScore W4285160408C52622490 @default.
- W4285160408 hasConceptScore W4285160408C555293320 @default.
- W4285160408 hasConceptScore W4285160408C71924100 @default.
- W4285160408 hasConceptScore W4285160408C75294576 @default.
- W4285160408 hasConceptScore W4285160408C81363708 @default.
- W4285160408 hasLocation W42851604081 @default.
- W4285160408 hasOpenAccess W4285160408 @default.
- W4285160408 hasPrimaryLocation W42851604081 @default.
- W4285160408 hasRelatedWork W1540444031 @default.
- W4285160408 hasRelatedWork W2406522397 @default.
- W4285160408 hasRelatedWork W2533072256 @default.
- W4285160408 hasRelatedWork W2738461075 @default.
- W4285160408 hasRelatedWork W2767651786 @default.
- W4285160408 hasRelatedWork W2912288872 @default.
- W4285160408 hasRelatedWork W2940977206 @default.
- W4285160408 hasRelatedWork W3005023910 @default.
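The attention scheme described in the abstract of W4285160408 above (style/content separation, texture attention via high-pass filtering, convolutional spatial normalization, and fusion) could look roughly like the following. This is a loose, hypothetical PyTorch sketch for illustration only; the module names, channel split, kernel choices, and sizes are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a style/content recalibration block:
# split an encoder feature map channel-wise into "style" and "content" branches,
# emphasize texture in the style branch with a fixed high-pass (Laplacian) filter,
# weight the content branch with a convolutional spatial-attention map, then fuse.
import torch
import torch.nn as nn
import torch.nn.functional as F


class TextureAttention(nn.Module):
    """High-pass filtering of the style branch to highlight texture cues."""

    def __init__(self, channels: int):
        super().__init__()
        # Fixed 3x3 Laplacian kernel applied depthwise as a simple high-pass filter.
        kernel = torch.tensor([[0., -1., 0.], [-1., 4., -1.], [0., -1., 0.]])
        self.register_buffer("kernel", kernel.expand(channels, 1, 3, 3).clone())
        self.channels = channels

    def forward(self, style: torch.Tensor) -> torch.Tensor:
        high_freq = F.conv2d(style, self.kernel, padding=1, groups=self.channels)
        # Gate the style features by their high-frequency (texture) response.
        return style * torch.sigmoid(high_freq)


class SpatialNormalization(nn.Module):
    """Convolutional spatial attention over the content branch."""

    def __init__(self, channels: int):
        super().__init__()
        self.attn = nn.Conv2d(channels, 1, kernel_size=7, padding=3)

    def forward(self, content: torch.Tensor) -> torch.Tensor:
        return content * torch.sigmoid(self.attn(content))


class StyleContentRecalibration(nn.Module):
    """Split features channel-wise into style/content, attend to each, then fuse."""

    def __init__(self, channels: int):
        super().__init__()
        half = channels // 2
        self.texture = TextureAttention(half)
        self.spatial = SpatialNormalization(channels - half)
        self.fuse = nn.Conv2d(channels, channels, kernel_size=1)

    def forward(self, features: torch.Tensor) -> torch.Tensor:
        half = features.shape[1] // 2
        style, content = features[:, :half], features[:, half:]
        fused = torch.cat([self.texture(style), self.spatial(content)], dim=1)
        return self.fuse(fused)


if __name__ == "__main__":
    block = StyleContentRecalibration(channels=64)
    out = block(torch.randn(2, 64, 32, 32))  # e.g., an encoder feature map
    print(out.shape)  # torch.Size([2, 64, 32, 32])
```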