Matches in SemOpenAlex for { <https://semopenalex.org/work/W4285992128> ?p ?o ?g. }
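The header above shows the triple pattern that produced this listing. As a reproducibility aid, here is a minimal Python sketch that runs the equivalent SELECT query. It assumes SemOpenAlex exposes a standard SPARQL endpoint at https://semopenalex.org/sparql; that URL is not confirmed by this listing, so treat it as an assumption.

```python
# Minimal sketch: fetch all predicate/object pairs for this work, mirroring
# the { <work> ?p ?o ?g } pattern above. Endpoint URL is an assumption.
import json
import urllib.parse
import urllib.request

QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W4285992128> ?p ?o .
}
"""

url = "https://semopenalex.org/sparql?" + urllib.parse.urlencode({"query": QUERY})
req = urllib.request.Request(url, headers={"Accept": "application/sparql-results+json"})
with urllib.request.urlopen(req) as resp:
    results = json.load(resp)

for binding in results["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```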
- W4285992128 endingPage "103997" @default.
- W4285992128 startingPage "103997" @default.
- W4285992128 abstract "Skin cancer is the most common cancer worldwide, and malignant melanoma in particular can reduce life expectancy to under five years. With early detection and recognition, even the deadliest melanoma can be treated, greatly increasing the patient’s survival rate. Dermoscopy imaging captures high-resolution magnified images of the affected skin region for automatic lesion classification, and deep learning networks have shown great potential for accurately recognizing different types of skin lesions. This study aims to develop a novel deep model that enhances skin lesion recognition performance. Despite remarkable progress, existing deep-network-based methods naively transfer architectures designed for generic image classification to skin lesion classification, leaving considerable room for improvement. We present an enhanced deep bottleneck transformer model that incorporates self-attention to model the global correlation of features extracted by conventional deep models. Specifically, we design an enhanced transformer module with a dual position encoding scheme that integrates encoded position vectors into both the key and query vectors for balanced learning. By replacing the bottleneck spatial convolutions of the late-stage blocks in the baseline networks with this enhanced module, we construct a novel deep skin lesion classification model. We conduct extensive experiments on two benchmark skin lesion datasets, ISIC2017 and HAM10000, to verify the recognition performance of different deep models. On ISIC2017, our method reaches 92.1% accuracy, 90.1% sensitivity, and 91.9% specificity, demonstrating a very good balance between sensitivity and specificity; on HAM10000, it achieves 95.84% accuracy and 96.1% precision. Results on both datasets demonstrate that the proposed model outperforms both the baseline models and state-of-the-art methods. These results, obtained by combining transformer and convolution modules, should inspire further research on applying transformer-based blocks to real-world scenarios without large-scale datasets." @default.
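The abstract describes the key architectural change in enough detail to sketch it. Below is a hedged PyTorch sketch of the dual position encoding self-attention as we read the abstract: learned position embeddings are added to both the query and the key vectors before attention. The class name, the shared position embedding, and all shapes are our own assumptions, not the authors' released code.

```python
# Hedged sketch of a "dual position encoding" self-attention block, inferred
# from the abstract. All names, shapes, and the shared position embedding are
# assumptions for illustration only.
import torch
import torch.nn as nn


class DualPosSelfAttention(nn.Module):
    def __init__(self, dim: int, heads: int = 4, feat_size: int = 7):
        super().__init__()
        assert dim % heads == 0
        self.heads, self.dh = heads, dim // heads
        self.to_qkv = nn.Conv2d(dim, dim * 3, kernel_size=1, bias=False)
        # One learned position embedding per spatial location, shared between
        # the query and key paths (our assumption); expects h == w == feat_size.
        self.pos = nn.Parameter(torch.randn(1, dim, feat_size, feat_size) * 0.02)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        q, k, v = self.to_qkv(x).chunk(3, dim=1)
        q = q + self.pos  # position injected into queries ...
        k = k + self.pos  # ... and into keys: the "dual" encoding

        # Flatten spatial dims and split heads: (b, heads, h*w, dh).
        def split(t: torch.Tensor) -> torch.Tensor:
            return t.reshape(b, self.heads, self.dh, h * w).transpose(-2, -1)

        q, k, v = map(split, (q, k, v))
        attn = torch.softmax(q @ k.transpose(-2, -1) / self.dh ** 0.5, dim=-1)
        return (attn @ v).transpose(-2, -1).reshape(b, c, h, w)


# Usage: output keeps the input shape, so the module can stand in for a
# 3x3 spatial convolution inside a late-stage bottleneck block.
x = torch.randn(2, 64, 7, 7)
y = DualPosSelfAttention(64)(x)
assert y.shape == x.shape
```

In a BoTNet-style hybrid, a module like this would replace the 3x3 spatial convolution inside the late-stage residual bottleneck blocks, matching the replacement strategy the abstract describes; the exact position encoding used by the authors may differ.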
- W4285992128 created "2022-07-21" @default.
- W4285992128 creator A5004305504 @default.
- W4285992128 creator A5044216245 @default.
- W4285992128 creator A5086851360 @default.
- W4285992128 date "2022-09-01" @default.
- W4285992128 modified "2023-10-16" @default.
- W4285992128 title "Enhanced deep bottleneck transformer model for skin lesion classification" @default.
- W4285992128 cites W1572889567 @default.
- W4285992128 cites W1981650444 @default.
- W4285992128 cites W1982704641 @default.
- W4285992128 cites W2022306670 @default.
- W4285992128 cites W2038781708 @default.
- W4285992128 cites W2040600853 @default.
- W4285992128 cites W2049209034 @default.
- W4285992128 cites W2061576204 @default.
- W4285992128 cites W2096031071 @default.
- W4285992128 cites W2143164411 @default.
- W4285992128 cites W2152950860 @default.
- W4285992128 cites W2164273268 @default.
- W4285992128 cites W2194775991 @default.
- W4285992128 cites W2413794162 @default.
- W4285992128 cites W2426942631 @default.
- W4285992128 cites W2752782242 @default.
- W4285992128 cites W2884585870 @default.
- W4285992128 cites W2895340641 @default.
- W4285992128 cites W2914959431 @default.
- W4285992128 cites W2949676527 @default.
- W4285992128 cites W2955058313 @default.
- W4285992128 cites W2963059730 @default.
- W4285992128 cites W2963446712 @default.
- W4285992128 cites W2963495494 @default.
- W4285992128 cites W2963954913 @default.
- W4285992128 cites W2981413347 @default.
- W4285992128 cites W2983446232 @default.
- W4285992128 cites W3006694830 @default.
- W4285992128 cites W3034885317 @default.
- W4285992128 cites W3097065222 @default.
- W4285992128 cites W3102785203 @default.
- W4285992128 cites W3134084943 @default.
- W4285992128 cites W3172509117 @default.
- W4285992128 doi "https://doi.org/10.1016/j.bspc.2022.103997" @default.
- W4285992128 hasPublicationYear "2022" @default.
- W4285992128 type Work @default.
- W4285992128 citedByCount "7" @default.
- W4285992128 countsByYear W42859921282023 @default.
- W4285992128 crossrefType "journal-article" @default.
- W4285992128 hasAuthorship W4285992128A5004305504 @default.
- W4285992128 hasAuthorship W4285992128A5044216245 @default.
- W4285992128 hasAuthorship W4285992128A5086851360 @default.
- W4285992128 hasConcept C108583219 @default.
- W4285992128 hasConcept C119857082 @default.
- W4285992128 hasConcept C121608353 @default.
- W4285992128 hasConcept C126322002 @default.
- W4285992128 hasConcept C142724271 @default.
- W4285992128 hasConcept C149635348 @default.
- W4285992128 hasConcept C153180895 @default.
- W4285992128 hasConcept C154945302 @default.
- W4285992128 hasConcept C165696696 @default.
- W4285992128 hasConcept C2777789703 @default.
- W4285992128 hasConcept C2780513914 @default.
- W4285992128 hasConcept C2781156865 @default.
- W4285992128 hasConcept C38652104 @default.
- W4285992128 hasConcept C41008148 @default.
- W4285992128 hasConcept C46686674 @default.
- W4285992128 hasConcept C71924100 @default.
- W4285992128 hasConceptScore W4285992128C108583219 @default.
- W4285992128 hasConceptScore W4285992128C119857082 @default.
- W4285992128 hasConceptScore W4285992128C121608353 @default.
- W4285992128 hasConceptScore W4285992128C126322002 @default.
- W4285992128 hasConceptScore W4285992128C142724271 @default.
- W4285992128 hasConceptScore W4285992128C149635348 @default.
- W4285992128 hasConceptScore W4285992128C153180895 @default.
- W4285992128 hasConceptScore W4285992128C154945302 @default.
- W4285992128 hasConceptScore W4285992128C165696696 @default.
- W4285992128 hasConceptScore W4285992128C2777789703 @default.
- W4285992128 hasConceptScore W4285992128C2780513914 @default.
- W4285992128 hasConceptScore W4285992128C2781156865 @default.
- W4285992128 hasConceptScore W4285992128C38652104 @default.
- W4285992128 hasConceptScore W4285992128C41008148 @default.
- W4285992128 hasConceptScore W4285992128C46686674 @default.
- W4285992128 hasConceptScore W4285992128C71924100 @default.
- W4285992128 hasLocation W42859921281 @default.
- W4285992128 hasOpenAccess W4285992128 @default.
- W4285992128 hasPrimaryLocation W42859921281 @default.
- W4285992128 hasRelatedWork W3014300295 @default.
- W4285992128 hasRelatedWork W3164822677 @default.
- W4285992128 hasRelatedWork W4223943233 @default.
- W4285992128 hasRelatedWork W4225161397 @default.
- W4285992128 hasRelatedWork W4250304930 @default.
- W4285992128 hasRelatedWork W4312200629 @default.
- W4285992128 hasRelatedWork W4360585206 @default.
- W4285992128 hasRelatedWork W4364306694 @default.
- W4285992128 hasRelatedWork W4380075502 @default.
- W4285992128 hasRelatedWork W4380086463 @default.
- W4285992128 hasVolume "78" @default.
- W4285992128 isParatext "false" @default.
- W4285992128 isRetracted "false" @default.