Matches in SemOpenAlex for { <https://semopenalex.org/work/W3148908072> ?p ?o ?g. }
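The quad pattern above can be expressed in standard SPARQL 1.1 with a `GRAPH` clause and run against a SPARQL endpoint. A minimal sketch follows, assuming the public endpoint lives at `https://semopenalex.org/sparql` and returns the standard SPARQL 1.1 JSON results format; the embedded sample binding (including its property and graph URIs) is illustrative, not copied from the service.

```python
import json

# Assumption: public SemOpenAlex SPARQL endpoint (not verified here).
ENDPOINT = "https://semopenalex.org/sparql"

WORK = "https://semopenalex.org/work/W3148908072"

# Standard-SPARQL equivalent of the quad pattern { <work> ?p ?o ?g. }:
# all predicate/object pairs for the work, plus the named graph of each triple.
query = f"SELECT ?p ?o ?g WHERE {{ GRAPH ?g {{ <{WORK}> ?p ?o }} }}"

# A real request would be something like:
#   requests.get(ENDPOINT, params={"query": query, "format": "json"})
# To keep this sketch runnable without network access, we parse a
# hand-written sample in the SPARQL 1.1 JSON results format instead.
sample = json.loads("""{
  "head": {"vars": ["p", "o", "g"]},
  "results": {"bindings": [
    {"p": {"type": "uri", "value": "https://semopenalex.org/property/citedByCount"},
     "o": {"type": "literal", "value": "106"},
     "g": {"type": "uri", "value": "https://semopenalex.org/graph/default"}}
  ]}
}""")

def rows(result):
    """Flatten SPARQL JSON bindings into (predicate, object, graph) tuples."""
    return [
        (b["p"]["value"], b["o"]["value"], b["g"]["value"])
        for b in result["results"]["bindings"]
    ]

for p, o, g in rows(sample):
    # Print just the local name of the predicate, as in the listing below.
    print(p.rsplit("/", 1)[-1], "->", o)
```

The same `rows` helper works unchanged on a live response, since any endpoint conforming to the SPARQL 1.1 JSON results format nests values under `results.bindings`.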
- W3148908072 endingPage "1590" @default.
- W3148908072 startingPage "1590" @default.
- W3148908072 abstract "Deep learning requires a large amount of data to perform well. However, the field of medical image analysis suffers from a lack of sufficient data for training deep learning models. Moreover, medical images require manual labeling, usually provided by human annotators coming from various backgrounds. More importantly, the annotation process is time-consuming, expensive, and prone to errors. Transfer learning was introduced to reduce the need for the annotation process by transferring the deep learning models with knowledge from a previous task and then by fine-tuning them on a relatively small dataset of the current task. Most of the methods of medical image classification employ transfer learning from pretrained models, e.g., ImageNet, which has been proven to be ineffective. This is due to the mismatch in learned features between the natural image, e.g., ImageNet, and medical images. Additionally, it results in the utilization of deeply elaborated models. In this paper, we propose a novel transfer learning approach to overcome the previous drawbacks by means of training the deep learning model on large unlabeled medical image datasets and by next transferring the knowledge to train the deep learning model on the small amount of labeled medical images. Additionally, we propose a new deep convolutional neural network (DCNN) model that combines recent advancements in the field. We conducted several experiments on two challenging medical imaging scenarios dealing with skin and breast cancer classification tasks. According to the reported results, it has been empirically proven that the proposed approach can significantly improve the performance of both classification scenarios. In terms of skin cancer, the proposed model achieved an F1-score value of 89.09% when trained from scratch and 98.53% with the proposed approach. Secondly, it achieved an accuracy value of 85.29% and 97.51%, respectively, when trained from scratch and using the proposed approach in the case of the breast cancer scenario. Finally, we concluded that our method can possibly be applied to many medical imaging problems in which a substantial amount of unlabeled image data is available and the labeled image data is limited. Moreover, it can be utilized to improve the performance of medical imaging tasks in the same domain. To do so, we used the pretrained skin cancer model to train on feet skin to classify them into two classes—either normal or abnormal (diabetic foot ulcer (DFU)). It achieved an F1-score value of 86.0% when trained from scratch, 96.25% using transfer learning, and 99.25% using double-transfer learning." @default.
- W3148908072 created "2021-04-13" @default.
- W3148908072 creator A5004901591 @default.
- W3148908072 creator A5008534489 @default.
- W3148908072 creator A5010765865 @default.
- W3148908072 creator A5016153721 @default.
- W3148908072 creator A5029299521 @default.
- W3148908072 creator A5031986832 @default.
- W3148908072 creator A5036536042 @default.
- W3148908072 creator A5084844724 @default.
- W3148908072 creator A5090966997 @default.
- W3148908072 date "2021-03-30" @default.
- W3148908072 modified "2023-10-10" @default.
- W3148908072 title "Novel Transfer Learning Approach for Medical Imaging with Limited Labeled Data" @default.
- W3148908072 cites W1996828958 @default.
- W3148908072 cites W2002507614 @default.
- W3148908072 cites W2118023920 @default.
- W3148908072 cites W2248620004 @default.
- W3148908072 cites W2344480160 @default.
- W3148908072 cites W2557738935 @default.
- W3148908072 cites W2559090303 @default.
- W3148908072 cites W2581082771 @default.
- W3148908072 cites W2582187633 @default.
- W3148908072 cites W2607363228 @default.
- W3148908072 cites W2620578070 @default.
- W3148908072 cites W2787196779 @default.
- W3148908072 cites W2885824038 @default.
- W3148908072 cites W2890183881 @default.
- W3148908072 cites W2919115771 @default.
- W3148908072 cites W2919358988 @default.
- W3148908072 cites W2928842276 @default.
- W3148908072 cites W2942231644 @default.
- W3148908072 cites W2947272707 @default.
- W3148908072 cites W2954074628 @default.
- W3148908072 cites W2963466845 @default.
- W3148908072 cites W2964274014 @default.
- W3148908072 cites W2969790209 @default.
- W3148908072 cites W3003416411 @default.
- W3148908072 cites W3006054280 @default.
- W3148908072 cites W3009210879 @default.
- W3148908072 cites W3010540599 @default.
- W3148908072 cites W3036935029 @default.
- W3148908072 cites W3037294774 @default.
- W3148908072 cites W3038835370 @default.
- W3148908072 cites W3084438349 @default.
- W3148908072 cites W3100321043 @default.
- W3148908072 cites W3102785203 @default.
- W3148908072 cites W3105070630 @default.
- W3148908072 cites W3110223342 @default.
- W3148908072 cites W3113056502 @default.
- W3148908072 cites W3118596356 @default.
- W3148908072 cites W3120518903 @default.
- W3148908072 cites W3120748058 @default.
- W3148908072 cites W3121263745 @default.
- W3148908072 cites W3124465767 @default.
- W3148908072 cites W3126423192 @default.
- W3148908072 cites W3128259432 @default.
- W3148908072 doi "https://doi.org/10.3390/cancers13071590" @default.
- W3148908072 hasPubMedCentralId "https://www.ncbi.nlm.nih.gov/pmc/articles/8036379" @default.
- W3148908072 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/33808207" @default.
- W3148908072 hasPublicationYear "2021" @default.
- W3148908072 type Work @default.
- W3148908072 sameAs 3148908072 @default.
- W3148908072 citedByCount "106" @default.
- W3148908072 countsByYear W31489080722021 @default.
- W3148908072 countsByYear W31489080722022 @default.
- W3148908072 countsByYear W31489080722023 @default.
- W3148908072 crossrefType "journal-article" @default.
- W3148908072 hasAuthorship W3148908072A5004901591 @default.
- W3148908072 hasAuthorship W3148908072A5008534489 @default.
- W3148908072 hasAuthorship W3148908072A5010765865 @default.
- W3148908072 hasAuthorship W3148908072A5016153721 @default.
- W3148908072 hasAuthorship W3148908072A5029299521 @default.
- W3148908072 hasAuthorship W3148908072A5031986832 @default.
- W3148908072 hasAuthorship W3148908072A5036536042 @default.
- W3148908072 hasAuthorship W3148908072A5084844724 @default.
- W3148908072 hasAuthorship W3148908072A5090966997 @default.
- W3148908072 hasBestOaLocation W31489080721 @default.
- W3148908072 hasConcept C108583219 @default.
- W3148908072 hasConcept C111919701 @default.
- W3148908072 hasConcept C115961682 @default.
- W3148908072 hasConcept C119857082 @default.
- W3148908072 hasConcept C150899416 @default.
- W3148908072 hasConcept C153180895 @default.
- W3148908072 hasConcept C154945302 @default.
- W3148908072 hasConcept C162324750 @default.
- W3148908072 hasConcept C187736073 @default.
- W3148908072 hasConcept C202444582 @default.
- W3148908072 hasConcept C2776321320 @default.
- W3148908072 hasConcept C2780451532 @default.
- W3148908072 hasConcept C31601959 @default.
- W3148908072 hasConcept C33923547 @default.
- W3148908072 hasConcept C41008148 @default.
- W3148908072 hasConcept C75294576 @default.
- W3148908072 hasConcept C81363708 @default.
- W3148908072 hasConcept C9652623 @default.
- W3148908072 hasConcept C98045186 @default.
- W3148908072 hasConceptScore W3148908072C108583219 @default.