Matches in SemOpenAlex for { <https://semopenalex.org/work/W4306294877> ?p ?o ?g. }
- W4306294877 endingPage "24" @default.
- W4306294877 startingPage "11" @default.
- W4306294877 abstract "Content-based image retrieval (CBIR) is a recent method used to retrieve different types of images from repositories. Traditional content-based medical image retrieval (CBMIR) methods commonly use low-level image representation features extracted from color, texture, and shape descriptors. Since most CBMIR systems depend mainly on the extracted features, the feature extraction phase is critical. Feature extraction methods that generate inaccurate features lead to very poor retrieval performance because they widen the semantic gap. Hence, there is high demand for domain-knowledge-independent feature extraction methods that can learn automatically from input images. Pre-trained deep convolutional neural networks (CNNs), the recent generation of deep learning networks, can be used to extract expressive and accurate features. The main advantage of these pre-trained CNN models is that they are trained on huge image datasets spanning thousands of classes, and the knowledge acquired during training is easily transferred. Pre-trained CNN models have been applied successfully to medical image retrieval, image classification, and object recognition. This study utilizes two of the best-known pre-trained CNN models, ResNet18 and SqueezeNet, for the offline feature extraction stage, and the highly accurate features extracted from medical images are then used for CBMIR. Two popular medical image datasets, Kvasir and PH2, are used to show that the proposed methods achieve good retrieval results. The proposed method attains an average precision of 97.75% on Kvasir and 83.33% on PH2, outperforming several state-of-the-art methods in this field, because these pre-trained CNNs have layers well trained on a huge variety of image types. Finally, intensive statistical analysis shows that the proposed ResNet18-based retrieval method performs best, enhancing both recall and precision on both datasets." @default.
- W4306294877 created "2022-10-15" @default.
- W4306294877 creator A5003556745 @default.
- W4306294877 creator A5061173802 @default.
- W4306294877 creator A5066654528 @default.
- W4306294877 date "2022-12-01" @default.
- W4306294877 modified "2023-09-22" @default.
- W4306294877 title "Pre-trained convolution neural networks models for content-based medical image retrieval" @default.
- W4306294877 cites W159379614 @default.
- W4306294877 cites W1596717185 @default.
- W4306294877 cites W1991084032 @default.
- W4306294877 cites W2016131368 @default.
- W4306294877 cites W2020018365 @default.
- W4306294877 cites W2034196113 @default.
- W4306294877 cites W2045947690 @default.
- W4306294877 cites W2046589280 @default.
- W4306294877 cites W2061253660 @default.
- W4306294877 cites W2068874033 @default.
- W4306294877 cites W2072372801 @default.
- W4306294877 cites W2076154873 @default.
- W4306294877 cites W2082453965 @default.
- W4306294877 cites W2097117768 @default.
- W4306294877 cites W2098371241 @default.
- W4306294877 cites W2101828407 @default.
- W4306294877 cites W2103519186 @default.
- W4306294877 cites W2108598243 @default.
- W4306294877 cites W2110112487 @default.
- W4306294877 cites W2113180829 @default.
- W4306294877 cites W2117395697 @default.
- W4306294877 cites W2122762031 @default.
- W4306294877 cites W2123229215 @default.
- W4306294877 cites W2125283600 @default.
- W4306294877 cites W2130660124 @default.
- W4306294877 cites W2133854877 @default.
- W4306294877 cites W2183341477 @default.
- W4306294877 cites W2194775991 @default.
- W4306294877 cites W2203738484 @default.
- W4306294877 cites W2221243399 @default.
- W4306294877 cites W2270192120 @default.
- W4306294877 cites W22818532 @default.
- W4306294877 cites W2508457857 @default.
- W4306294877 cites W2536971076 @default.
- W4306294877 cites W2538039981 @default.
- W4306294877 cites W2601707599 @default.
- W4306294877 cites W2623808523 @default.
- W4306294877 cites W2757416624 @default.
- W4306294877 cites W2757455114 @default.
- W4306294877 cites W2761488830 @default.
- W4306294877 cites W2810155558 @default.
- W4306294877 cites W2943858452 @default.
- W4306294877 cites W2964350391 @default.
- W4306294877 cites W2964816710 @default.
- W4306294877 cites W2967817062 @default.
- W4306294877 cites W2985328368 @default.
- W4306294877 cites W2995942064 @default.
- W4306294877 cites W3019531985 @default.
- W4306294877 cites W3023003654 @default.
- W4306294877 cites W3028231159 @default.
- W4306294877 cites W3032877681 @default.
- W4306294877 cites W3070218879 @default.
- W4306294877 cites W3080174621 @default.
- W4306294877 cites W3087143278 @default.
- W4306294877 cites W3091477051 @default.
- W4306294877 cites W3092842991 @default.
- W4306294877 cites W3095773643 @default.
- W4306294877 cites W3100449589 @default.
- W4306294877 cites W3100548413 @default.
- W4306294877 cites W3107979957 @default.
- W4306294877 cites W3126956426 @default.
- W4306294877 cites W3157328277 @default.
- W4306294877 cites W3163320234 @default.
- W4306294877 cites W3171169449 @default.
- W4306294877 cites W3176876845 @default.
- W4306294877 cites W3189808986 @default.
- W4306294877 cites W3198147788 @default.
- W4306294877 cites W3016166335 @default.
- W4306294877 doi "https://doi.org/10.21833/ijaas.2022.12.002" @default.
- W4306294877 hasPublicationYear "2022" @default.
- W4306294877 type Work @default.
- W4306294877 citedByCount "0" @default.
- W4306294877 crossrefType "journal-article" @default.
- W4306294877 hasAuthorship W4306294877A5003556745 @default.
- W4306294877 hasAuthorship W4306294877A5061173802 @default.
- W4306294877 hasAuthorship W4306294877A5066654528 @default.
- W4306294877 hasBestOaLocation W43062948771 @default.
- W4306294877 hasConcept C108583219 @default.
- W4306294877 hasConcept C111919701 @default.
- W4306294877 hasConcept C115961682 @default.
- W4306294877 hasConcept C138885662 @default.
- W4306294877 hasConcept C153180895 @default.
- W4306294877 hasConcept C154945302 @default.
- W4306294877 hasConcept C1667742 @default.
- W4306294877 hasConcept C199579030 @default.
- W4306294877 hasConcept C2776401178 @default.
- W4306294877 hasConcept C2780052074 @default.
- W4306294877 hasConcept C31972630 @default.
- W4306294877 hasConcept C41008148 @default.
- W4306294877 hasConcept C41895202 @default.
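The abstract above outlines the retrieval pipeline: extract deep features offline with a pre-trained ResNet18 (or SqueezeNet) and rank repository images by feature similarity. This record does not specify the framework, feature layer, or distance metric, so the following is a minimal sketch only, assuming PyTorch/torchvision, features taken from ResNet18's global-average-pooling output (the 512-d vector before the classifier), and cosine similarity as the ranking score; the file paths are hypothetical placeholders, and the paper's actual preprocessing and metric may differ.

```python
# Minimal CBMIR feature-extraction sketch (assumptions: PyTorch/torchvision,
# ResNet18 pooled features, cosine similarity; paths are placeholders).
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

# Load ImageNet-pre-trained ResNet18 and drop the final classifier so the
# network outputs the 512-d global-average-pooled feature vector.
backbone = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
backbone.fc = torch.nn.Identity()
backbone.eval()

# Standard ImageNet preprocessing (assumed; the paper may use other settings).
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def extract_feature(path: str) -> torch.Tensor:
    """Return an L2-normalized 512-d feature for one image."""
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    return F.normalize(backbone(img), dim=1).squeeze(0)

# Offline stage: index the image repository once.
gallery_paths = ["kvasir/img_001.jpg", "kvasir/img_002.jpg"]  # placeholders
gallery = torch.stack([extract_feature(p) for p in gallery_paths])

# Online stage: rank gallery images by cosine similarity to a query image.
query = extract_feature("kvasir/query.jpg")  # placeholder
scores = gallery @ query  # dot product of normalized vectors = cosine similarity
ranking = scores.argsort(descending=True)
print([gallery_paths[i] for i in ranking])
```

In this sketch the gallery matrix is built once offline, so ranking a query reduces to a single matrix-vector product; because the features are L2-normalized, that product is exactly the cosine similarity.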