Matches in SemOpenAlex for { <https://semopenalex.org/work/W2894944248> ?p ?o ?g. }
- W2894944248 abstract "In real-world scenarios, image classification models degrade in performance when images are corrupted with noise, since these models are trained on preprocessed data. Although deep neural networks (DNNs) are efficient for image classification owing to their deep layer-wise design, which emulates latent features from data, they suffer from the same noise issue. Noise in images is a common phenomenon in real-life scenarios, and a number of studies have been conducted over the previous couple of decades with the intention of overcoming the effect of noise in image data. The aim of this study was to investigate a better DNN-based noisy image classification system. First, autoencoder (AE)-based denoising techniques were considered to reconstruct the native image from the input noisy image. Then, a convolutional neural network (CNN) was employed to classify the reconstructed image, as the CNN is a prominent DNN method with the ability to preserve a better representation of the internal structure of image data. In the denoising step, a variety of existing AEs, namely the denoising autoencoder (DAE), convolutional denoising autoencoder (CDAE), and denoising variational autoencoder (DVAE), as well as two hybrid AEs (DAE-CDAE and DVAE-CDAE), were used. This study therefore considered five hybrid models for noisy image classification, termed DAE-CNN, CDAE-CNN, DVAE-CNN, DAE-CDAE-CNN, and DVAE-CDAE-CNN. The proposed hybrid classifiers were validated by experiments on two benchmark datasets (i.e., MNIST and CIFAR-10) after corrupting them with noise in various proportions. These methods outperformed some existing eminent methods, attaining satisfactory recognition accuracy even when the images were corrupted with 50% noise, although the models were trained with only 20% noise. Among the proposed methods, DVAE-CDAE-CNN was found to be better than the others for classifying massively noisy images, while DVAE-CNN was the most appropriate for regular noise. The main significance of this work is the employment of hybrid models with the complementary strengths of AEs and CNNs in noisy image classification. The AEs in the hybrid models enhanced the proficiency of the CNN in classifying highly noisy data even though it was trained with low-level noise." @default.
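The abstract's denoising stage can be illustrated with a minimal sketch. This is not the paper's implementation: it is a toy one-hidden-layer denoising autoencoder in plain NumPy, trained on synthetic 8-dimensional binary "images" with masking corruption. All names (`corrupt`, `forward`, layer sizes, learning rate) are illustrative assumptions; the paper's DAE/CDAE/DVAE models operate on MNIST/CIFAR-10 images with far larger networks.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny synthetic "images": 200 binary patterns of 8 pixels each (toy stand-in
# for MNIST; sizes are illustrative assumptions, not the paper's setup).
clean = rng.integers(0, 2, size=(200, 8)).astype(float)

def corrupt(x, level=0.2):
    """DAE-style masking noise: zero out a fraction `level` of the entries."""
    return x * (rng.random(x.shape) > level)

# One-hidden-layer denoising autoencoder with sigmoid units.
n_in, n_hid = 8, 16
W1 = rng.normal(0.0, 0.1, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0.0, 0.1, (n_hid, n_in)); b2 = np.zeros(n_in)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x):
    h = sigmoid(x @ W1 + b1)            # encoder: noisy input -> latent code
    return h, sigmoid(h @ W2 + b2)      # decoder: latent code -> reconstruction

def mse(pred, target):
    return float(np.mean((pred - target) ** 2))

# Reconstruction error before training, on freshly corrupted inputs.
_, rec0 = forward(corrupt(clean))
loss_before = mse(rec0, clean)

# Plain gradient descent on MSE; the key DAE idea is that the target is the
# CLEAN image while the network only ever sees the corrupted one.
lr = 1.0
for epoch in range(200):
    noisy = corrupt(clean)              # fresh corruption each epoch
    h, rec = forward(noisy)
    d_rec = (rec - clean) * rec * (1 - rec) * (2.0 / clean.size)
    dW2 = h.T @ d_rec;   db2 = d_rec.sum(axis=0)
    d_h = (d_rec @ W2.T) * h * (1 - h)
    dW1 = noisy.T @ d_h; db1 = d_h.sum(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

_, rec1 = forward(corrupt(clean))
loss_after = mse(rec1, clean)
```

In the paper's hybrid pipeline, the denoised reconstruction (here `rec1`) would then be passed to a CNN classifier; training the denoiser with 20% corruption while testing at higher noise levels mirrors the abstract's 20%-train / 50%-test experiment.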
- W2894944248 created "2018-10-12" @default.
- W2894944248 creator A5018961302 @default.
- W2894944248 creator A5043721074 @default.
- W2894944248 creator A5050423427 @default.
- W2894944248 date "2018-03-28" @default.
- W2894944248 modified "2023-10-16" @default.
- W2894944248 title "NOISY IMAGE CLASSIFICATION USING HYBRID DEEP LEARNING METHODS" @default.
- W2894944248 cites W1499798934 @default.
- W2894944248 cites W1523493493 @default.
- W2894944248 cites W2005876975 @default.
- W2894944248 cites W2015513598 @default.
- W2894944248 cites W2017257315 @default.
- W2894944248 cites W2025768430 @default.
- W2894944248 cites W2036109700 @default.
- W2894944248 cites W2037642501 @default.
- W2894944248 cites W2062118960 @default.
- W2894944248 cites W2072128103 @default.
- W2894944248 cites W2076063813 @default.
- W2894944248 cites W2084220915 @default.
- W2894944248 cites W2098477387 @default.
- W2894944248 cites W2105464873 @default.
- W2894944248 cites W2112796928 @default.
- W2894944248 cites W2118858186 @default.
- W2894944248 cites W2119821739 @default.
- W2894944248 cites W2124964692 @default.
- W2894944248 cites W2130604180 @default.
- W2894944248 cites W2136655611 @default.
- W2894944248 cites W2136922672 @default.
- W2894944248 cites W2141200610 @default.
- W2894944248 cites W2145094598 @default.
- W2894944248 cites W2146337213 @default.
- W2894944248 cites W2150134853 @default.
- W2894944248 cites W2151503710 @default.
- W2894944248 cites W2152417180 @default.
- W2894944248 cites W2153663612 @default.
- W2894944248 cites W2154683974 @default.
- W2894944248 cites W2156387975 @default.
- W2894944248 cites W2165146474 @default.
- W2894944248 cites W2165720259 @default.
- W2894944248 cites W2168809519 @default.
- W2894944248 cites W2168893862 @default.
- W2894944248 cites W2169805405 @default.
- W2894944248 cites W2293078015 @default.
- W2894944248 cites W2321627895 @default.
- W2894944248 cites W2510850936 @default.
- W2894944248 cites W2525737665 @default.
- W2894944248 cites W2739457413 @default.
- W2894944248 cites W2766736793 @default.
- W2894944248 cites W2963501406 @default.
- W2894944248 cites W2964153729 @default.
- W2894944248 cites W59771946 @default.
- W2894944248 doi "https://doi.org/10.32890/jict2018.17.2.8253" @default.
- W2894944248 hasPublicationYear "2018" @default.
- W2894944248 type Work @default.
- W2894944248 sameAs 2894944248 @default.
- W2894944248 citedByCount "5" @default.
- W2894944248 countsByYear W28949442482019 @default.
- W2894944248 countsByYear W28949442482020 @default.
- W2894944248 countsByYear W28949442482022 @default.
- W2894944248 countsByYear W28949442482023 @default.
- W2894944248 crossrefType "journal-article" @default.
- W2894944248 hasAuthorship W2894944248A5018961302 @default.
- W2894944248 hasAuthorship W2894944248A5043721074 @default.
- W2894944248 hasAuthorship W2894944248A5050423427 @default.
- W2894944248 hasBestOaLocation W28949442481 @default.
- W2894944248 hasConcept C101738243 @default.
- W2894944248 hasConcept C108583219 @default.
- W2894944248 hasConcept C115961682 @default.
- W2894944248 hasConcept C13280743 @default.
- W2894944248 hasConcept C153180895 @default.
- W2894944248 hasConcept C154945302 @default.
- W2894944248 hasConcept C163294075 @default.
- W2894944248 hasConcept C185798385 @default.
- W2894944248 hasConcept C190502265 @default.
- W2894944248 hasConcept C205649164 @default.
- W2894944248 hasConcept C41008148 @default.
- W2894944248 hasConcept C75294576 @default.
- W2894944248 hasConcept C81363708 @default.
- W2894944248 hasConcept C99498987 @default.
- W2894944248 hasConceptScore W2894944248C101738243 @default.
- W2894944248 hasConceptScore W2894944248C108583219 @default.
- W2894944248 hasConceptScore W2894944248C115961682 @default.
- W2894944248 hasConceptScore W2894944248C13280743 @default.
- W2894944248 hasConceptScore W2894944248C153180895 @default.
- W2894944248 hasConceptScore W2894944248C154945302 @default.
- W2894944248 hasConceptScore W2894944248C163294075 @default.
- W2894944248 hasConceptScore W2894944248C185798385 @default.
- W2894944248 hasConceptScore W2894944248C190502265 @default.
- W2894944248 hasConceptScore W2894944248C205649164 @default.
- W2894944248 hasConceptScore W2894944248C41008148 @default.
- W2894944248 hasConceptScore W2894944248C75294576 @default.
- W2894944248 hasConceptScore W2894944248C81363708 @default.
- W2894944248 hasConceptScore W2894944248C99498987 @default.
- W2894944248 hasLocation W28949442481 @default.
- W2894944248 hasOpenAccess W2894944248 @default.
- W2894944248 hasPrimaryLocation W28949442481 @default.
- W2894944248 hasRelatedWork W2732542196 @default.
- W2894944248 hasRelatedWork W2904927891 @default.
- W2894944248 hasRelatedWork W2947175736 @default.