Matches in SemOpenAlex for { <https://semopenalex.org/work/W3092183036> ?p ?o ?g. }
- W3092183036 abstract "This paper introduces multi-level feature learning on the embedding layer of a convolutional autoencoder (CAE-MLE) as a novel approach in deep clustering. We use agglomerative clustering as the multi-level feature learning step, which provides a hierarchical structure on the latent feature space. It is shown that applying multi-level feature learning considerably improves the basic deep convolutional embedding clustering (DCEC). CAE-MLE optimizes the clustering loss of agglomerative clustering simultaneously alongside the latent feature learning of the CAE. Following previous works on inverse feature learning, we show that representation learning of error, as a general strategy, can be applied to different deep clustering approaches and leads to promising results. We develop deep inverse feature learning (deep IFL) on CAE-MLE as a novel approach that leads to state-of-the-art results among methods of the same category. The experimental results show that CAE-MLE improves the results of the basic method, DCEC, by around 7%-14% on the two well-known datasets MNIST and USPS. It is also shown that the proposed deep IFL improves the primary results by about 9%-17%. Therefore, both proposed approaches, CAE-MLE and deep IFL based on CAE-MLE, can lead to notable performance improvements in comparison to the majority of existing techniques. The proposed approaches, while based on a basic convolutional autoencoder, lead to outstanding results even in comparison to variational autoencoders or generative adversarial networks." @default.
- W3092183036 created "2020-10-15" @default.
- W3092183036 creator A5035395012 @default.
- W3092183036 creator A5042116950 @default.
- W3092183036 date "2020-10-05" @default.
- W3092183036 modified "2023-09-27" @default.
- W3092183036 title "Multi-level Feature Learning on Embedding Layer of Convolutional Autoencoders and Deep Inverse Feature Learning for Image Clustering" @default.
- W3092183036 cites W2016381774 @default.
- W3092183036 cites W2020735245 @default.
- W3092183036 cites W2051549110 @default.
- W3092183036 cites W2063123441 @default.
- W3092183036 cites W2097922870 @default.
- W3092183036 cites W2100495367 @default.
- W3092183036 cites W2110798204 @default.
- W3092183036 cites W2112796928 @default.
- W3092183036 cites W2129793592 @default.
- W3092183036 cites W2131828344 @default.
- W3092183036 cites W2145094598 @default.
- W3092183036 cites W2152322845 @default.
- W3092183036 cites W2156483112 @default.
- W3092183036 cites W2164136210 @default.
- W3092183036 cites W2173520492 @default.
- W3092183036 cites W2176950688 @default.
- W3092183036 cites W2178768799 @default.
- W3092183036 cites W2187089797 @default.
- W3092183036 cites W2293078015 @default.
- W3092183036 cites W2533545350 @default.
- W3092183036 cites W2556467266 @default.
- W3092183036 cites W2593814746 @default.
- W3092183036 cites W2603986758 @default.
- W3092183036 cites W2608862709 @default.
- W3092183036 cites W2730106296 @default.
- W3092183036 cites W2741943936 @default.
- W3092183036 cites W2750102740 @default.
- W3092183036 cites W2765741717 @default.
- W3092183036 cites W2777922598 @default.
- W3092183036 cites W2779692282 @default.
- W3092183036 cites W2781711557 @default.
- W3092183036 cites W2784962210 @default.
- W3092183036 cites W2786958346 @default.
- W3092183036 cites W2803296075 @default.
- W3092183036 cites W2884851420 @default.
- W3092183036 cites W2886255780 @default.
- W3092183036 cites W2886643713 @default.
- W3092183036 cites W2887147027 @default.
- W3092183036 cites W2890018624 @default.
- W3092183036 cites W2896971048 @default.
- W3092183036 cites W2902930721 @default.
- W3092183036 cites W2903328039 @default.
- W3092183036 cites W2911208170 @default.
- W3092183036 cites W2913341171 @default.
- W3092183036 cites W2948684212 @default.
- W3092183036 cites W2951004968 @default.
- W3092183036 cites W2952508194 @default.
- W3092183036 cites W2953791858 @default.
- W3092183036 cites W2955750207 @default.
- W3092183036 cites W2962852342 @default.
- W3092183036 cites W2962997960 @default.
- W3092183036 cites W2963226019 @default.
- W3092183036 cites W2963761396 @default.
- W3092183036 cites W2964074409 @default.
- W3092183036 cites W2964118618 @default.
- W3092183036 cites W2967433321 @default.
- W3092183036 cites W2967973127 @default.
- W3092183036 cites W2972333410 @default.
- W3092183036 cites W2986063762 @default.
- W3092183036 cites W2993554215 @default.
- W3092183036 cites W2999320175 @default.
- W3092183036 cites W3003813325 @default.
- W3092183036 cites W3010583870 @default.
- W3092183036 cites W3015318159 @default.
- W3092183036 cites W3015476709 @default.
- W3092183036 cites W3015507004 @default.
- W3092183036 cites W3025791769 @default.
- W3092183036 cites W3042468733 @default.
- W3092183036 cites W3101709902 @default.
- W3092183036 cites W630242894 @default.
- W3092183036 cites W82771173 @default.
- W3092183036 hasPublicationYear "2020" @default.
- W3092183036 type Work @default.
- W3092183036 sameAs 3092183036 @default.
- W3092183036 citedByCount "0" @default.
- W3092183036 crossrefType "posted-content" @default.
- W3092183036 hasAuthorship W3092183036A5035395012 @default.
- W3092183036 hasAuthorship W3092183036A5042116950 @default.
- W3092183036 hasConcept C101738243 @default.
- W3092183036 hasConcept C108583219 @default.
- W3092183036 hasConcept C119857082 @default.
- W3092183036 hasConcept C138885662 @default.
- W3092183036 hasConcept C153180895 @default.
- W3092183036 hasConcept C154945302 @default.
- W3092183036 hasConcept C190502265 @default.
- W3092183036 hasConcept C2776401178 @default.
- W3092183036 hasConcept C41008148 @default.
- W3092183036 hasConcept C41608201 @default.
- W3092183036 hasConcept C41895202 @default.
- W3092183036 hasConcept C59404180 @default.
- W3092183036 hasConcept C73555534 @default.
- W3092183036 hasConcept C81363708 @default.
- W3092183036 hasConceptScore W3092183036C101738243 @default.
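For reference, the following is a minimal SPARQL sketch that reproduces the listing above. It assumes the public SemOpenAlex SPARQL endpoint (commonly https://semopenalex.org/sparql, not stated in this listing) and rewrites the quad-style pattern `?p ?o ?g` from the header as a standard `GRAPH` clause.

```sparql
# List every predicate/object pair attached to this work, together with
# the named graph it comes from, mirroring the header pattern
# { <https://semopenalex.org/work/W3092183036> ?p ?o ?g. }
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W3092183036> ?p ?o .
  }
}
ORDER BY ?p
```

Dropping the `GRAPH ?g { ... }` wrapper and selecting only `?p ?o` yields the same property/value pairs without the graph column shown here as `@default`.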