Matches in SemOpenAlex for { <https://semopenalex.org/work/W2174554700> ?p ?o ?g. }
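A pattern like `{ <work-IRI> ?p ?o ?g. }` can be evaluated against a SPARQL endpoint to retrieve every predicate/object pair (and named graph) attached to the work. A minimal sketch of doing so in Python, using only the standard library; the endpoint URL `https://semopenalex.org/sparql` is an assumption based on the project name, and `fetch_bindings` is a hypothetical helper, not part of any SemOpenAlex client:

```python
import json
import urllib.parse
import urllib.request

# Assumed public SPARQL endpoint for SemOpenAlex (verify before relying on it).
ENDPOINT = "https://semopenalex.org/sparql"
WORK_IRI = "https://semopenalex.org/work/W2174554700"


def build_query(work_iri: str) -> str:
    """Turn the quad pattern from the header into a SPARQL SELECT.

    The ?g variable binds the named graph each triple lives in, matching
    the four-position pattern { <work> ?p ?o ?g. } shown above.
    """
    return (
        "SELECT ?p ?o ?g WHERE { "
        f"GRAPH ?g {{ <{work_iri}> ?p ?o . }} "
        "}"
    )


def fetch_bindings(work_iri: str):
    """Run the query over HTTP and return the JSON result bindings.

    Each binding is a dict with 'p', 'o', and 'g' keys, i.e. one line
    of the listing below.
    """
    url = ENDPOINT + "?query=" + urllib.parse.quote(build_query(work_iri))
    req = urllib.request.Request(
        url, headers={"Accept": "application/sparql-results+json"}
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)["results"]["bindings"]
```

Each returned binding corresponds to one `- W2174554700 <predicate> <object> @<graph>.` line in the listing that follows.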
- W2174554700 endingPage "2376" @default.
- W2174554700 startingPage "2368" @default.
- W2174554700 abstract "The Ladder Network is a recent new approach to semi-supervised learning that turned out to be very successful. While showing impressive performance, the Ladder Network has many components intertwined, whose contributions are not obvious in such a complex architecture. This paper presents an extensive experimental investigation of variants of the Ladder Network in which we replaced or removed individual components to learn about their relative importance. For semi-supervised tasks, we conclude that the most important contribution is made by the lateral connections, followed by the application of noise, and the choice of what we refer to as the 'combinator function'. As the number of labeled training examples increases, the lateral connections and the reconstruction criterion become less important, with most of the generalization improvement coming from the injection of noise in each layer. Finally, we introduce a combinator function that reduces test error rates on Permutation-Invariant MNIST to 0.57% for the supervised setting, and to 0.97% and 1.0% for semi-supervised settings with 1000 and 100 labeled examples, respectively." @default.
- W2174554700 created "2016-06-24" @default.
- W2174554700 creator A5007854188 @default.
- W2174554700 creator A5031185465 @default.
- W2174554700 creator A5047130674 @default.
- W2174554700 creator A5054905472 @default.
- W2174554700 creator A5086198262 @default.
- W2174554700 date "2016-06-19" @default.
- W2174554700 modified "2023-10-01" @default.
- W2174554700 title "Deconstructing the ladder network architecture" @default.
- W2174554700 cites W1026270304 @default.
- W2174554700 cites W1532854728 @default.
- W2174554700 cites W1606458877 @default.
- W2174554700 cites W1904365287 @default.
- W2174554700 cites W1964155876 @default.
- W2174554700 cites W2025768430 @default.
- W2174554700 cites W2095705004 @default.
- W2174554700 cites W2097998348 @default.
- W2174554700 cites W2108501770 @default.
- W2174554700 cites W2108677974 @default.
- W2174554700 cites W2134842679 @default.
- W2174554700 cites W2136922672 @default.
- W2174554700 cites W2145094598 @default.
- W2174554700 cites W2147062276 @default.
- W2174554700 cites W2162262658 @default.
- W2174554700 cites W2194775991 @default.
- W2174554700 cites W2949117887 @default.
- W2174554700 cites W2949416428 @default.
- W2174554700 cites W2962897886 @default.
- W2174554700 cites W2964121744 @default.
- W2174554700 cites W2964199361 @default.
- W2174554700 cites W2964300310 @default.
- W2174554700 cites W830076066 @default.
- W2174554700 hasPublicationYear "2016" @default.
- W2174554700 type Work @default.
- W2174554700 sameAs 2174554700 @default.
- W2174554700 citedByCount "23" @default.
- W2174554700 countsByYear W21745547002016 @default.
- W2174554700 countsByYear W21745547002017 @default.
- W2174554700 countsByYear W21745547002018 @default.
- W2174554700 countsByYear W21745547002019 @default.
- W2174554700 countsByYear W21745547002020 @default.
- W2174554700 countsByYear W21745547002021 @default.
- W2174554700 crossrefType "proceedings-article" @default.
- W2174554700 hasAuthorship W2174554700A5007854188 @default.
- W2174554700 hasAuthorship W2174554700A5031185465 @default.
- W2174554700 hasAuthorship W2174554700A5047130674 @default.
- W2174554700 hasAuthorship W2174554700A5054905472 @default.
- W2174554700 hasAuthorship W2174554700A5086198262 @default.
- W2174554700 hasConcept C108583219 @default.
- W2174554700 hasConcept C11413529 @default.
- W2174554700 hasConcept C115961682 @default.
- W2174554700 hasConcept C119857082 @default.
- W2174554700 hasConcept C121332964 @default.
- W2174554700 hasConcept C123657996 @default.
- W2174554700 hasConcept C134306372 @default.
- W2174554700 hasConcept C136389625 @default.
- W2174554700 hasConcept C14036430 @default.
- W2174554700 hasConcept C142362112 @default.
- W2174554700 hasConcept C153349607 @default.
- W2174554700 hasConcept C154945302 @default.
- W2174554700 hasConcept C177148314 @default.
- W2174554700 hasConcept C190470478 @default.
- W2174554700 hasConcept C190502265 @default.
- W2174554700 hasConcept C193415008 @default.
- W2174554700 hasConcept C199360897 @default.
- W2174554700 hasConcept C21308566 @default.
- W2174554700 hasConcept C24890656 @default.
- W2174554700 hasConcept C33923547 @default.
- W2174554700 hasConcept C37914503 @default.
- W2174554700 hasConcept C38652104 @default.
- W2174554700 hasConcept C41008148 @default.
- W2174554700 hasConcept C50644808 @default.
- W2174554700 hasConcept C78458016 @default.
- W2174554700 hasConcept C79678938 @default.
- W2174554700 hasConcept C80444323 @default.
- W2174554700 hasConcept C86803240 @default.
- W2174554700 hasConcept C99498987 @default.
- W2174554700 hasConceptScore W2174554700C108583219 @default.
- W2174554700 hasConceptScore W2174554700C11413529 @default.
- W2174554700 hasConceptScore W2174554700C115961682 @default.
- W2174554700 hasConceptScore W2174554700C119857082 @default.
- W2174554700 hasConceptScore W2174554700C121332964 @default.
- W2174554700 hasConceptScore W2174554700C123657996 @default.
- W2174554700 hasConceptScore W2174554700C134306372 @default.
- W2174554700 hasConceptScore W2174554700C136389625 @default.
- W2174554700 hasConceptScore W2174554700C14036430 @default.
- W2174554700 hasConceptScore W2174554700C142362112 @default.
- W2174554700 hasConceptScore W2174554700C153349607 @default.
- W2174554700 hasConceptScore W2174554700C154945302 @default.
- W2174554700 hasConceptScore W2174554700C177148314 @default.
- W2174554700 hasConceptScore W2174554700C190470478 @default.
- W2174554700 hasConceptScore W2174554700C190502265 @default.
- W2174554700 hasConceptScore W2174554700C193415008 @default.
- W2174554700 hasConceptScore W2174554700C199360897 @default.
- W2174554700 hasConceptScore W2174554700C21308566 @default.
- W2174554700 hasConceptScore W2174554700C24890656 @default.
- W2174554700 hasConceptScore W2174554700C33923547 @default.