Matches in SemOpenAlex for { <https://semopenalex.org/work/W2095705004> ?p ?o ?g. }
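The quad pattern above can be issued against SemOpenAlex's public SPARQL endpoint. A minimal sketch, assuming the endpoint lives at `https://semopenalex.org/sparql` and rewriting the `?g` quad position as a standard-SPARQL `GRAPH` clause (both are assumptions, not part of the listing below):

```python
# Sketch: build a GET request URL for the query pattern shown above.
# The endpoint URL, the GRAPH rewriting, and the JSON format parameter
# are assumptions; consult the SemOpenAlex documentation before relying
# on them.
import urllib.parse

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint

def build_query(work_iri: str) -> str:
    # The listing's "?p ?o ?g." is quad syntax; in standard SPARQL the
    # graph variable is bound with a GRAPH clause instead.
    return f"SELECT ?p ?o ?g WHERE {{ GRAPH ?g {{ <{work_iri}> ?p ?o . }} }}"

def request_url(work_iri: str) -> str:
    # Encode the query for a SPARQL-Protocol GET request.
    params = urllib.parse.urlencode(
        {"query": build_query(work_iri), "format": "json"}
    )
    return f"{ENDPOINT}?{params}"
```

Fetching `request_url("https://semopenalex.org/work/W2095705004")` with any HTTP client would then return rows like the ones listed below.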
- W2095705004 endingPage "1958" @default.
- W2095705004 startingPage "1929" @default.
- W2095705004 abstract "Deep neural nets with a large number of parameters are very powerful machine learning systems. However, overfitting is a serious problem in such networks. Large networks are also slow to use, making it difficult to deal with overfitting by combining the predictions of many different large neural nets at test time. Dropout is a technique for addressing this problem. The key idea is to randomly drop units (along with their connections) from the neural network during training. This prevents units from co-adapting too much. During training, dropout samples from an exponential number of different networks. At test time, it is easy to approximate the effect of averaging the predictions of all these thinned networks by simply using a single unthinned network that has smaller weights. This significantly reduces overfitting and gives major improvements over other regularization methods. We show that dropout improves the performance of neural networks on supervised learning tasks in vision, speech recognition, document classification and computational biology, obtaining state-of-the-art results on many benchmark data sets." @default.
- W2095705004 created "2016-06-24" @default.
- W2095705004 creator A5006446297 @default.
- W2095705004 creator A5024209719 @default.
- W2095705004 creator A5031152245 @default.
- W2095705004 creator A5057849732 @default.
- W2095705004 creator A5071983998 @default.
- W2095705004 date "2014-01-01" @default.
- W2095705004 modified "2023-10-11" @default.
- W2095705004 title "Dropout: a simple way to prevent neural networks from overfitting" @default.
- W2095705004 cites W137106866 @default.
- W2095705004 cites W1492459858 @default.
- W2095705004 cites W1524333225 @default.
- W2095705004 cites W1567512734 @default.
- W2095705004 cites W189596042 @default.
- W2095705004 cites W1993882792 @default.
- W2095705004 cites W2025768430 @default.
- W2095705004 cites W2053229256 @default.
- W2095705004 cites W2085040216 @default.
- W2095705004 cites W2096873754 @default.
- W2095705004 cites W2100495367 @default.
- W2095705004 cites W2103359087 @default.
- W2095705004 cites W2114296159 @default.
- W2095705004 cites W2114733238 @default.
- W2095705004 cites W2131241448 @default.
- W2095705004 cites W2135046866 @default.
- W2095705004 cites W2136922672 @default.
- W2095705004 cites W2145094598 @default.
- W2095705004 cites W2147800946 @default.
- W2095705004 cites W2150717117 @default.
- W2095705004 cites W2152722485 @default.
- W2095705004 cites W2156163116 @default.
- W2095705004 cites W2156297475 @default.
- W2095705004 cites W2158542502 @default.
- W2095705004 cites W2163605009 @default.
- W2095705004 cites W2183112036 @default.
- W2095705004 cites W2294059674 @default.
- W2095705004 cites W2335728318 @default.
- W2095705004 cites W2546302380 @default.
- W2095705004 cites W2611675901 @default.
- W2095705004 cites W2949821452 @default.
- W2095705004 cites W2962820688 @default.
- W2095705004 cites W2963574257 @default.
- W2095705004 cites W2971788173 @default.
- W2095705004 cites W3118608800 @default.
- W2095705004 cites W35527955 @default.
- W2095705004 hasPublicationYear "2014" @default.
- W2095705004 type Work @default.
- W2095705004 sameAs 2095705004 @default.
- W2095705004 citedByCount "8431" @default.
- W2095705004 countsByYear W20957050042014 @default.
- W2095705004 countsByYear W20957050042015 @default.
- W2095705004 countsByYear W20957050042016 @default.
- W2095705004 countsByYear W20957050042017 @default.
- W2095705004 countsByYear W20957050042018 @default.
- W2095705004 countsByYear W20957050042019 @default.
- W2095705004 countsByYear W20957050042020 @default.
- W2095705004 countsByYear W20957050042021 @default.
- W2095705004 countsByYear W20957050042022 @default.
- W2095705004 countsByYear W20957050042023 @default.
- W2095705004 crossrefType "journal-article" @default.
- W2095705004 hasAuthorship W2095705004A5006446297 @default.
- W2095705004 hasAuthorship W2095705004A5024209719 @default.
- W2095705004 hasAuthorship W2095705004A5031152245 @default.
- W2095705004 hasAuthorship W2095705004A5057849732 @default.
- W2095705004 hasAuthorship W2095705004A5071983998 @default.
- W2095705004 hasConcept C119857082 @default.
- W2095705004 hasConcept C13280743 @default.
- W2095705004 hasConcept C154945302 @default.
- W2095705004 hasConcept C185798385 @default.
- W2095705004 hasConcept C205649164 @default.
- W2095705004 hasConcept C22019652 @default.
- W2095705004 hasConcept C2776135515 @default.
- W2095705004 hasConcept C2776145597 @default.
- W2095705004 hasConcept C2984842247 @default.
- W2095705004 hasConcept C41008148 @default.
- W2095705004 hasConcept C50644808 @default.
- W2095705004 hasConceptScore W2095705004C119857082 @default.
- W2095705004 hasConceptScore W2095705004C13280743 @default.
- W2095705004 hasConceptScore W2095705004C154945302 @default.
- W2095705004 hasConceptScore W2095705004C185798385 @default.
- W2095705004 hasConceptScore W2095705004C205649164 @default.
- W2095705004 hasConceptScore W2095705004C22019652 @default.
- W2095705004 hasConceptScore W2095705004C2776135515 @default.
- W2095705004 hasConceptScore W2095705004C2776145597 @default.
- W2095705004 hasConceptScore W2095705004C2984842247 @default.
- W2095705004 hasConceptScore W2095705004C41008148 @default.
- W2095705004 hasConceptScore W2095705004C50644808 @default.
- W2095705004 hasIssue "1" @default.
- W2095705004 hasLocation W20957050041 @default.
- W2095705004 hasOpenAccess W2095705004 @default.
- W2095705004 hasPrimaryLocation W20957050041 @default.
- W2095705004 hasRelatedWork W1533861849 @default.
- W2095705004 hasRelatedWork W1665214252 @default.
- W2095705004 hasRelatedWork W1677182931 @default.
- W2095705004 hasRelatedWork W1686810756 @default.
- W2095705004 hasRelatedWork W1901129140 @default.
- W2095705004 hasRelatedWork W2064675550 @default.
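The abstract stored above describes the mechanics of dropout: during training each unit is dropped independently with probability p, and at test time the single unthinned network is used with weights scaled down so that the expected contribution of each unit matches training. A minimal illustrative sketch of those two steps (the function names and list-of-floats representation are this sketch's own, not the paper's):

```python
# Toy sketch of the two dropout phases described in the abstract.
# Real implementations operate on tensors inside a network's layers;
# plain lists are used here only to keep the example self-contained.
import random

def dropout_train(units, p, rnd=random):
    """Training phase: zero each unit independently with probability p,
    sampling one 'thinned' network per call."""
    return [0.0 if rnd.random() < p else u for u in units]

def dropout_test(weights, p):
    """Test phase: keep every unit, but scale weights by the retention
    probability (1 - p), approximating the average prediction of the
    exponentially many thinned networks."""
    return [w * (1.0 - p) for w in weights]
```

With p = 0 no unit is ever dropped and the scaling is a no-op; with p close to 1 almost every unit is zeroed during training, which is why the test-time weights must shrink accordingly.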