Matches in SemOpenAlex for { <https://semopenalex.org/work/W3166609963> ?p ?o ?g. }
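The pattern above is the WHERE clause of the query that produced this listing; a minimal runnable sketch, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql:

```sparql
# List every predicate and object attached to the work.
# Endpoint (assumed): https://semopenalex.org/sparql
# The "@default" marker on each result row below indicates the default
# graph, so a plain triple pattern suffices here.
SELECT ?p ?o
WHERE {
  <https://semopenalex.org/work/W3166609963> ?p ?o .
}
```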
- W3166609963 abstract "Large width limits have been a recent focus of deep learning research: modulo computational practicalities, do wider networks outperform narrower ones? Answering this question has been challenging, as conventional networks gain representational power with width, potentially masking any negative effects. Our analysis in this paper decouples capacity and width via the generalization of neural networks to Deep Gaussian Processes (Deep GP), a class of hierarchical models that subsume neural nets. In doing so, we aim to understand how width affects standard neural networks once they have sufficient capacity for a given modeling task. Our theoretical and empirical results on Deep GP suggest that large width is generally detrimental to hierarchical models. Surprisingly, we prove that even nonparametric Deep GP converge to Gaussian processes, effectively becoming shallower without any increase in representational power. The posterior, which corresponds to a mixture of data-adaptable basis functions, becomes less data-dependent with width. Our tail analysis demonstrates that width and depth have opposite effects: depth accentuates a model's non-Gaussianity, while width makes models increasingly Gaussian. We find there is a sweet spot that maximizes test set performance before the limiting GP behavior prevents adaptability, occurring at width = 1 or width = 2 for nonparametric Deep GP. These results make strong predictions about the same phenomenon in conventional neural networks: we show empirically that many neural network architectures need 10-500 hidden units for sufficient capacity, depending on the dataset, but further width degrades test performance." @default.
- W3166609963 created "2021-06-22" @default.
- W3166609963 creator A5030697996 @default.
- W3166609963 creator A5065203625 @default.
- W3166609963 date "2021-06-11" @default.
- W3166609963 modified "2023-09-27" @default.
- W3166609963 title "The Limitations of Large Width in Neural Networks: A Deep Gaussian Process Perspective" @default.
- W3166609963 cites W1172736100 @default.
- W3166609963 cites W1567012231 @default.
- W3166609963 cites W1567512734 @default.
- W3166609963 cites W1589901016 @default.
- W3166609963 cites W1663973292 @default.
- W3166609963 cites W1746819321 @default.
- W3166609963 cites W2082100828 @default.
- W3166609963 cites W2112796928 @default.
- W3166609963 cites W2144902422 @default.
- W3166609963 cites W2161388792 @default.
- W3166609963 cites W2164411961 @default.
- W3166609963 cites W2167608136 @default.
- W3166609963 cites W2186535340 @default.
- W3166609963 cites W2194775991 @default.
- W3166609963 cites W2257113116 @default.
- W3166609963 cites W2302053044 @default.
- W3166609963 cites W2390241580 @default.
- W3166609963 cites W2557283755 @default.
- W3166609963 cites W2728139190 @default.
- W3166609963 cites W2802739963 @default.
- W3166609963 cites W2804589149 @default.
- W3166609963 cites W2809090039 @default.
- W3166609963 cites W2910655610 @default.
- W3166609963 cites W2913243980 @default.
- W3166609963 cites W2943838153 @default.
- W3166609963 cites W2953263857 @default.
- W3166609963 cites W2962685794 @default.
- W3166609963 cites W2962698540 @default.
- W3166609963 cites W2962779017 @default.
- W3166609963 cites W2962875063 @default.
- W3166609963 cites W2962933129 @default.
- W3166609963 cites W2962990163 @default.
- W3166609963 cites W2963095610 @default.
- W3166609963 cites W2963097630 @default.
- W3166609963 cites W2963190151 @default.
- W3166609963 cites W2963239103 @default.
- W3166609963 cites W2963323437 @default.
- W3166609963 cites W2963417959 @default.
- W3166609963 cites W2963427613 @default.
- W3166609963 cites W2963518130 @default.
- W3166609963 cites W2963711523 @default.
- W3166609963 cites W2963935178 @default.
- W3166609963 cites W2963977107 @default.
- W3166609963 cites W2963982496 @default.
- W3166609963 cites W2964052793 @default.
- W3166609963 cites W2964059111 @default.
- W3166609963 cites W2964088238 @default.
- W3166609963 cites W2964118293 @default.
- W3166609963 cites W2964121744 @default.
- W3166609963 cites W2964137095 @default.
- W3166609963 cites W2964141597 @default.
- W3166609963 cites W2964321317 @default.
- W3166609963 cites W2970330753 @default.
- W3166609963 cites W2970332347 @default.
- W3166609963 cites W2970618525 @default.
- W3166609963 cites W2970723196 @default.
- W3166609963 cites W2970971581 @default.
- W3166609963 cites W2971043187 @default.
- W3166609963 cites W2971169274 @default.
- W3166609963 cites W2994872659 @default.
- W3166609963 cites W2995015865 @default.
- W3166609963 cites W2996603747 @default.
- W3166609963 cites W3000127803 @default.
- W3166609963 cites W3010154184 @default.
- W3166609963 cites W3031355122 @default.
- W3166609963 cites W3034560374 @default.
- W3166609963 cites W3034979923 @default.
- W3166609963 cites W3035433747 @default.
- W3166609963 cites W3035679810 @default.
- W3166609963 cites W3036525728 @default.
- W3166609963 cites W3037003270 @default.
- W3166609963 cites W3037333676 @default.
- W3166609963 cites W3038074040 @default.
- W3166609963 cites W3046749015 @default.
- W3166609963 cites W3086778125 @default.
- W3166609963 cites W3098527778 @default.
- W3166609963 cites W3101069636 @default.
- W3166609963 cites W3101581426 @default.
- W3166609963 cites W3101787089 @default.
- W3166609963 cites W3104962057 @default.
- W3166609963 cites W3118608800 @default.
- W3166609963 cites W3120740533 @default.
- W3166609963 cites W3148924112 @default.
- W3166609963 cites W3152877612 @default.
- W3166609963 cites W3157538235 @default.
- W3166609963 cites W3167104264 @default.
- W3166609963 cites W3168814424 @default.
- W3166609963 cites W3171719115 @default.
- W3166609963 cites W3172914211 @default.
- W3166609963 cites W3180313059 @default.
- W3166609963 cites W3206494906 @default.
- W3166609963 cites W3214013345 @default.
- W3166609963 hasPublicationYear "2021" @default.
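For context on the abstract's Gaussian-limit claim: the classical neural-network-as-GP limit (standard NNGP notation for fully connected networks, not this paper's Deep GP construction) says that as each layer's width grows, the prior over outputs converges to a GP whose covariance follows the fixed recursion

$$K^{(\ell+1)}(x, x') = \sigma_b^2 + \sigma_w^2\, \mathbb{E}_{f \sim \mathcal{GP}\left(0,\, K^{(\ell)}\right)}\left[\phi(f(x))\, \phi(f(x'))\right]$$

The limiting kernel is deterministic and independent of the training data, which is the loss of adaptability the abstract generalizes to nonparametric Deep GP.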
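The `cites` identifiers above can be resolved to titles with one follow-up query; a sketch, assuming the same endpoint and that the abbreviated `cites` and `title` properties in this dump map to cito:cites and dcterms:title as in the SemOpenAlex ontology:

```sparql
# Resolve each cited work to its title.
# Assumed property mappings: "cites" -> cito:cites, "title" -> dcterms:title.
PREFIX cito:    <http://purl.org/spar/cito/>
PREFIX dcterms: <http://purl.org/dc/terms/>

SELECT ?cited ?title
WHERE {
  <https://semopenalex.org/work/W3166609963> cito:cites ?cited .
  ?cited dcterms:title ?title .
}
ORDER BY ?cited
```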