Matches in SemOpenAlex for { <https://semopenalex.org/work/W3088396174> ?p ?o ?g. }
- W3088396174 abstract "We propose an Anderson Acceleration (AA) scheme for the adaptive Expectation-Maximization (EM) algorithm for unsupervised learning of a finite mixture model from multivariate data (Figueiredo and Jain 2002). The proposed algorithm is able to determine the optimal number of mixture components autonomously, and converges to the optimal solution much faster than its non-accelerated version. The success of the AA-based algorithm stems from several developments rather than a single breakthrough (and without these, our tests demonstrate that AA fails catastrophically). To begin, we ensure the monotonicity of the likelihood function (a key feature of the standard EM algorithm) with a recently proposed monotonicity-control algorithm (Henderson and Varadhan 2019), enhanced by a novel monotonicity test with little overhead. We propose nimble strategies for AA to preserve the positive definiteness of the Gaussian weights and covariance matrices strictly, and to conserve up to the second moments of the observed data set exactly. Finally, we employ a K-means clustering algorithm using the gap statistic to avoid excessively overestimating the initial number of components, thereby maximizing performance. We demonstrate the accuracy and efficiency of the algorithm with several synthetic data sets that are mixtures of Gaussian distributions with a known number of components, as well as data sets generated from particle-in-cell simulations. Our numerical results demonstrate speed-ups with respect to non-accelerated EM of up to 60X when the exact number of mixture components is known, and between a few and more than an order of magnitude with component adaptivity." @default.
- W3088396174 created "2020-10-01" @default.
- W3088396174 creator A5022993159 @default.
- W3088396174 creator A5064653269 @default.
- W3088396174 creator A5077485575 @default.
- W3088396174 date "2020-09-26" @default.
- W3088396174 modified "2023-10-11" @default.
- W3088396174 title "An Adaptive EM Accelerator for Unsupervised Learning of Gaussian Mixture Models." @default.
- W3088396174 cites W1503398984 @default.
- W3088396174 cites W1506806321 @default.
- W3088396174 cites W1524403460 @default.
- W3088396174 cites W1534506107 @default.
- W3088396174 cites W1548390947 @default.
- W3088396174 cites W1559947614 @default.
- W3088396174 cites W1579271636 @default.
- W3088396174 cites W1732489270 @default.
- W3088396174 cites W1824865661 @default.
- W3088396174 cites W194242946 @default.
- W3088396174 cites W1952261593 @default.
- W3088396174 cites W1953544038 @default.
- W3088396174 cites W1966801132 @default.
- W3088396174 cites W1967639437 @default.
- W3088396174 cites W1973217014 @default.
- W3088396174 cites W1981367467 @default.
- W3088396174 cites W1987971958 @default.
- W3088396174 cites W1995643764 @default.
- W3088396174 cites W2008049885 @default.
- W3088396174 cites W2015245929 @default.
- W3088396174 cites W2016789247 @default.
- W3088396174 cites W2021510280 @default.
- W3088396174 cites W2024476015 @default.
- W3088396174 cites W2029761439 @default.
- W3088396174 cites W2038281434 @default.
- W3088396174 cites W2041823554 @default.
- W3088396174 cites W2042154155 @default.
- W3088396174 cites W2046220244 @default.
- W3088396174 cites W2049633694 @default.
- W3088396174 cites W2060907140 @default.
- W3088396174 cites W2067773203 @default.
- W3088396174 cites W2071949631 @default.
- W3088396174 cites W2073459066 @default.
- W3088396174 cites W2074494639 @default.
- W3088396174 cites W2076007362 @default.
- W3088396174 cites W2091276705 @default.
- W3088396174 cites W2100514507 @default.
- W3088396174 cites W2117853077 @default.
- W3088396174 cites W2118254160 @default.
- W3088396174 cites W2118570622 @default.
- W3088396174 cites W2139575253 @default.
- W3088396174 cites W2142053777 @default.
- W3088396174 cites W2154185604 @default.
- W3088396174 cites W2155042384 @default.
- W3088396174 cites W2161623414 @default.
- W3088396174 cites W2166698530 @default.
- W3088396174 cites W21730848 @default.
- W3088396174 cites W2400985439 @default.
- W3088396174 cites W2414947611 @default.
- W3088396174 cites W2486584608 @default.
- W3088396174 cites W2524721548 @default.
- W3088396174 cites W2595142274 @default.
- W3088396174 cites W2702148523 @default.
- W3088396174 cites W2781183410 @default.
- W3088396174 cites W2803949462 @default.
- W3088396174 cites W2807398400 @default.
- W3088396174 cites W2910769969 @default.
- W3088396174 cites W2943526081 @default.
- W3088396174 cites W2952558372 @default.
- W3088396174 cites W2962938426 @default.
- W3088396174 cites W2990303212 @default.
- W3088396174 cites W2992724381 @default.
- W3088396174 cites W2999406207 @default.
- W3088396174 cites W3000878306 @default.
- W3088396174 cites W3013713234 @default.
- W3088396174 cites W3045162103 @default.
- W3088396174 cites W3140968660 @default.
- W3088396174 cites W61015563 @default.
- W3088396174 cites W794304800 @default.
- W3088396174 hasPublicationYear "2020" @default.
- W3088396174 type Work @default.
- W3088396174 sameAs 3088396174 @default.
- W3088396174 citedByCount "0" @default.
- W3088396174 crossrefType "posted-content" @default.
- W3088396174 hasAuthorship W3088396174A5022993159 @default.
- W3088396174 hasAuthorship W3088396174A5064653269 @default.
- W3088396174 hasAuthorship W3088396174A5077485575 @default.
- W3088396174 hasConcept C105795698 @default.
- W3088396174 hasConcept C11413529 @default.
- W3088396174 hasConcept C121332964 @default.
- W3088396174 hasConcept C126255220 @default.
- W3088396174 hasConcept C134306372 @default.
- W3088396174 hasConcept C14036430 @default.
- W3088396174 hasConcept C154945302 @default.
- W3088396174 hasConcept C163716315 @default.
- W3088396174 hasConcept C178650346 @default.
- W3088396174 hasConcept C182081679 @default.
- W3088396174 hasConcept C33923547 @default.
- W3088396174 hasConcept C41008148 @default.
- W3088396174 hasConcept C49781872 @default.
- W3088396174 hasConcept C61224824 @default.
- W3088396174 hasConcept C62520636 @default.
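The abstract's core technique, Anderson Acceleration of a fixed-point iteration (of which the EM update is an instance), can be sketched generically. The function name, the window size m=5, and the test map below are illustrative assumptions, not the paper's implementation, which additionally enforces likelihood monotonicity, positive definiteness of the covariances, and moment conservation.

```python
import numpy as np

def anderson_accelerate(g, x0, m=5, max_iter=100, tol=1e-10):
    """Anderson Acceleration for the fixed-point iteration x_{k+1} = g(x_k).

    Keeps a window of the last m residuals f_k = g(x_k) - x_k and forms an
    affine combination of previous g-evaluations that minimizes the combined
    residual in the least-squares sense, extrapolating toward the fixed point.
    This is a generic sketch; it omits the monotonicity control and
    constraint-preserving safeguards the paper describes.
    """
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    xs = [x]                       # iterate history
    gs = [np.atleast_1d(g(x))]     # g(x) history
    for k in range(max_iter):
        fs = [gk - xk for gk, xk in zip(gs, xs)]   # residuals f_i = g(x_i) - x_i
        mk = min(m, len(fs) - 1)
        if mk == 0:
            x_new = gs[-1]         # first step: plain fixed-point update
        else:
            # Weights alpha on the stored iterates sum to 1; parameterizing
            # alpha via gamma reduces the constrained problem to an
            # unconstrained least squares over residual differences.
            F = np.column_stack([fs[-1] - fs[-2 - j] for j in range(mk)])
            dG = np.column_stack([gs[-1] - gs[-2 - j] for j in range(mk)])
            gamma, *_ = np.linalg.lstsq(F, fs[-1], rcond=None)
            x_new = gs[-1] - dG @ gamma
        if np.linalg.norm(x_new - xs[-1]) < tol:
            return x_new, k + 1
        xs.append(x_new)
        gs.append(np.atleast_1d(g(x_new)))
        xs, gs = xs[-(m + 1):], gs[-(m + 1):]      # keep only the window
    return xs[-1], max_iter

# Example: accelerate the classic contraction x -> cos(x), whose plain
# fixed-point iteration converges only linearly.
x_star, iters = anderson_accelerate(np.cos, 1.0)
```

In the EM setting, `g` would be one full E-step/M-step update of the packed mixture parameters; the speed-ups the abstract reports come from applying this extrapolation to that map while keeping the iterates feasible.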