Matches in SemOpenAlex for { <https://semopenalex.org/work/W2951232050> ?p ?o ?g. }
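The header above shows the basic graph pattern used to retrieve every predicate/object pair for this work. A minimal sketch of how the full SPARQL query might be assembled (the `GRAPH ?g` wrapper is an assumption inferred from the fourth variable `?g`; the query is built as a string only, no endpoint is contacted):

```python
# Hypothetical reconstruction of the query behind the result listing below.
# The GRAPH clause is an assumption: ?g in the pattern suggests quad storage.
work_uri = "https://semopenalex.org/work/W2951232050"

query = f"""
SELECT ?p ?o ?g WHERE {{
  GRAPH ?g {{ <{work_uri}> ?p ?o . }}
}}
"""

print(query)
```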
- W2951232050 abstract "In distributed statistical learning, $N$ samples are split across $m$ machines and a learner wishes to use minimal communication to learn as well as if the examples were on a single machine. This model has received substantial interest in machine learning due to its scalability and potential for parallel speedup. However, in high-dimensional settings, where the number of examples is smaller than the number of features (dimension), the speedup afforded by distributed learning may be overshadowed by the cost of communicating a single example. This paper investigates the following question: When is it possible to learn a $d$-dimensional model in the distributed setting with total communication sublinear in $d$? Starting with a negative result, we show that for learning $\ell_1$-bounded or sparse linear models, no algorithm can obtain optimal error until communication is linear in dimension. Our main result is that by slightly relaxing the standard boundedness assumptions for linear models, we can obtain distributed algorithms that enjoy optimal error with communication logarithmic in dimension. This result is based on a family of algorithms that combine mirror descent with randomized sparsification/quantization of iterates, and extends to the general stochastic convex optimization model." @default.
- W2951232050 created "2019-06-27" @default.
- W2951232050 creator A5022867638 @default.
- W2951232050 creator A5041869459 @default.
- W2951232050 creator A5075069827 @default.
- W2951232050 creator A5084265089 @default.
- W2951232050 date "2019-02-28" @default.
- W2951232050 modified "2023-09-27" @default.
- W2951232050 title "Distributed Learning with Sublinear Communication" @default.
- W2951232050 cites W1568307856 @default.
- W2951232050 cites W1570963478 @default.
- W2951232050 cites W1575244755 @default.
- W2951232050 cites W1626317705 @default.
- W2951232050 cites W1870267105 @default.
- W2951232050 cites W1963961606 @default.
- W2951232050 cites W1980404857 @default.
- W2951232050 cites W1994520254 @default.
- W2951232050 cites W2009537245 @default.
- W2951232050 cites W2044828368 @default.
- W2951232050 cites W2111377143 @default.
- W2951232050 cites W2114773674 @default.
- W2951232050 cites W2130062883 @default.
- W2951232050 cites W2138243089 @default.
- W2951232050 cites W2148825261 @default.
- W2951232050 cites W2160354932 @default.
- W2951232050 cites W2166116275 @default.
- W2951232050 cites W2170912114 @default.
- W2951232050 cites W2172272342 @default.
- W2951232050 cites W2188647300 @default.
- W2951232050 cites W2194775991 @default.
- W2951232050 cites W2301987905 @default.
- W2951232050 cites W2407022425 @default.
- W2951232050 cites W2513180554 @default.
- W2951232050 cites W2606082375 @default.
- W2951232050 cites W2741269719 @default.
- W2951232050 cites W2796494761 @default.
- W2951232050 cites W2886974017 @default.
- W2951232050 cites W2889676205 @default.
- W2951232050 cites W2890924858 @default.
- W2951232050 cites W2903586563 @default.
- W2951232050 cites W2907436114 @default.
- W2951232050 cites W2916782324 @default.
- W2951232050 cites W2951781666 @default.
- W2951232050 cites W2952859719 @default.
- W2951232050 cites W2962929471 @default.
- W2951232050 cites W2963422939 @default.
- W2951232050 cites W2963664311 @default.
- W2951232050 cites W2963766684 @default.
- W2951232050 cites W2964181194 @default.
- W2951232050 cites W2964346891 @default.
- W2951232050 cites W3013820469 @default.
- W2951232050 cites W3215641518 @default.
- W2951232050 cites W607505555 @default.
- W2951232050 hasPublicationYear "2019" @default.
- W2951232050 type Work @default.
- W2951232050 sameAs 2951232050 @default.
- W2951232050 citedByCount "3" @default.
- W2951232050 countsByYear W29512320502019 @default.
- W2951232050 countsByYear W29512320502020 @default.
- W2951232050 crossrefType "posted-content" @default.
- W2951232050 hasAuthorship W2951232050A5022867638 @default.
- W2951232050 hasAuthorship W2951232050A5041869459 @default.
- W2951232050 hasAuthorship W2951232050A5075069827 @default.
- W2951232050 hasAuthorship W2951232050A5084265089 @default.
- W2951232050 hasConcept C11413529 @default.
- W2951232050 hasConcept C114614502 @default.
- W2951232050 hasConcept C117160843 @default.
- W2951232050 hasConcept C118615104 @default.
- W2951232050 hasConcept C120314980 @default.
- W2951232050 hasConcept C126255220 @default.
- W2951232050 hasConcept C130120984 @default.
- W2951232050 hasConcept C134306372 @default.
- W2951232050 hasConcept C140479938 @default.
- W2951232050 hasConcept C173608175 @default.
- W2951232050 hasConcept C28855332 @default.
- W2951232050 hasConcept C33676613 @default.
- W2951232050 hasConcept C33923547 @default.
- W2951232050 hasConcept C34388435 @default.
- W2951232050 hasConcept C39927690 @default.
- W2951232050 hasConcept C41008148 @default.
- W2951232050 hasConcept C48044578 @default.
- W2951232050 hasConcept C68339613 @default.
- W2951232050 hasConcept C77088390 @default.
- W2951232050 hasConceptScore W2951232050C11413529 @default.
- W2951232050 hasConceptScore W2951232050C114614502 @default.
- W2951232050 hasConceptScore W2951232050C117160843 @default.
- W2951232050 hasConceptScore W2951232050C118615104 @default.
- W2951232050 hasConceptScore W2951232050C120314980 @default.
- W2951232050 hasConceptScore W2951232050C126255220 @default.
- W2951232050 hasConceptScore W2951232050C130120984 @default.
- W2951232050 hasConceptScore W2951232050C134306372 @default.
- W2951232050 hasConceptScore W2951232050C140479938 @default.
- W2951232050 hasConceptScore W2951232050C173608175 @default.
- W2951232050 hasConceptScore W2951232050C28855332 @default.
- W2951232050 hasConceptScore W2951232050C33676613 @default.
- W2951232050 hasConceptScore W2951232050C33923547 @default.
- W2951232050 hasConceptScore W2951232050C34388435 @default.
- W2951232050 hasConceptScore W2951232050C39927690 @default.
- W2951232050 hasConceptScore W2951232050C41008148 @default.
- W2951232050 hasConceptScore W2951232050C48044578 @default.
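Each line in the listing above follows the pattern `- SUBJECT predicate object @graph.`. A small sketch of a parser that groups the objects by predicate, so that e.g. all `cites` identifiers or the `title` literal can be pulled out of such a dump (the function name and regular expression are illustrative, not part of any SemOpenAlex tooling):

```python
import re
from collections import defaultdict

def parse_dump(text):
    """Group objects by predicate from lines shaped like
    '- SUBJECT predicate object @graph.' (quoted literals allowed)."""
    triples = defaultdict(list)
    # Object is either a quoted literal or a bare token; graph name follows '@'.
    pat = re.compile(r'^- (\S+) (\S+) (".*?"|\S+) @(\w+)\.$')
    for line in text.splitlines():
        m = pat.match(line.strip())
        if m:
            subject, predicate, obj, graph = m.groups()
            triples[predicate].append(obj.strip('"'))
    return dict(triples)

sample = (
    '- W1 cites W2 @default.\n'
    '- W1 cites W3 @default.\n'
    '- W1 title "Foo Bar" @default.'
)
parsed = parse_dump(sample)
# parsed["cites"] collects both citation ids; parsed["title"] drops the quotes.
```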