Matches in SemOpenAlex for { <https://semopenalex.org/work/W3012897490> ?p ?o ?g. }
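The triple pattern above can be evaluated programmatically against the SemOpenAlex SPARQL endpoint to reproduce the matches listed below. The sketch that follows is a minimal example, assuming the public endpoint at https://semopenalex.org/sparql accepts standard SPARQL 1.1 protocol requests and returns JSON results; the `@default` marker on each result line indicates the triples sit in the store's default graph, so the graph variable is dropped here.

```python
# Minimal sketch: fetch all predicate/object pairs for work W3012897490.
# Assumption (not stated in this listing): the SPARQL endpoint URL below.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL

QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W3012897490> ?p ?o .
}
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

# Each binding corresponds to one line of the listing below.
for binding in response.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```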
- W3012897490 abstract "Large-scale training is important to ensure high performance and accuracy of machine-learning models. At Facebook we use many different models, including computer vision, video and language models. However, in this paper we focus on the deep learning recommendation models (DLRMs), which are responsible for more than 50% of the training demand in our data centers. Recommendation models present unique challenges in training because they exercise not only compute but also memory capacity as well as memory and network bandwidth. As model size and complexity increase, efficiently scaling training becomes a challenge. To address it we design Zion - Facebook's next-generation large-memory training platform that consists of both CPUs and accelerators. Also, we discuss the design requirements of future scale-out training systems." @default.
- W3012897490 created "2020-03-27" @default.
- W3012897490 creator A5007892541 @default.
- W3012897490 creator A5010761294 @default.
- W3012897490 creator A5020495890 @default.
- W3012897490 creator A5028514073 @default.
- W3012897490 creator A5033659558 @default.
- W3012897490 creator A5034003919 @default.
- W3012897490 creator A5038603364 @default.
- W3012897490 creator A5039508118 @default.
- W3012897490 creator A5040205022 @default.
- W3012897490 creator A5048005267 @default.
- W3012897490 creator A5058119792 @default.
- W3012897490 creator A5060639309 @default.
- W3012897490 creator A5066366329 @default.
- W3012897490 creator A5069729500 @default.
- W3012897490 creator A5071392807 @default.
- W3012897490 date "2020-03-20" @default.
- W3012897490 modified "2023-10-17" @default.
- W3012897490 title "Deep Learning Training in Facebook Data Centers: Design of Scale-up and Scale-out Systems." @default.
- W3012897490 cites W1442374986 @default.
- W3012897490 cites W1501077214 @default.
- W3012897490 cites W2048266589 @default.
- W3012897490 cites W2077556206 @default.
- W3012897490 cites W2130531694 @default.
- W3012897490 cites W2155388085 @default.
- W3012897490 cites W2165021839 @default.
- W3012897490 cites W2166706236 @default.
- W3012897490 cites W2168231600 @default.
- W3012897490 cites W2186615578 @default.
- W3012897490 cites W2287011250 @default.
- W3012897490 cites W2336650964 @default.
- W3012897490 cites W2512971201 @default.
- W3012897490 cites W2550821151 @default.
- W3012897490 cites W2606722458 @default.
- W3012897490 cites W2612387305 @default.
- W3012897490 cites W2626991402 @default.
- W3012897490 cites W2787998955 @default.
- W3012897490 cites W2794670651 @default.
- W3012897490 cites W2809273748 @default.
- W3012897490 cites W2899771611 @default.
- W3012897490 cites W2901839763 @default.
- W3012897490 cites W2926767350 @default.
- W3012897490 cites W2947737663 @default.
- W3012897490 cites W2950094539 @default.
- W3012897490 cites W2951897082 @default.
- W3012897490 cites W2979244362 @default.
- W3012897490 cites W2982219368 @default.
- W3012897490 cites W2996471668 @default.
- W3012897490 cites W3004495293 @default.
- W3012897490 cites W3009421818 @default.
- W3012897490 cites W3012121966 @default.
- W3012897490 cites W3016395792 @default.
- W3012897490 cites W3016842236 @default.
- W3012897490 hasPublicationYear "2020" @default.
- W3012897490 type Work @default.
- W3012897490 sameAs 3012897490 @default.
- W3012897490 citedByCount "20" @default.
- W3012897490 countsByYear W30128974902020 @default.
- W3012897490 countsByYear W30128974902021 @default.
- W3012897490 crossrefType "posted-content" @default.
- W3012897490 hasAuthorship W3012897490A5007892541 @default.
- W3012897490 hasAuthorship W3012897490A5010761294 @default.
- W3012897490 hasAuthorship W3012897490A5020495890 @default.
- W3012897490 hasAuthorship W3012897490A5028514073 @default.
- W3012897490 hasAuthorship W3012897490A5033659558 @default.
- W3012897490 hasAuthorship W3012897490A5034003919 @default.
- W3012897490 hasAuthorship W3012897490A5038603364 @default.
- W3012897490 hasAuthorship W3012897490A5039508118 @default.
- W3012897490 hasAuthorship W3012897490A5040205022 @default.
- W3012897490 hasAuthorship W3012897490A5048005267 @default.
- W3012897490 hasAuthorship W3012897490A5058119792 @default.
- W3012897490 hasAuthorship W3012897490A5060639309 @default.
- W3012897490 hasAuthorship W3012897490A5066366329 @default.
- W3012897490 hasAuthorship W3012897490A5069729500 @default.
- W3012897490 hasAuthorship W3012897490A5071392807 @default.
- W3012897490 hasConcept C108583219 @default.
- W3012897490 hasConcept C111919701 @default.
- W3012897490 hasConcept C119857082 @default.
- W3012897490 hasConcept C120665830 @default.
- W3012897490 hasConcept C121332964 @default.
- W3012897490 hasConcept C153294291 @default.
- W3012897490 hasConcept C154945302 @default.
- W3012897490 hasConcept C188045654 @default.
- W3012897490 hasConcept C192209626 @default.
- W3012897490 hasConcept C2522767166 @default.
- W3012897490 hasConcept C2524010 @default.
- W3012897490 hasConcept C2776257435 @default.
- W3012897490 hasConcept C2777211547 @default.
- W3012897490 hasConcept C2778755073 @default.
- W3012897490 hasConcept C31258907 @default.
- W3012897490 hasConcept C33923547 @default.
- W3012897490 hasConcept C41008148 @default.
- W3012897490 hasConcept C49774154 @default.
- W3012897490 hasConcept C51632099 @default.
- W3012897490 hasConcept C62520636 @default.
- W3012897490 hasConcept C99844830 @default.
- W3012897490 hasConceptScore W3012897490C108583219 @default.
- W3012897490 hasConceptScore W3012897490C111919701 @default.
- W3012897490 hasConceptScore W3012897490C119857082 @default.