Matches in SemOpenAlex for { <https://semopenalex.org/work/W4386977648> ?p ?o ?g. }
Showing items 1 to 49 of 49, with 100 items per page.
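The listing below was produced by the triple pattern in the header above. As a minimal sketch, the equivalent lookup can be run programmatically against the public SemOpenAlex SPARQL endpoint (the endpoint URL is an assumption; verify it before relying on this, and note that the `?g` graph variable from the pattern above is dropped here for simplicity):

```python
import requests

# Assumed public endpoint for SemOpenAlex; confirm before use.
ENDPOINT = "https://semopenalex.org/sparql"

# SELECT form of the triple pattern from the page header, querying
# the default graph instead of binding a named graph ?g.
QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W4386977648> ?p ?o .
}
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

# Print each predicate/object pair, mirroring the listing below.
for binding in response.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```

The 49 matching triples: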
- W4386977648 abstract "Efficient training of large-scale graph neural networks (GNNs) has been studied with a specific focus on reducing their memory consumption. Work by Liu et al. (2022) proposed extreme activation compression (EXACT) which demonstrated drastic reduction in memory consumption by performing quantization of the intermediate activation maps down to using INT2 precision. They showed little to no reduction in performance while achieving large reductions in GPU memory consumption. In this work, we present an improvement to the EXACT strategy by using block-wise quantization of the intermediate activation maps. We experimentally analyze different block sizes and show further reduction in memory consumption (>15%), and runtime speedup per epoch (about 5%) even when performing extreme extents of quantization with similar performance trade-offs as with the original EXACT. Further, we present a correction to the assumptions on the distribution of intermediate activation maps in EXACT (assumed to be uniform) and show improved variance estimations of the quantization and dequantization steps." @default.
- W4386977648 created "2023-09-23" @default.
- W4386977648 creator A5063821969 @default.
- W4386977648 creator A5092926018 @default.
- W4386977648 date "2023-09-21" @default.
- W4386977648 modified "2023-10-18" @default.
- W4386977648 title "Activation Compression of Graph Neural Networks using Block-wise Quantization with Improved Variance Minimization" @default.
- W4386977648 doi "https://doi.org/10.48550/arxiv.2309.11856" @default.
- W4386977648 hasPublicationYear "2023" @default.
- W4386977648 type Work @default.
- W4386977648 citedByCount "0" @default.
- W4386977648 crossrefType "posted-content" @default.
- W4386977648 hasAuthorship W4386977648A5063821969 @default.
- W4386977648 hasAuthorship W4386977648A5092926018 @default.
- W4386977648 hasBestOaLocation W43869776481 @default.
- W4386977648 hasConcept C11413529 @default.
- W4386977648 hasConcept C132525143 @default.
- W4386977648 hasConcept C147764199 @default.
- W4386977648 hasConcept C173608175 @default.
- W4386977648 hasConcept C199360897 @default.
- W4386977648 hasConcept C28855332 @default.
- W4386977648 hasConcept C41008148 @default.
- W4386977648 hasConcept C68339613 @default.
- W4386977648 hasConcept C80444323 @default.
- W4386977648 hasConceptScore W4386977648C11413529 @default.
- W4386977648 hasConceptScore W4386977648C132525143 @default.
- W4386977648 hasConceptScore W4386977648C147764199 @default.
- W4386977648 hasConceptScore W4386977648C173608175 @default.
- W4386977648 hasConceptScore W4386977648C199360897 @default.
- W4386977648 hasConceptScore W4386977648C28855332 @default.
- W4386977648 hasConceptScore W4386977648C41008148 @default.
- W4386977648 hasConceptScore W4386977648C68339613 @default.
- W4386977648 hasConceptScore W4386977648C80444323 @default.
- W4386977648 hasLocation W43869776481 @default.
- W4386977648 hasOpenAccess W4386977648 @default.
- W4386977648 hasPrimaryLocation W43869776481 @default.
- W4386977648 hasRelatedWork W1509211761 @default.
- W4386977648 hasRelatedWork W1531488649 @default.
- W4386977648 hasRelatedWork W1583465708 @default.
- W4386977648 hasRelatedWork W1585350690 @default.
- W4386977648 hasRelatedWork W2133693067 @default.
- W4386977648 hasRelatedWork W2366027386 @default.
- W4386977648 hasRelatedWork W2391299576 @default.
- W4386977648 hasRelatedWork W2582456645 @default.
- W4386977648 hasRelatedWork W3037767301 @default.
- W4386977648 hasRelatedWork W2479014312 @default.
- W4386977648 isParatext "false" @default.
- W4386977648 isRetracted "false" @default.
- W4386977648 workType "article" @default.
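The abstract above describes block-wise INT2 quantization of intermediate activation maps. The sketch below illustrates the general technique only, not the authors' EXACT implementation: the function names, the block size of 64, and the use of deterministic rounding are illustrative choices (an activation-compression pipeline like the one described would typically use stochastic rounding to keep gradient estimates unbiased).

```python
import numpy as np

def blockwise_quantize(x, block_size=64, bits=2):
    """Quantize a flat activation map in fixed-size blocks.

    Each block keeps its own (min, scale) pair, so an outlier in one
    block does not inflate the quantization error of the others.
    """
    levels = 2 ** bits - 1                    # 3 steps between codes for INT2
    flat = x.ravel()
    pad = (-flat.size) % block_size           # pad so blocks divide evenly
    flat = np.pad(flat, (0, pad))
    blocks = flat.reshape(-1, block_size)

    bmin = blocks.min(axis=1, keepdims=True)
    scale = (blocks.max(axis=1, keepdims=True) - bmin) / levels
    scale = np.where(scale == 0, 1.0, scale)  # constant block: avoid divide-by-zero
    q = np.rint((blocks - bmin) / scale).astype(np.uint8)
    return q, bmin, scale, x.shape, pad

def blockwise_dequantize(q, bmin, scale, shape, pad):
    """Reconstruct approximate activations from per-block integer codes."""
    flat = (q.astype(np.float32) * scale + bmin).ravel()
    if pad:
        flat = flat[:-pad]
    return flat.reshape(shape)

# Usage: compress a toy activation map and measure reconstruction error.
x = np.random.randn(4, 128).astype(np.float32)
q, bmin, scale, shape, pad = blockwise_quantize(x, block_size=64, bits=2)
x_hat = blockwise_dequantize(q, bmin, scale, shape, pad)
print("max abs error:", np.abs(x - x_hat).max())
```

On the variance point in the abstract: for a quantization step of size s, deterministic rounding noise has the classic variance s²/12 (uniform error assumption), while unbiased stochastic rounding averages s²/6; the paper's stated contribution is an improved variance estimate for the quantization and dequantization steps once the uniform-distribution assumption on the activations is corrected.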