Matches in SemOpenAlex for { <https://semopenalex.org/work/W4367671331> ?p ?o ?g. }
Showing items 1 to 57 of 57, with 100 items per page.
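The listing below is the result set of the quad pattern shown in the header. As a minimal sketch of reproducing it programmatically, the following assumes the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql and the SPARQLWrapper Python package; since every triple here sits in the default graph (the `@default.` marker), a plain `?p ?o` triple pattern suffices in place of the quad pattern `?p ?o ?g`.

```python
# Sketch: fetch all predicate/object pairs for work W4367671331 from
# SemOpenAlex. Endpoint URL is an assumption based on the documented
# public endpoint; adjust if it differs.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery("""
    SELECT ?p ?o
    WHERE { <https://semopenalex.org/work/W4367671331> ?p ?o . }
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```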
- W4367671331 endingPage "150" @default.
- W4367671331 startingPage "138" @default.
- W4367671331 abstract "In this short note, we propose a new method for quantizing the weights of a fully trained neural network. A simple deterministic pre-processing step allows us to quantize network layers via memoryless scalar quantization while preserving the network performance on given training data. On one hand, the computational complexity of this pre-processing slightly exceeds that of state-of-the-art algorithms in the literature. On the other hand, our approach does not require any hyper-parameter tuning and, in contrast to previous methods, allows a plain analysis. We provide rigorous theoretical guarantees in the case of quantizing single network layers and show that the relative error decays with the number of parameters in the network if the training data behave well, e.g., if it is sampled from suitable random distributions. The developed method also readily allows the quantization of deep networks by consecutive application to single layers." @default. (see the quantization sketch after this listing)
- W4367671331 created "2023-05-03" @default.
- W4367671331 creator A5055778695 @default.
- W4367671331 creator A5063241706 @default.
- W4367671331 date "2023-09-01" @default.
- W4367671331 modified "2023-09-27" @default.
- W4367671331 title "A simple approach for quantizing neural networks" @default.
- W4367671331 cites W2076063813 @default.
- W4367671331 cites W2919115771 @default.
- W4367671331 cites W3012561096 @default.
- W4367671331 doi "https://doi.org/10.1016/j.acha.2023.04.004" @default.
- W4367671331 hasPublicationYear "2023" @default.
- W4367671331 type Work @default.
- W4367671331 citedByCount "0" @default.
- W4367671331 crossrefType "journal-article" @default.
- W4367671331 hasAuthorship W4367671331A5055778695 @default.
- W4367671331 hasAuthorship W4367671331A5063241706 @default.
- W4367671331 hasBestOaLocation W43676713312 @default.
- W4367671331 hasConcept C111472728 @default.
- W4367671331 hasConcept C11413529 @default.
- W4367671331 hasConcept C138885662 @default.
- W4367671331 hasConcept C154945302 @default.
- W4367671331 hasConcept C2780586882 @default.
- W4367671331 hasConcept C28855332 @default.
- W4367671331 hasConcept C33923547 @default.
- W4367671331 hasConcept C41008148 @default.
- W4367671331 hasConcept C50644808 @default.
- W4367671331 hasConceptScore W4367671331C111472728 @default.
- W4367671331 hasConceptScore W4367671331C11413529 @default.
- W4367671331 hasConceptScore W4367671331C138885662 @default.
- W4367671331 hasConceptScore W4367671331C154945302 @default.
- W4367671331 hasConceptScore W4367671331C2780586882 @default.
- W4367671331 hasConceptScore W4367671331C28855332 @default.
- W4367671331 hasConceptScore W4367671331C33923547 @default.
- W4367671331 hasConceptScore W4367671331C41008148 @default.
- W4367671331 hasConceptScore W4367671331C50644808 @default.
- W4367671331 hasFunder F4320306076 @default.
- W4367671331 hasLocation W43676713311 @default.
- W4367671331 hasLocation W43676713312 @default.
- W4367671331 hasOpenAccess W4367671331 @default.
- W4367671331 hasPrimaryLocation W43676713311 @default.
- W4367671331 hasRelatedWork W1887191277 @default.
- W4367671331 hasRelatedWork W2018619927 @default.
- W4367671331 hasRelatedWork W2054072800 @default.
- W4367671331 hasRelatedWork W2334954212 @default.
- W4367671331 hasRelatedWork W2386387936 @default.
- W4367671331 hasRelatedWork W2392110728 @default.
- W4367671331 hasRelatedWork W2964285269 @default.
- W4367671331 hasRelatedWork W4235288607 @default.
- W4367671331 hasRelatedWork W4238075012 @default.
- W4367671331 hasRelatedWork W4248389398 @default.
- W4367671331 hasVolume "66" @default.
- W4367671331 isParatext "false" @default.
- W4367671331 isRetracted "false" @default.
- W4367671331 workType "article" @default.
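The abstract quoted above describes quantizing a trained layer via memoryless scalar quantization (MSQ), i.e. rounding each weight independently to the nearest point of a fixed grid, after a deterministic pre-processing step. The sketch below illustrates only the MSQ component; the function name `msq`, the bit width, the grid choice, and the toy dimensions are all illustrative assumptions, and the paper's pre-processing step (the part that yields its error guarantees) is not reproduced here.

```python
# Sketch: memoryless scalar quantization of a weight matrix. Each entry is
# rounded independently ("memoryless") to the nearest of 2**bits uniformly
# spaced grid points spanning [-max|w|, +max|w|]. Illustrative only; this
# omits the paper's deterministic pre-processing step.
import numpy as np

def msq(weights: np.ndarray, bits: int = 4) -> np.ndarray:
    levels = 2 ** bits
    scale = np.max(np.abs(weights))
    if scale == 0.0:
        return weights.copy()
    delta = 2 * scale / (levels - 1)              # grid spacing
    idx = np.round((weights + scale) / delta)     # nearest grid index, entrywise
    idx = np.clip(idx, 0, levels - 1)
    return -scale + idx * delta

# Usage: quantize one toy fully connected layer and measure the relative
# error on synthetic "training data" (the quantity the paper's guarantees
# are stated for, under well-behaved, e.g. random, data).
rng = np.random.default_rng(0)
W = rng.standard_normal((512, 256))   # toy layer, not the paper's setup
W_q = msq(W, bits=4)
X = rng.standard_normal((1000, 512))
rel_err = np.linalg.norm(X @ W - X @ W_q) / np.linalg.norm(X @ W)
print(f"relative error on data: {rel_err:.4f}")
```

Per the abstract, a deep network would be handled by applying the procedure consecutively, one layer at a time; the sketch above covers a single layer only.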