Matches in SemOpenAlex for { <https://semopenalex.org/work/W200068048> ?p ?o ?g. }
Showing items 1 to 76 of 76, with 100 items per page.
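For readers who want to reproduce this listing programmatically, the triples below can be retrieved with the same basic graph pattern shown in the header. Here is a minimal Python sketch using SPARQLWrapper; it assumes the public SemOpenAlex SPARQL endpoint lives at https://semopenalex.org/sparql (an assumption to verify against the SemOpenAlex documentation), and it drops the `?g` graph variable from the pattern above for simplicity.

```python
# Minimal sketch: fetch all (predicate, object) pairs for work W200068048
# from SemOpenAlex. The endpoint URL below is an assumption; check the
# SemOpenAlex documentation before relying on it.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL

query = """
SELECT ?p ?o
WHERE {
  <https://semopenalex.org/work/W200068048> ?p ?o .
}
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(query)
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```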
- W200068048 abstract "Deep Belief Networks (DBNs) are state-of-the-art machine learning techniques and among the most important unsupervised learning algorithms. Training DBNs is computationally intensive, which naturally motivates investigating FPGA acceleration. Fixed-point arithmetic can reduce execution time when implementing DBNs on FPGAs, but its implications for accuracy are unclear. Previous studies have focused only on accelerators using a few fixed bit-widths. A contribution of this paper is a comprehensive experimental evaluation of the effect of bit-width on various DBN configurations. Our work builds on the original DBN, which stacks Restricted Boltzmann Machines (RBMs), and on the idea of the Stacked Denoising Auto-Encoder (SDAE). We converted the floating-point versions of the original DBN and the denoising DBN (dDN) into fixed-point versions and compared their performance. Distinct performance change points appear as the bit-width varies, and different DBN configurations exhibit different change points. The performance variation of three-layer DBNs is slightly larger than that of one-layer DBNs because deeper DBNs are more sensitive to quantization. Sigmoid approximation methods must be used when implementing DBNs on FPGAs; the impact of Piecewise Linear Approximation (PLA) of the nonlinearity at two different precisions is evaluated quantitatively in our experiments. Modern FPGAs supply built-in primitives for the matrix operations (multiplications, accumulations, and additions) that dominate DBN computation. We propose a mixed bit-width DBN in which a narrower bit-width is used for neural units and a wider one for weights, matching the bit-widths of FPGA primitives while achieving performance similar to the software implementation. Our results provide a guide to bit-width design choices when implementing DBNs on FPGAs, clearly documenting the accuracy trade-off." @default. (A hedged code sketch of the fixed-point quantization and PLA sigmoid ideas follows the list below.)
- W200068048 created "2016-06-24" @default.
- W200068048 creator A5026069638 @default.
- W200068048 creator A5042948288 @default.
- W200068048 creator A5051680867 @default.
- W200068048 creator A5069310600 @default.
- W200068048 date "2013-01-01" @default.
- W200068048 modified "2023-09-30" @default.
- W200068048 title "Empirical Evaluation of Fixed-Point Arithmetic for Deep Belief Networks" @default.
- W200068048 doi "https://doi.org/10.1007/978-3-642-36812-7_28" @default.
- W200068048 hasPublicationYear "2013" @default.
- W200068048 type Work @default.
- W200068048 sameAs 200068048 @default.
- W200068048 citedByCount "0" @default.
- W200068048 crossrefType "book-chapter" @default.
- W200068048 hasAuthorship W200068048A5026069638 @default.
- W200068048 hasAuthorship W200068048A5042948288 @default.
- W200068048 hasAuthorship W200068048A5051680867 @default.
- W200068048 hasAuthorship W200068048A5069310600 @default.
- W200068048 hasConcept C108583219 @default.
- W200068048 hasConcept C11413529 @default.
- W200068048 hasConcept C134306372 @default.
- W200068048 hasConcept C154945302 @default.
- W200068048 hasConcept C163973906 @default.
- W200068048 hasConcept C192576344 @default.
- W200068048 hasConcept C199354608 @default.
- W200068048 hasConcept C33923547 @default.
- W200068048 hasConcept C41008148 @default.
- W200068048 hasConcept C42935608 @default.
- W200068048 hasConcept C50644808 @default.
- W200068048 hasConcept C61445026 @default.
- W200068048 hasConcept C84211073 @default.
- W200068048 hasConcept C9390403 @default.
- W200068048 hasConcept C97385483 @default.
- W200068048 hasConceptScore W200068048C108583219 @default.
- W200068048 hasConceptScore W200068048C11413529 @default.
- W200068048 hasConceptScore W200068048C134306372 @default.
- W200068048 hasConceptScore W200068048C154945302 @default.
- W200068048 hasConceptScore W200068048C163973906 @default.
- W200068048 hasConceptScore W200068048C192576344 @default.
- W200068048 hasConceptScore W200068048C199354608 @default.
- W200068048 hasConceptScore W200068048C33923547 @default.
- W200068048 hasConceptScore W200068048C41008148 @default.
- W200068048 hasConceptScore W200068048C42935608 @default.
- W200068048 hasConceptScore W200068048C50644808 @default.
- W200068048 hasConceptScore W200068048C61445026 @default.
- W200068048 hasConceptScore W200068048C84211073 @default.
- W200068048 hasConceptScore W200068048C9390403 @default.
- W200068048 hasConceptScore W200068048C97385483 @default.
- W200068048 hasLocation W2000680481 @default.
- W200068048 hasOpenAccess W200068048 @default.
- W200068048 hasPrimaryLocation W2000680481 @default.
- W200068048 hasRelatedWork W2766040906 @default.
- W200068048 hasRelatedWork W2766689678 @default.
- W200068048 hasRelatedWork W2787513570 @default.
- W200068048 hasRelatedWork W2798751785 @default.
- W200068048 hasRelatedWork W2799238091 @default.
- W200068048 hasRelatedWork W2870050732 @default.
- W200068048 hasRelatedWork W2897159959 @default.
- W200068048 hasRelatedWork W2903260438 @default.
- W200068048 hasRelatedWork W2977673565 @default.
- W200068048 hasRelatedWork W2999599167 @default.
- W200068048 hasRelatedWork W3000556572 @default.
- W200068048 hasRelatedWork W3023943564 @default.
- W200068048 hasRelatedWork W3033572831 @default.
- W200068048 hasRelatedWork W3035718760 @default.
- W200068048 hasRelatedWork W3038013679 @default.
- W200068048 hasRelatedWork W3095615584 @default.
- W200068048 hasRelatedWork W3166274355 @default.
- W200068048 hasRelatedWork W3168396991 @default.
- W200068048 hasRelatedWork W3168810591 @default.
- W200068048 hasRelatedWork W3199269302 @default.
- W200068048 isParatext "false" @default.
- W200068048 isRetracted "false" @default.
- W200068048 magId "200068048" @default.
- W200068048 workType "book-chapter" @default.
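To make the abstract's two key ideas concrete, here is a minimal NumPy sketch of (a) fixed-point quantization at a configurable bit-width and (b) a piecewise linear approximation (PLA) of the sigmoid. The Q-format split, PLA breakpoints, and bit-widths are illustrative assumptions, not the paper's exact parameters; the mixed bit-width idea is shown by quantizing weights more finely than activations.

```python
# Illustrative sketch of the abstract's techniques; the Q-format split,
# PLA breakpoints, and bit-widths below are assumptions for demonstration,
# not the parameters evaluated in the paper.
import numpy as np

def quantize_fixed_point(x, total_bits, frac_bits):
    """Round x to signed fixed-point with `total_bits` bits,
    `frac_bits` of which are fractional, saturating on overflow."""
    scale = 2 ** frac_bits
    lo = -(2 ** (total_bits - 1))       # most negative raw code
    hi = 2 ** (total_bits - 1) - 1      # most positive raw code
    raw = np.clip(np.round(x * scale), lo, hi)
    return raw / scale

def sigmoid_pla(x):
    """One-segment piecewise linear approximation of the sigmoid
    (illustrative): linear ramp on [-2, 2], saturated outside."""
    return np.clip(0.25 * x + 0.5, 0.0, 1.0)

# Mixed bit-widths, as the abstract suggests: a narrower format for
# neural units (activations), a wider one for weights.
rng = np.random.default_rng(0)
w = quantize_fixed_point(rng.normal(size=(4, 3)), total_bits=16, frac_bits=12)
v = quantize_fixed_point(rng.uniform(size=3), total_bits=8, frac_bits=6)

pre_activation = w @ v  # multiply-accumulate, the operation FPGA DSP primitives provide
h = quantize_fixed_point(sigmoid_pla(pre_activation), total_bits=8, frac_bits=6)
print(h)
```

On a real FPGA one would keep the raw integer codes and shift after the accumulation; dividing by the scale here simply keeps the sketch in floating point for readability.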