Matches in SemOpenAlex for { <https://semopenalex.org/work/W142302016> ?p ?o ?g. }
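The matches listed below can be retrieved programmatically from the same quad pattern. A minimal Python sketch follows, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql and the SPARQLWrapper package (both are assumptions of this sketch, not stated in the listing); the `?g` position of the pattern is rewritten as a standard GRAPH clause.

```python
from SPARQLWrapper import SPARQLWrapper, JSON

# Assumed public endpoint; adjust if the service is exposed elsewhere.
ENDPOINT = "https://semopenalex.org/sparql"

# The quad pattern { <work> ?p ?o ?g . } expressed with a GRAPH clause.
QUERY = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W142302016> ?p ?o .
  }
}
LIMIT 500
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(QUERY)
sparql.setReturnFormat(JSON)

for row in sparql.query().convert()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"], row.get("g", {}).get("value", ""))
```

Each printed row corresponds to one bullet in the listing below.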
- W142302016 abstract "Feedforward neural networks are massively parallel computing structures that have the capability of universal function approximation. The most prevalent realisation of neural nets is in the form of an algorithm implemented in a computer program. Neural networks as computer programs lose the inherent parallelism. Parallelism can only be recovered by executing the program on an expensive parallel digital computer. Achieving the inherent massive parallelism at a lower cost requires direct hardware realisation of the neural net. Such hardware, called the Local Cluster Neural Network (LCNN) chip, has been developed jointly by QUT and the Heinz Nixdorf Institute (Germany). However, this neural net chip lacks the capability of in-circuit learning or on-chip training. The weights for the analogue LCNN network have to be computed off chip on a digital computer. Based on the previous work, this research focuses on the Local Cluster Neural Network and its analogue chip. The characteristics of the LCNN chip were measured exhaustively and its behaviour was compared to the theoretical functionality of the LCNN. To overcome the manufacturing fluctuations and deviations present in analogue circuits, we used a chip-in-the-loop strategy for training the LCNN chip. A new training algorithm, Probabilistic Random Weight Change, is proposed for chip-in-the-loop training for function approximation. In order to implement the LCNN analogue chip with on-chip training, two training algorithms are studied in on-line training mode in simulations: the Probabilistic Random Weight Change (PRWC) algorithm and the modified Gradient Descent (GD) algorithm. The circuit designs for PRWC on-chip training and GD on-chip training are outlined. These two methods are compared for their training performance and the complexity of their circuits. This research provides the foundation for the next version of the LCNN analogue hardware implementation." @default.
- W142302016 created "2016-06-24" @default.
- W142302016 creator A5066599847 @default.
- W142302016 date "2007-01-01" @default.
- W142302016 modified "2023-09-22" @default.
- W142302016 title "Parallel training algorithms for analogue hardware neural nets" @default.
- W142302016 cites W1559271389 @default.
- W142302016 cites W1586906451 @default.
- W142302016 cites W1870868961 @default.
- W142302016 cites W1971735090 @default.
- W142302016 cites W1989730800 @default.
- W142302016 cites W1995341919 @default.
- W142302016 cites W2024060531 @default.
- W142302016 cites W2033339351 @default.
- W142302016 cites W2055789736 @default.
- W142302016 cites W2071774663 @default.
- W142302016 cites W2096634070 @default.
- W142302016 cites W2103496339 @default.
- W142302016 cites W2107725879 @default.
- W142302016 cites W2109302642 @default.
- W142302016 cites W2119637323 @default.
- W142302016 cites W2121423846 @default.
- W142302016 cites W2132211083 @default.
- W142302016 cites W2133669639 @default.
- W142302016 cites W2133671888 @default.
- W142302016 cites W2135131646 @default.
- W142302016 cites W2137983211 @default.
- W142302016 cites W2142592520 @default.
- W142302016 cites W2143956139 @default.
- W142302016 cites W2148639079 @default.
- W142302016 cites W2150535417 @default.
- W142302016 cites W2161209569 @default.
- W142302016 cites W2161574926 @default.
- W142302016 cites W2164845947 @default.
- W142302016 cites W2167232388 @default.
- W142302016 cites W2171050036 @default.
- W142302016 cites W2171277043 @default.
- W142302016 cites W2322002063 @default.
- W142302016 cites W2491207981 @default.
- W142302016 cites W369831446 @default.
- W142302016 cites W599808240 @default.
- W142302016 hasPublicationYear "2007" @default.
- W142302016 type Work @default.
- W142302016 sameAs 142302016 @default.
- W142302016 citedByCount "0" @default.
- W142302016 crossrefType "dissertation" @default.
- W142302016 hasAuthorship W142302016A5066599847 @default.
- W142302016 hasConcept C113775141 @default.
- W142302016 hasConcept C11413529 @default.
- W142302016 hasConcept C127413603 @default.
- W142302016 hasConcept C133731056 @default.
- W142302016 hasConcept C153258448 @default.
- W142302016 hasConcept C154945302 @default.
- W142302016 hasConcept C165005293 @default.
- W142302016 hasConcept C173608175 @default.
- W142302016 hasConcept C190475519 @default.
- W142302016 hasConcept C38858127 @default.
- W142302016 hasConcept C41008148 @default.
- W142302016 hasConcept C50644808 @default.
- W142302016 hasConcept C76155785 @default.
- W142302016 hasConcept C9390403 @default.
- W142302016 hasConceptScore W142302016C113775141 @default.
- W142302016 hasConceptScore W142302016C11413529 @default.
- W142302016 hasConceptScore W142302016C127413603 @default.
- W142302016 hasConceptScore W142302016C133731056 @default.
- W142302016 hasConceptScore W142302016C153258448 @default.
- W142302016 hasConceptScore W142302016C154945302 @default.
- W142302016 hasConceptScore W142302016C165005293 @default.
- W142302016 hasConceptScore W142302016C173608175 @default.
- W142302016 hasConceptScore W142302016C190475519 @default.
- W142302016 hasConceptScore W142302016C38858127 @default.
- W142302016 hasConceptScore W142302016C41008148 @default.
- W142302016 hasConceptScore W142302016C50644808 @default.
- W142302016 hasConceptScore W142302016C76155785 @default.
- W142302016 hasConceptScore W142302016C9390403 @default.
- W142302016 hasLocation W1423020161 @default.
- W142302016 hasOpenAccess W142302016 @default.
- W142302016 hasPrimaryLocation W1423020161 @default.
- W142302016 hasRelatedWork W1031026794 @default.
- W142302016 hasRelatedWork W1525333077 @default.
- W142302016 hasRelatedWork W173022878 @default.
- W142302016 hasRelatedWork W2039867084 @default.
- W142302016 hasRelatedWork W2101034816 @default.
- W142302016 hasRelatedWork W2501540368 @default.
- W142302016 hasRelatedWork W2735040266 @default.
- W142302016 hasRelatedWork W2739310270 @default.
- W142302016 hasRelatedWork W2755219797 @default.
- W142302016 hasRelatedWork W2768050893 @default.
- W142302016 hasRelatedWork W2786635218 @default.
- W142302016 hasRelatedWork W2911355096 @default.
- W142302016 hasRelatedWork W2995582687 @default.
- W142302016 hasRelatedWork W2998561782 @default.
- W142302016 hasRelatedWork W3008608336 @default.
- W142302016 hasRelatedWork W3023503384 @default.
- W142302016 hasRelatedWork W3042184340 @default.
- W142302016 hasRelatedWork W3092443051 @default.
- W142302016 hasRelatedWork W3128540131 @default.
- W142302016 hasRelatedWork W3198877855 @default.
- W142302016 isParatext "false" @default.
- W142302016 isRetracted "false" @default.
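The abstract above mentions chip-in-the-loop training with a Probabilistic Random Weight Change (PRWC) algorithm. The thesis's exact formulation is not given in this listing, so the following is only a minimal sketch of the general random-weight-change idea under stated assumptions: a software stand-in `chip_forward` in place of the analogue chip, a fixed perturbation magnitude `delta`, and a simple keep-or-rerandomize probabilistic rule. None of these names or values are the author's.

```python
import numpy as np

rng = np.random.default_rng(0)

def chip_forward(weights, x):
    """Stand-in for evaluating the analogue LCNN chip with the given weights.
    In chip-in-the-loop training this call would drive the real hardware;
    here it is a placeholder (a tiny tanh network) for illustration only."""
    n_hidden = 8
    w1 = weights[:n_hidden].reshape(n_hidden, 1)
    w2 = weights[n_hidden:].reshape(1, n_hidden)
    return (w2 @ np.tanh(w1 @ x.reshape(1, -1))).ravel()

def mse(weights, xs, ys):
    return float(np.mean((chip_forward(weights, xs) - ys) ** 2))

def prwc_train(xs, ys, n_weights=16, delta=0.02, p_keep=0.8, iters=5000):
    """Sketch of a probabilistic random-weight-change loop: perturb all weights
    by +/- delta, accept the step if the measured error drops, and with
    probability p_keep reuse the same perturbation direction on the next step."""
    w = rng.uniform(-0.5, 0.5, n_weights)
    err = mse(w, xs, ys)
    dw = delta * rng.choice([-1.0, 1.0], n_weights)
    for _ in range(iters):
        trial_err = mse(w + dw, xs, ys)
        if trial_err < err:            # improvement: accept the step
            w, err = w + dw, trial_err
            if rng.random() > p_keep:  # occasionally re-randomize anyway
                dw = delta * rng.choice([-1.0, 1.0], n_weights)
        else:                          # no improvement: draw a new direction
            dw = delta * rng.choice([-1.0, 1.0], n_weights)
    return w, err

# Toy function-approximation target, purely for illustration.
xs = np.linspace(-1.0, 1.0, 50)
ys = np.sin(np.pi * xs)
w, err = prwc_train(xs, ys)
print(f"final training MSE: {err:.4f}")
```

Because the error is measured on the (simulated) chip output rather than computed from an analytic model, the same loop structure applies when the forward pass is replaced by measurements of the real analogue hardware, which is the point of the chip-in-the-loop strategy described in the abstract.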