Matches in SemOpenAlex for { <https://semopenalex.org/work/W2135776909> ?p ?o ?g. }
- W2135776909 endingPage "110" @default.
- W2135776909 startingPage "91" @default.
- W2135776909 abstract "In this article we present a methodology that partially pre-calculates the weight updates of the backpropagation learning regime and obtains high-accuracy function mapping. The paper shows how to implement neural units in a digital formulation that enables the weights to be quantised to 8 bits and the activations to 9 bits. A novel methodology is introduced to increase the accuracy of sigma-pi units by expanding their internal state space. We also introduce a novel means of implementing bit-streams in ring memories instead of shift registers. The investigation utilises digital higher-order sigma-pi nodes and studies continuous-input RAM-based sigma-pi units. The units are trained with the backpropagation learning regime to learn functions to a high accuracy. The neural model is the sigma-pi unit, which can be implemented in digital microelectronic technology. The ability to perform tasks that require the input of real-valued information is one of the central requirements of any cognitive system that utilises artificial neural network methodologies. In this article we present recent research that investigates a technique for mapping accurate real-valued functions to RAM-nets. One of our goals was to achieve accuracies of better than 1% for target output functions in the range Y ∈ [0,1]; this is equivalent to an average Mean Square Error (MSE) over all training vectors of 0.0001, or an error modulus of 0.01. We present a development of the sigma-pi node which enables the provision of high-accuracy outputs. The sigma-pi neural model was initially developed by Gurney (Learning in nets of structured hypercubes. PhD Thesis, Department of Electrical Engineering, Brunel University, Middlesex, UK, 1989; available as Technical Memo CN/R/144). Gurney's neuron model, the Time Integration Node (TIN), utilises an activation derived from a bit-stream.
In this article we present a new methodology for storing sigma-pi nodes' activations as single values which are averages. In the course of the article we state what we define as a real number and how we represent real numbers and continuous-valued inputs in our neural system. We show how to utilise the bounded, quantised site-values (weights) of sigma-pi nodes to make training of these neurocomputing systems simple, using pre-calculated look-up tables to train the nets. In order to meet our accuracy goal, we introduce a means of increasing the bandwidth capability of sigma-pi units by expanding their internal state-space. In our implementation we utilise bit-streams when we calculate the real-valued outputs of the net. To simplify the hardware implementation of bit-streams we present a method of mapping them to RAM-based hardware using 'ring memories'. Finally, we study the sigma-pi units' ability to generalise once they are trained to map real-valued, high-accuracy, continuous functions. We use sigma-pi units as they have been shown to have shorter training times than their analogue counterparts and can also overcome some of the drawbacks of semi-linear units (Gurney, 1992. Neural Networks, 5, 289-303)." @default.
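The accuracy goal stated in the abstract (error modulus 0.01 for outputs in [0,1], equivalent to an MSE of 0.0001) and the 8-bit weight quantisation can be sketched as follows. This is an illustrative sketch only, not the paper's implementation; the function names, the assumed weight range [-1, 1], and the uniform quantisation scheme are assumptions.

```python
# Illustrative sketch (not the paper's method): quantise a bounded weight to
# 8 bits (256 levels) and check the stated accuracy goal, MSE <= 0.0001 for
# targets in [0, 1], which corresponds to an error modulus of 0.01 (1% of range).

def quantise_8bit(w, w_min=-1.0, w_max=1.0):
    """Map a real-valued weight to the nearest of 256 uniform levels
    over an assumed bounded site-value range [w_min, w_max]."""
    levels = 255                            # 2**8 - 1 steps between endpoints
    w = min(max(w, w_min), w_max)           # clip to the bounded range
    step = (w_max - w_min) / levels
    return round((w - w_min) / step) * step + w_min

def mse(outputs, targets):
    """Average Mean Square Error over all training vectors."""
    return sum((o - t) ** 2 for o, t in zip(outputs, targets)) / len(outputs)

# A uniform error of 0.01 on every output yields an MSE of (0.01)**2 = 1e-4,
# i.e. exactly the boundary of the "better than 1%" accuracy goal:
targets = [0.1, 0.5, 0.9]
outputs = [t + 0.01 for t in targets]
print(mse(outputs, targets))                # ~1e-4 (up to floating point)
```

The quantisation error per weight is at most half a step, i.e. 1/255 of the range, which is why the paper must recover output accuracy elsewhere (by expanding the units' internal state space) rather than from weight precision alone.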
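The 'ring memory' idea, holding a bit-stream in a fixed RAM buffer with a wrapping read pointer instead of physically shifting a register, together with the single-value (average) activation the abstract describes, can be sketched as below. The class name and its interface are hypothetical illustrations, not taken from the paper.

```python
# Hypothetical sketch of a 'ring memory': the bit-stream sits in a fixed RAM
# buffer and a rotating read pointer emulates the shift of a shift register.
# Also shown: collapsing the stream to the single average-value activation.

class RingMemory:
    def __init__(self, bits):
        self.bits = list(bits)   # RAM contents: one bit per cell, never moved
        self.ptr = 0             # read pointer that wraps around the buffer

    def next_bit(self):
        """Read the next bit; advancing the pointer replaces shifting."""
        b = self.bits[self.ptr]
        self.ptr = (self.ptr + 1) % len(self.bits)
        return b

    def activation(self):
        """Average of the stream: a single stored value standing in for
        the raw bit-stream when computing real-valued outputs."""
        return sum(self.bits) / len(self.bits)

stream = RingMemory([1, 0, 1, 1, 0, 1, 1, 1])
print(stream.activation())       # 6/8 = 0.75
```

The design point is that no bits move: only the pointer increments, so a RAM block plus a counter serves where a dedicated shift register would otherwise be needed.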
- W2135776909 created "2016-06-24" @default.
- W2135776909 creator A5060218376 @default.
- W2135776909 creator A5063967078 @default.
- W2135776909 creator A5084522634 @default.
- W2135776909 date "2000-01-01" @default.
- W2135776909 modified "2023-10-16" @default.
- W2135776909 title "Partially pre-calculated weights for the backpropagation learning regime and high accuracy function mapping using continuous input RAM-based sigma–pi nets" @default.
- W2135776909 cites W129610485 @default.
- W2135776909 cites W149598276 @default.
- W2135776909 cites W1496940281 @default.
- W2135776909 cites W1566309885 @default.
- W2135776909 cites W18214825 @default.
- W2135776909 cites W1848942753 @default.
- W2135776909 cites W1971665073 @default.
- W2135776909 cites W1992675968 @default.
- W2135776909 cites W1995426156 @default.
- W2135776909 cites W1999476329 @default.
- W2135776909 cites W2000247974 @default.
- W2135776909 cites W2003849864 @default.
- W2135776909 cites W2009207660 @default.
- W2135776909 cites W2013101793 @default.
- W2135776909 cites W2021723443 @default.
- W2135776909 cites W2027406879 @default.
- W2135776909 cites W2038108866 @default.
- W2135776909 cites W2061191702 @default.
- W2135776909 cites W2080759927 @default.
- W2135776909 cites W2087480365 @default.
- W2135776909 cites W2088820996 @default.
- W2135776909 cites W2091565802 @default.
- W2135776909 cites W2094704691 @default.
- W2135776909 cites W2098247450 @default.
- W2135776909 cites W2126146865 @default.
- W2135776909 cites W2128233055 @default.
- W2135776909 cites W2130250615 @default.
- W2135776909 cites W2149518982 @default.
- W2135776909 cites W2161259239 @default.
- W2135776909 cites W2219077753 @default.
- W2135776909 cites W2274136587 @default.
- W2135776909 cites W2303146316 @default.
- W2135776909 cites W2328954291 @default.
- W2135776909 cites W2428525026 @default.
- W2135776909 cites W2468902298 @default.
- W2135776909 cites W2469322523 @default.
- W2135776909 cites W2747946841 @default.
- W2135776909 cites W2766736793 @default.
- W2135776909 cites W279108228 @default.
- W2135776909 cites W3121926921 @default.
- W2135776909 cites W397903816 @default.
- W2135776909 cites W408599159 @default.
- W2135776909 cites W58249740 @default.
- W2135776909 cites W62474489 @default.
- W2135776909 cites W2095480254 @default.
- W2135776909 cites W2739258691 @default.
- W2135776909 doi "https://doi.org/10.1016/s0893-6080(99)00102-1" @default.
- W2135776909 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/10935462" @default.
- W2135776909 hasPublicationYear "2000" @default.
- W2135776909 type Work @default.
- W2135776909 sameAs 2135776909 @default.
- W2135776909 citedByCount "12" @default.
- W2135776909 countsByYear W21357769092020 @default.
- W2135776909 crossrefType "journal-article" @default.
- W2135776909 hasAuthorship W2135776909A5060218376 @default.
- W2135776909 hasAuthorship W2135776909A5063967078 @default.
- W2135776909 hasAuthorship W2135776909A5084522634 @default.
- W2135776909 hasConcept C105795698 @default.
- W2135776909 hasConcept C11413529 @default.
- W2135776909 hasConcept C121332964 @default.
- W2135776909 hasConcept C127413603 @default.
- W2135776909 hasConcept C139945424 @default.
- W2135776909 hasConcept C14036430 @default.
- W2135776909 hasConcept C154945302 @default.
- W2135776909 hasConcept C155032097 @default.
- W2135776909 hasConcept C2778049214 @default.
- W2135776909 hasConcept C33923547 @default.
- W2135776909 hasConcept C41008148 @default.
- W2135776909 hasConcept C50644808 @default.
- W2135776909 hasConcept C62520636 @default.
- W2135776909 hasConcept C62611344 @default.
- W2135776909 hasConcept C66938386 @default.
- W2135776909 hasConcept C78458016 @default.
- W2135776909 hasConcept C86803240 @default.
- W2135776909 hasConceptScore W2135776909C105795698 @default.
- W2135776909 hasConceptScore W2135776909C11413529 @default.
- W2135776909 hasConceptScore W2135776909C121332964 @default.
- W2135776909 hasConceptScore W2135776909C127413603 @default.
- W2135776909 hasConceptScore W2135776909C139945424 @default.
- W2135776909 hasConceptScore W2135776909C14036430 @default.
- W2135776909 hasConceptScore W2135776909C154945302 @default.
- W2135776909 hasConceptScore W2135776909C155032097 @default.
- W2135776909 hasConceptScore W2135776909C2778049214 @default.
- W2135776909 hasConceptScore W2135776909C33923547 @default.
- W2135776909 hasConceptScore W2135776909C41008148 @default.
- W2135776909 hasConceptScore W2135776909C50644808 @default.
- W2135776909 hasConceptScore W2135776909C62520636 @default.
- W2135776909 hasConceptScore W2135776909C62611344 @default.
- W2135776909 hasConceptScore W2135776909C66938386 @default.
- W2135776909 hasConceptScore W2135776909C78458016 @default.