Matches in SemOpenAlex for { <https://semopenalex.org/work/W2804383382> ?p ?o ?g. }
Showing items 1 to 69 of 69, with 100 items per page.
- W2804383382 abstract "In the paper, we consider the urgent need to create highly efficient hardware accelerators for machine learning algorithms, including convolutional and deep neural networks (CNNs and DNNs), as well as for associative memory models, clustering, and pattern recognition. These algorithms usually include a large number of multiply-accumulate (and similar) operations. We give a brief overview of our related work and of the advantages of equivalent models (EM) for describing and designing neural networks and bio-inspired recognition systems. The capacity of neural networks based on EM and its modifications, including auto- and hetero-associative memories for 2D images, is several times the number of neurons. Such neuroparadigms are very promising for processing, clustering, recognizing, and storing large, strongly correlated, and highly noised images. They are also very promising for solving the problem of unsupervised machine learning. Since the basic operational functional nodes of EM are vector-matrix or matrix-tensor procedures with such continuous-logic operations as normalized vector equivalence, nonequivalence, autoequivalence, and auto-nonequivalence, we consider in this paper new conceptual approaches to the design of full-scale arrays of such neuron-equivalentors (NEs) with extended functionality, including different activation functions. Our approach is based on the use of analog and mixed (with special coding) methods for implementing the required operations, building NEs (with 8 to 128 synapses and more) and their base cells and nodes based on photosensitive elements and CMOS current mirrors. We show the results of modeling the proposed new modular scalable implementations of NEs, estimate them, and compare them. Simulation results show that the processing time in such circuits does not exceed units of microseconds, and for some variants 50-100 nanoseconds. The circuits are simple, have a low supply voltage (1.5-3.3 V), low power consumption (milliwatts), and low input signal levels (microwatts), allow integrated construction, and satisfy the requirements of interconnection and cascading. Signals at the outputs of such neurons can be digital, analog, or hybrid, and the neurons can also have two complementary outputs. They realize the principle of dualism, which gives such complementary dual NEs a number of advantages." @default.
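The abstract refers to "normalized vector equivalence" and "nonequivalence" as the basic continuous-logic operations of the equivalent models. As a rough illustration only: a common continuous-logic definition of elementwise equivalence on [0, 1] is eq(x, y) = min(x, y) + min(1-x, 1-y), which simplifies to 1 - |x - y|; averaging it over a vector gives a normalized equivalence measure. This is a generic textbook formulation and may differ in detail from the exact operations the authors implement in their neuron-equivalentors.

```python
def normalized_equivalence(a, b):
    """Normalized continuous-logic equivalence of two vectors on [0, 1].

    Uses the common definition eq(x, y) = min(x, y) + min(1-x, 1-y)
    (equal to 1 - |x - y|), averaged over all components. This is an
    illustrative sketch, not necessarily the authors' exact operation.
    """
    assert len(a) == len(b) and len(a) > 0
    return sum(min(x, y) + min(1 - x, 1 - y) for x, y in zip(a, b)) / len(a)


def normalized_nonequivalence(a, b):
    """Complement of the normalized equivalence: mean of |x - y|."""
    return 1.0 - normalized_equivalence(a, b)


# Identical vectors are fully equivalent; complementary ones are not at all.
print(normalized_equivalence([0.2, 0.8, 0.5], [0.2, 0.8, 0.5]))  # 1.0
print(normalized_equivalence([0.0, 1.0], [1.0, 0.0]))            # 0.0
```

In an associative-memory setting, such a measure would score how closely a noisy input vector matches each stored pattern, with the hardware NEs computing the per-component min/sum terms in parallel.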
- W2804383382 created "2018-06-01" @default.
- W2804383382 creator A5000018615 @default.
- W2804383382 creator A5075366498 @default.
- W2804383382 creator A5090011327 @default.
- W2804383382 date "2018-05-21" @default.
- W2804383382 modified "2023-09-30" @default.
- W2804383382 title "Design and simulation of optoelectronic neuron equivalentors as hardware accelerators of self-learning equivalent convolutional neural structures (SLECNS)" @default.
- W2804383382 cites W1533908931 @default.
- W2804383382 cites W1999192586 @default.
- W2804383382 cites W2008887940 @default.
- W2804383382 cites W2112796928 @default.
- W2804383382 cites W2149992244 @default.
- W2804383382 cites W2794110395 @default.
- W2804383382 cites W4243519499 @default.
- W2804383382 doi "https://doi.org/10.1117/12.2316352" @default.
- W2804383382 hasPublicationYear "2018" @default.
- W2804383382 type Work @default.
- W2804383382 sameAs 2804383382 @default.
- W2804383382 citedByCount "0" @default.
- W2804383382 crossrefType "proceedings-article" @default.
- W2804383382 hasAuthorship W2804383382A5000018615 @default.
- W2804383382 hasAuthorship W2804383382A5075366498 @default.
- W2804383382 hasAuthorship W2804383382A5090011327 @default.
- W2804383382 hasConcept C113775141 @default.
- W2804383382 hasConcept C153180895 @default.
- W2804383382 hasConcept C154945302 @default.
- W2804383382 hasConcept C41008148 @default.
- W2804383382 hasConcept C50644808 @default.
- W2804383382 hasConcept C53442348 @default.
- W2804383382 hasConcept C73555534 @default.
- W2804383382 hasConcept C80444323 @default.
- W2804383382 hasConcept C81363708 @default.
- W2804383382 hasConceptScore W2804383382C113775141 @default.
- W2804383382 hasConceptScore W2804383382C153180895 @default.
- W2804383382 hasConceptScore W2804383382C154945302 @default.
- W2804383382 hasConceptScore W2804383382C41008148 @default.
- W2804383382 hasConceptScore W2804383382C50644808 @default.
- W2804383382 hasConceptScore W2804383382C53442348 @default.
- W2804383382 hasConceptScore W2804383382C73555534 @default.
- W2804383382 hasConceptScore W2804383382C80444323 @default.
- W2804383382 hasConceptScore W2804383382C81363708 @default.
- W2804383382 hasLocation W28043833821 @default.
- W2804383382 hasOpenAccess W2804383382 @default.
- W2804383382 hasPrimaryLocation W28043833821 @default.
- W2804383382 hasRelatedWork W133648044 @default.
- W2804383382 hasRelatedWork W1643571587 @default.
- W2804383382 hasRelatedWork W2004383590 @default.
- W2804383382 hasRelatedWork W2008887940 @default.
- W2804383382 hasRelatedWork W2091100861 @default.
- W2804383382 hasRelatedWork W2104683972 @default.
- W2804383382 hasRelatedWork W2320222525 @default.
- W2804383382 hasRelatedWork W2521822125 @default.
- W2804383382 hasRelatedWork W2530992305 @default.
- W2804383382 hasRelatedWork W2609872076 @default.
- W2804383382 hasRelatedWork W2743303586 @default.
- W2804383382 hasRelatedWork W2770155803 @default.
- W2804383382 hasRelatedWork W2781095682 @default.
- W2804383382 hasRelatedWork W2890495499 @default.
- W2804383382 hasRelatedWork W2946644855 @default.
- W2804383382 hasRelatedWork W3126449767 @default.
- W2804383382 hasRelatedWork W3200328751 @default.
- W2804383382 hasRelatedWork W3207084182 @default.
- W2804383382 hasRelatedWork W4220834261 @default.
- W2804383382 hasRelatedWork W965212114 @default.
- W2804383382 isParatext "false" @default.
- W2804383382 isRetracted "false" @default.
- W2804383382 magId "2804383382" @default.
- W2804383382 workType "article" @default.