Matches in SemOpenAlex for { <https://semopenalex.org/work/W4386475999> ?p ?o ?g. }
- W4386475999 endingPage "7701" @default.
- W4386475999 startingPage "7701" @default.
- W4386475999 abstract "Over the past decade, deep neural networks have been widely adopted across many applications in the artificial neural network domain. However, deep neural networks are typically computationally complex and consume high power, hindering their applicability for resource-constrained applications such as self-driving vehicles, drones, and robotics. Spiking neural networks, often employed to bridge the gap between the machine learning and neuroscience fields, are considered a promising solution for resource-constrained applications. Since deploying spiking neural networks on traditional von Neumann architectures requires significant processing time and high power, neuromorphic hardware is typically created to execute spiking neural networks. The objective of neuromorphic devices is to mimic the distinctive functionalities of the human brain in terms of energy efficiency, computational power, and robust learning. Furthermore, natural language processing, a machine learning technique, has been widely utilized to aid machines in comprehending human language. However, natural language processing techniques also cannot be deployed efficiently on traditional computing platforms. In this research work, we strive to enhance natural language processing capabilities by harnessing and integrating SNN traits, and to deploy the integrated solution on neuromorphic hardware efficiently and effectively. To facilitate this endeavor, we propose a novel and efficient sentiment analysis model, created using a large-scale SNN model on SpiNNaker neuromorphic hardware, that responds to user inputs. SpiNNaker neuromorphic hardware can typically simulate large spiking neural networks in real time while consuming low power. We initially create an artificial neural network model, and then train the model using the Internet Movie Database (IMDB) dataset. 
Next, the pre-trained artificial neural network model is converted into our proposed spiking neural network model, called the spiking sentiment analysis (SSA) model. Our SSA model on SpiNNaker, called SSA-SpiNNaker, is designed to respond to user inputs with a positive or negative response. Our proposed SSA-SpiNNaker model achieves 100% accuracy and consumes only 3970 joules of energy while processing around 10,000 words and predicting a positive/negative review. Our experimental results and analysis demonstrate that, by leveraging the parallel and distributed capabilities of SpiNNaker, our proposed SSA-SpiNNaker model achieves better performance than artificial neural network models. Our investigation into existing works revealed that no similar models exist in the published literature, demonstrating the uniqueness of our proposed model. Our proposed work offers a synergy between SNNs and NLP within the neuromorphic computing domain, in order to address many challenges in this domain, including computational complexity and power consumption. Our proposed model would not only enhance the capabilities of sentiment analysis but also contribute to the advancement of brain-inspired computing. Our proposed model could be utilized in other resource-constrained, low-power applications, such as robotics, autonomous systems, and smart systems." @default.
- W4386475999 created "2023-09-07" @default.
- W4386475999 creator A5087927873 @default.
- W4386475999 creator A5091607086 @default.
- W4386475999 date "2023-09-06" @default.
- W4386475999 modified "2023-10-17" @default.
- W4386475999 title "Neuromorphic Sentiment Analysis Using Spiking Neural Networks" @default.
- W4386475999 cites W101771737 @default.
- W4386475999 cites W1497599289 @default.
- W4386475999 cites W1570411240 @default.
- W4386475999 cites W159193778 @default.
- W4386475999 cites W1593079125 @default.
- W4386475999 cites W1607240981 @default.
- W4386475999 cites W1645800954 @default.
- W4386475999 cites W1967491419 @default.
- W4386475999 cites W1985940938 @default.
- W4386475999 cites W2000552960 @default.
- W4386475999 cites W2007130816 @default.
- W4386475999 cites W2109596721 @default.
- W4386475999 cites W2112683930 @default.
- W4386475999 cites W2130360162 @default.
- W4386475999 cites W2140886546 @default.
- W4386475999 cites W2153041354 @default.
- W4386475999 cites W2157239334 @default.
- W4386475999 cites W2159951683 @default.
- W4386475999 cites W2163630896 @default.
- W4386475999 cites W2164653071 @default.
- W4386475999 cites W2313320147 @default.
- W4386475999 cites W2316299058 @default.
- W4386475999 cites W2482047908 @default.
- W4386475999 cites W2513148968 @default.
- W4386475999 cites W2548021123 @default.
- W4386475999 cites W2548549368 @default.
- W4386475999 cites W2591049338 @default.
- W4386475999 cites W2775079417 @default.
- W4386475999 cites W2775855651 @default.
- W4386475999 cites W2783525259 @default.
- W4386475999 cites W2790788952 @default.
- W4386475999 cites W2800613970 @default.
- W4386475999 cites W2888850715 @default.
- W4386475999 cites W2894792791 @default.
- W4386475999 cites W2896254771 @default.
- W4386475999 cites W2898350988 @default.
- W4386475999 cites W2963157821 @default.
- W4386475999 cites W2964338223 @default.
- W4386475999 cites W2970585112 @default.
- W4386475999 cites W2974328520 @default.
- W4386475999 cites W3006542811 @default.
- W4386475999 cites W3090409650 @default.
- W4386475999 cites W3101210313 @default.
- W4386475999 cites W3102944297 @default.
- W4386475999 cites W3112669719 @default.
- W4386475999 cites W3153113868 @default.
- W4386475999 cites W3169689544 @default.
- W4386475999 cites W3199545630 @default.
- W4386475999 cites W3201524232 @default.
- W4386475999 cites W4223993587 @default.
- W4386475999 cites W4229456824 @default.
- W4386475999 cites W4238614602 @default.
- W4386475999 cites W4239107643 @default.
- W4386475999 cites W4283749998 @default.
- W4386475999 cites W4311224401 @default.
- W4386475999 cites W4362608768 @default.
- W4386475999 doi "https://doi.org/10.3390/s23187701" @default.
- W4386475999 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/37765758" @default.
- W4386475999 hasPublicationYear "2023" @default.
- W4386475999 type Work @default.
- W4386475999 citedByCount "0" @default.
- W4386475999 crossrefType "journal-article" @default.
- W4386475999 hasAuthorship W4386475999A5087927873 @default.
- W4386475999 hasAuthorship W4386475999A5091607086 @default.
- W4386475999 hasBestOaLocation W43864759991 @default.
- W4386475999 hasConcept C108583219 @default.
- W4386475999 hasConcept C11731999 @default.
- W4386475999 hasConcept C118524514 @default.
- W4386475999 hasConcept C119857082 @default.
- W4386475999 hasConcept C151927369 @default.
- W4386475999 hasConcept C15286952 @default.
- W4386475999 hasConcept C154945302 @default.
- W4386475999 hasConcept C34413123 @default.
- W4386475999 hasConcept C41008148 @default.
- W4386475999 hasConcept C50644808 @default.
- W4386475999 hasConcept C90509273 @default.
- W4386475999 hasConceptScore W4386475999C108583219 @default.
- W4386475999 hasConceptScore W4386475999C11731999 @default.
- W4386475999 hasConceptScore W4386475999C118524514 @default.
- W4386475999 hasConceptScore W4386475999C119857082 @default.
- W4386475999 hasConceptScore W4386475999C151927369 @default.
- W4386475999 hasConceptScore W4386475999C15286952 @default.
- W4386475999 hasConceptScore W4386475999C154945302 @default.
- W4386475999 hasConceptScore W4386475999C34413123 @default.
- W4386475999 hasConceptScore W4386475999C41008148 @default.
- W4386475999 hasConceptScore W4386475999C50644808 @default.
- W4386475999 hasConceptScore W4386475999C90509273 @default.
- W4386475999 hasIssue "18" @default.
- W4386475999 hasLocation W43864759991 @default.
- W4386475999 hasLocation W43864759992 @default.
- W4386475999 hasOpenAccess W4386475999 @default.
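A match list like the one above can be retrieved programmatically. The sketch below builds a standard-SPARQL equivalent of the quad pattern `{ <work> ?p ?o ?g. }` (using an explicit `GRAPH` clause) and posts it to the SemOpenAlex SPARQL endpoint; the endpoint URL is an assumption here and should be checked against the SemOpenAlex service documentation.

```python
import urllib.parse
import urllib.request

# Assumed SPARQL endpoint URL for SemOpenAlex; verify against the service docs.
ENDPOINT = "https://semopenalex.org/sparql"


def build_query(work_iri: str) -> str:
    """Standard-SPARQL form of the quad pattern { <work> ?p ?o ?g. }:
    select every predicate, object, and named graph for the given work."""
    return (
        "SELECT ?p ?o ?g WHERE { "
        f"GRAPH ?g {{ <{work_iri}> ?p ?o }} "
        "}"
    )


def fetch_matches(work_iri: str) -> bytes:
    """POST the query to the endpoint and return the raw JSON result body."""
    data = urllib.parse.urlencode({"query": build_query(work_iri)}).encode()
    req = urllib.request.Request(
        ENDPOINT,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()


if __name__ == "__main__":
    # Print the query for the work listed above.
    print(build_query("https://semopenalex.org/work/W4386475999"))
```

Each binding in the JSON response corresponds to one `W4386475999 <predicate> <object>` line of the listing, with `?g` naming the graph (shown here as `@default`).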