Matches in SemOpenAlex for { <https://semopenalex.org/work/W4361861240> ?p ?o ?g. }
Showing items 1 to 76 of 76, with 100 items per page.
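The listing below corresponds to a single triple-pattern SPARQL query against SemOpenAlex. As a sketch, such a query could be built programmatically; the endpoint URL is an assumption based on SemOpenAlex's public SPARQL service, and the snippet only constructs the request URL (it does not fetch it):

```python
from urllib.parse import urlencode

# Triple pattern matching everything asserted about one work,
# mirroring the pattern shown in the listing header.
entity = "https://semopenalex.org/work/W4361861240"
query = f"SELECT ?p ?o WHERE {{ <{entity}> ?p ?o . }}"

# Assumed endpoint for the SemOpenAlex public SPARQL service.
endpoint = "https://semopenalex.org/sparql"
url = endpoint + "?" + urlencode({"query": query, "format": "json"})

# The URL could then be fetched, e.g. with urllib.request.urlopen(url).
print(url)
```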
- W4361861240 endingPage "1" @default.
- W4361861240 startingPage "1" @default.
- W4361861240 abstract "Emotions are states of readiness in the mind that result from evaluations of one's own thinking or of external events. Although almost all of the important events in our lives are marked by emotions, the nature, causes, and effects of emotions are among the least understood aspects of the human experience. Emotion recognition plays a promising role in the domains of human-computer interaction and artificial intelligence. A human's emotions can be detected using a variety of signals, including facial gestures, blood pressure, body movements, heart rate, and textual data. From an application standpoint, the ability to identify human emotions in text is becoming increasingly crucial in computational linguistics. In this work, we present a classification methodology based on deep neural networks. The Bi-directional Gated Recurrent Unit (Bi-GRU) employed here demonstrates its effectiveness on the Multimodal EmotionLines Dataset (MELD) when compared to Convolutional Neural Networks (CNN) and Long Short-Term Memory (LSTM). For word encoding, three pre-trained word embeddings, namely GloVe, Word2Vec, and fastText, are compared. The findings from the MELD corpus support the conclusion that fastText is the best word embedding for the proposed Bi-GRU model. The experiment utilized 300-dimensional vector spaces: glove.6B.300d for GloVe and, for fastText, a vector space of two million word representations trained on Common Crawl (600 billion tokens) with sub-word information. The accuracy scores of GloVe, Word2Vec, and fastText (300 dimensions each) are tabulated and studied in order to highlight the improved results with fastText on the MELD dataset. It is observed that the Bi-GRU with fastText word embedding outperforms GloVe and Word2Vec with an accuracy of 79.7%." @default.
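The Bi-GRU described in the abstract reads a sequence of embedding vectors once forward and once backward, then concatenates the two final hidden states into a single feature vector for classification. A minimal NumPy sketch of that forward pass follows; the dimensions, parameter names, and random weights are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_layer(xs, Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh):
    """Run a single-direction GRU over a sequence of input vectors."""
    h = np.zeros(bz.shape)
    for x in xs:
        z = sigmoid(Wz @ x + Uz @ h + bz)              # update gate
        r = sigmoid(Wr @ x + Ur @ h + br)              # reset gate
        h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)  # candidate state
        h = (1 - z) * h + z * h_tilde                  # interpolate old/new state
    return h

def bi_gru(xs, fwd_params, bwd_params):
    """Concatenate the final states of a forward and a backward GRU pass."""
    h_fwd = gru_layer(xs, *fwd_params)
    h_bwd = gru_layer(xs[::-1], *bwd_params)
    return np.concatenate([h_fwd, h_bwd])

# Toy sizes: 300-d inputs (matching the paper's embedding dimension),
# 8 hidden units, a 5-token sequence. Weights are random for illustration.
rng = np.random.default_rng(0)
d_in, d_h, T = 300, 8, 5

def make_params():
    return (rng.normal(size=(d_h, d_in)) * 0.1, rng.normal(size=(d_h, d_h)) * 0.1, np.zeros(d_h),
            rng.normal(size=(d_h, d_in)) * 0.1, rng.normal(size=(d_h, d_h)) * 0.1, np.zeros(d_h),
            rng.normal(size=(d_h, d_in)) * 0.1, rng.normal(size=(d_h, d_h)) * 0.1, np.zeros(d_h))

xs = [rng.normal(size=d_in) for _ in range(T)]
features = bi_gru(xs, make_params(), make_params())  # shape (2 * d_h,)
```

In a full classifier these concatenated features would feed a softmax layer over the MELD emotion labels; here the sketch only illustrates the bidirectional recurrence itself.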
- W4361861240 created "2023-04-05" @default.
- W4361861240 creator A5018397805 @default.
- W4361861240 creator A5032981880 @default.
- W4361861240 creator A5051140533 @default.
- W4361861240 creator A5059221699 @default.
- W4361861240 creator A5069334579 @default.
- W4361861240 creator A5081878874 @default.
- W4361861240 date "2022-01-01" @default.
- W4361861240 modified "2023-09-27" @default.
- W4361861240 title "Text emotion recognition using fast text word embedding in bi-directional gated recurrent unit" @default.
- W4361861240 cites W2062632672 @default.
- W4361861240 cites W2095234413 @default.
- W4361861240 cites W2523148522 @default.
- W4361861240 cites W2741447225 @default.
- W4361861240 cites W2801511296 @default.
- W4361861240 cites W2963199188 @default.
- W4361861240 cites W3019220897 @default.
- W4361861240 cites W3201154085 @default.
- W4361861240 cites W4293261951 @default.
- W4361861240 cites W4294367149 @default.
- W4361861240 doi "https://doi.org/10.26634/jit.11.4.19119" @default.
- W4361861240 hasPublicationYear "2022" @default.
- W4361861240 type Work @default.
- W4361861240 citedByCount "0" @default.
- W4361861240 crossrefType "journal-article" @default.
- W4361861240 hasAuthorship W4361861240A5018397805 @default.
- W4361861240 hasAuthorship W4361861240A5032981880 @default.
- W4361861240 hasAuthorship W4361861240A5051140533 @default.
- W4361861240 hasAuthorship W4361861240A5059221699 @default.
- W4361861240 hasAuthorship W4361861240A5069334579 @default.
- W4361861240 hasAuthorship W4361861240A5081878874 @default.
- W4361861240 hasConcept C138885662 @default.
- W4361861240 hasConcept C154945302 @default.
- W4361861240 hasConcept C204321447 @default.
- W4361861240 hasConcept C207347870 @default.
- W4361861240 hasConcept C2776461190 @default.
- W4361861240 hasConcept C2777462759 @default.
- W4361861240 hasConcept C28490314 @default.
- W4361861240 hasConcept C41008148 @default.
- W4361861240 hasConcept C41608201 @default.
- W4361861240 hasConcept C41895202 @default.
- W4361861240 hasConcept C81363708 @default.
- W4361861240 hasConcept C90805587 @default.
- W4361861240 hasConceptScore W4361861240C138885662 @default.
- W4361861240 hasConceptScore W4361861240C154945302 @default.
- W4361861240 hasConceptScore W4361861240C204321447 @default.
- W4361861240 hasConceptScore W4361861240C207347870 @default.
- W4361861240 hasConceptScore W4361861240C2776461190 @default.
- W4361861240 hasConceptScore W4361861240C2777462759 @default.
- W4361861240 hasConceptScore W4361861240C28490314 @default.
- W4361861240 hasConceptScore W4361861240C41008148 @default.
- W4361861240 hasConceptScore W4361861240C41608201 @default.
- W4361861240 hasConceptScore W4361861240C41895202 @default.
- W4361861240 hasConceptScore W4361861240C81363708 @default.
- W4361861240 hasConceptScore W4361861240C90805587 @default.
- W4361861240 hasIssue "4" @default.
- W4361861240 hasLocation W43618612401 @default.
- W4361861240 hasOpenAccess W4361861240 @default.
- W4361861240 hasPrimaryLocation W43618612401 @default.
- W4361861240 hasRelatedWork W2747424680 @default.
- W4361861240 hasRelatedWork W2891550009 @default.
- W4361861240 hasRelatedWork W2952874106 @default.
- W4361861240 hasRelatedWork W2953749697 @default.
- W4361861240 hasRelatedWork W3036348210 @default.
- W4361861240 hasRelatedWork W3046869600 @default.
- W4361861240 hasRelatedWork W4246455502 @default.
- W4361861240 hasRelatedWork W4298857951 @default.
- W4361861240 hasRelatedWork W4313247739 @default.
- W4361861240 hasRelatedWork W4313384562 @default.
- W4361861240 hasVolume "11" @default.
- W4361861240 isParatext "false" @default.
- W4361861240 isRetracted "false" @default.
- W4361861240 workType "article" @default.