Matches in SemOpenAlex for { <https://semopenalex.org/work/W4367011707> ?p ?o ?g. } (a query sketch and two model sketches follow the listing below)
- W4367011707 endingPage "109" @default.
- W4367011707 startingPage "99" @default.
- W4367011707 abstract "Understanding human emotion is vital to communicate effectively with others, monitor patients, analyse behaviour, and keep an eye on those who are vulnerable. Emotion recognition is essential to achieve a complete human-machine interoperability experience. Artificial intelligence, mainly machine learning (ML), have been used in recent years to improve the model for recognising emotions from a single type of data. A multimodal system has been proposed that uses text, facial expressions, and speech signals to identify emotions in this work. The MobileNet architecture is used to predict emotion from facial expressions, and different ML classifiers are used to predict emotion from text and speech signals in the proposed model. The Facial Expression Recognition 2013 (FER2013) dataset has been used to recognise emotion from facial expressions, whilst the Interactive Emotional Dyadic Motion Capture (IEMOCAP) dataset was used for both text and speech emotion recognition. The proposed ensemble technique consisting of random forest, extreme gradient boosting, and multi-layer perceptron achieves an accuracy of 70.67%, which is better than the unimodal approaches used." @default.
- W4367011707 created "2023-04-27" @default.
- W4367011707 creator A5012611827 @default.
- W4367011707 creator A5019603391 @default.
- W4367011707 creator A5024256887 @default.
- W4367011707 creator A5027525633 @default.
- W4367011707 creator A5034725408 @default.
- W4367011707 creator A5057247142 @default.
- W4367011707 date "2023-01-01" @default.
- W4367011707 modified "2023-09-30" @default.
- W4367011707 title "Towards Machine Learning-Based Emotion Recognition from Multimodal Data" @default.
- W4367011707 cites W2707551695 @default.
- W4367011707 cites W2747664154 @default.
- W4367011707 cites W2757176511 @default.
- W4367011707 cites W2768956845 @default.
- W4367011707 cites W2790854021 @default.
- W4367011707 cites W2793425397 @default.
- W4367011707 cites W2796830519 @default.
- W4367011707 cites W2810418809 @default.
- W4367011707 cites W2908671501 @default.
- W4367011707 cites W2921997955 @default.
- W4367011707 cites W2953590561 @default.
- W4367011707 cites W2961723205 @default.
- W4367011707 cites W2963800675 @default.
- W4367011707 cites W2973110385 @default.
- W4367011707 cites W3089526972 @default.
- W4367011707 cites W3090519764 @default.
- W4367011707 cites W3091643389 @default.
- W4367011707 cites W3091860120 @default.
- W4367011707 cites W3127786468 @default.
- W4367011707 cites W3151064326 @default.
- W4367011707 cites W3159597990 @default.
- W4367011707 cites W3159875375 @default.
- W4367011707 cites W3176221632 @default.
- W4367011707 cites W3184416411 @default.
- W4367011707 cites W3186593818 @default.
- W4367011707 cites W3192818153 @default.
- W4367011707 cites W3194074498 @default.
- W4367011707 cites W3199297386 @default.
- W4367011707 cites W3199364093 @default.
- W4367011707 cites W3217064948 @default.
- W4367011707 cites W4200166643 @default.
- W4367011707 cites W4206440813 @default.
- W4367011707 cites W4206979879 @default.
- W4367011707 cites W4207012013 @default.
- W4367011707 cites W4226267190 @default.
- W4367011707 doi "https://doi.org/10.1007/978-981-19-5191-6_9" @default.
- W4367011707 hasPublicationYear "2023" @default.
- W4367011707 type Work @default.
- W4367011707 citedByCount "0" @default.
- W4367011707 crossrefType "book-chapter" @default.
- W4367011707 hasAuthorship W4367011707A5012611827 @default.
- W4367011707 hasAuthorship W4367011707A5019603391 @default.
- W4367011707 hasAuthorship W4367011707A5024256887 @default.
- W4367011707 hasAuthorship W4367011707A5027525633 @default.
- W4367011707 hasAuthorship W4367011707A5034725408 @default.
- W4367011707 hasAuthorship W4367011707A5057247142 @default.
- W4367011707 hasConcept C119857082 @default.
- W4367011707 hasConcept C12267149 @default.
- W4367011707 hasConcept C154945302 @default.
- W4367011707 hasConcept C169258074 @default.
- W4367011707 hasConcept C179717631 @default.
- W4367011707 hasConcept C195704467 @default.
- W4367011707 hasConcept C206310091 @default.
- W4367011707 hasConcept C2777438025 @default.
- W4367011707 hasConcept C28490314 @default.
- W4367011707 hasConcept C2988148770 @default.
- W4367011707 hasConcept C41008148 @default.
- W4367011707 hasConcept C45942800 @default.
- W4367011707 hasConcept C46686674 @default.
- W4367011707 hasConcept C50644808 @default.
- W4367011707 hasConcept C60908668 @default.
- W4367011707 hasConcept C6438553 @default.
- W4367011707 hasConceptScore W4367011707C119857082 @default.
- W4367011707 hasConceptScore W4367011707C12267149 @default.
- W4367011707 hasConceptScore W4367011707C154945302 @default.
- W4367011707 hasConceptScore W4367011707C169258074 @default.
- W4367011707 hasConceptScore W4367011707C179717631 @default.
- W4367011707 hasConceptScore W4367011707C195704467 @default.
- W4367011707 hasConceptScore W4367011707C206310091 @default.
- W4367011707 hasConceptScore W4367011707C2777438025 @default.
- W4367011707 hasConceptScore W4367011707C28490314 @default.
- W4367011707 hasConceptScore W4367011707C2988148770 @default.
- W4367011707 hasConceptScore W4367011707C41008148 @default.
- W4367011707 hasConceptScore W4367011707C45942800 @default.
- W4367011707 hasConceptScore W4367011707C46686674 @default.
- W4367011707 hasConceptScore W4367011707C50644808 @default.
- W4367011707 hasConceptScore W4367011707C60908668 @default.
- W4367011707 hasConceptScore W4367011707C6438553 @default.
- W4367011707 hasLocation W43670117071 @default.
- W4367011707 hasOpenAccess W4367011707 @default.
- W4367011707 hasPrimaryLocation W43670117071 @default.
- W4367011707 hasRelatedWork W2508565453 @default.
- W4367011707 hasRelatedWork W2750664433 @default.
- W4367011707 hasRelatedWork W2979979539 @default.
- W4367011707 hasRelatedWork W3209934268 @default.
- W4367011707 hasRelatedWork W4206256357 @default.
- W4367011707 hasRelatedWork W4238962735 @default.
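The listing above is the result of the quad pattern { <https://semopenalex.org/work/W4367011707> ?p ?o ?g. } over the SemOpenAlex knowledge graph. As a minimal sketch, the same triples can be retrieved programmatically; the endpoint URL below (https://semopenalex.org/sparql) and the GRAPH reading of the ?g variable are assumptions, not something stated in the listing.

```python
# Minimal sketch: fetch all predicate/object pairs for the work, assuming
# the public SemOpenAlex SPARQL endpoint (URL is an assumption; adjust if needed).
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint

query = """
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W4367011707> ?p ?o .
  }
}
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(query)
sparql.setReturnFormat(JSON)
results = sparql.query().convert()

# Print each predicate/object pair, mirroring the triples listed above.
for row in results["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```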
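The abstract above names MobileNet as the facial-expression branch trained on FER2013. Below is a minimal sketch of such a classifier, assuming a Keras MobileNet backbone; the input size, head layers, and hyperparameters are illustrative assumptions, not the authors' configuration. FER2013 images are 48x48 grayscale, so they would need to be resized and stacked to three channels before reaching the ImageNet-pretrained backbone.

```python
# Minimal sketch of a MobileNet-based facial-expression classifier (assumed setup).
import tensorflow as tf
from tensorflow.keras import layers, models

# ImageNet-pretrained MobileNet backbone with global average pooling.
base = tf.keras.applications.MobileNet(
    input_shape=(96, 96, 3), include_top=False, weights="imagenet", pooling="avg"
)
base.trainable = False  # freeze the backbone; fine-tune later if desired

model = models.Sequential([
    base,
    layers.Dense(128, activation="relu"),
    layers.Dense(7, activation="softmax"),  # FER2013 has 7 emotion classes
])
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
```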
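The abstract also describes an ensemble of random forest, extreme gradient boosting, and a multi-layer perceptron. One plausible way to combine them, sketched below with scikit-learn and xgboost, is soft voting over predicted class probabilities; the synthetic stand-in features and all hyperparameters are assumptions, and the 70.67% figure in the abstract is not reproduced here.

```python
# Minimal sketch of a soft-voting ensemble (RF + XGBoost + MLP), assumed setup.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from xgboost import XGBClassifier

# Stand-in for fused multimodal features (e.g. text + speech embeddings).
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 64))
y = rng.integers(0, 4, size=1000)  # e.g. 4 emotion classes

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

ensemble = VotingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
        ("xgb", XGBClassifier(n_estimators=200, eval_metric="mlogloss")),
        ("mlp", MLPClassifier(hidden_layer_sizes=(128,), max_iter=500)),
    ],
    voting="soft",  # average predicted class probabilities across models
)
ensemble.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, ensemble.predict(X_test)))
```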