Matches in SemOpenAlex for { <https://semopenalex.org/work/W2022964551> ?p ?o ?g. }
- W2022964551 endingPage "15581" @default.
- W2022964551 startingPage "15549" @default.
- W2022964551 abstract "In this paper, a multimodal user-emotion detection system for social robots is presented. This system is intended to be used during human–robot interaction, and it is integrated as part of the overall interaction system of the robot: the Robotics Dialog System (RDS). Two modes are used to detect emotions: voice and facial expression analysis. In order to analyze the voice of the user, a new component has been developed: Gender and Emotion Voice Analysis (GEVA), which is written in the ChucK language. For emotion detection in facial expressions, the system Gender and Emotion Facial Analysis (GEFA) has also been developed. This last system integrates two third-party solutions: the Sophisticated High-speed Object Recognition Engine (SHORE) and the Computer Expression Recognition Toolbox (CERT). Once these new components (GEVA and GEFA) give their results, a decision rule is applied in order to combine the information given by both of them. The result of this rule, the detected emotion, is integrated into the dialog system through communicative acts. Hence, each communicative act gives, among other things, the detected emotion of the user to the RDS so it can adapt its strategy in order to achieve a greater degree of satisfaction during the human–robot dialog. Each of the new components, GEVA and GEFA, can also be used individually. Moreover, they are integrated with the robotic control platform ROS (Robot Operating System). Several experiments with real users were performed to determine the accuracy of each component and to set the final decision rule. The results obtained from applying this decision rule in these experiments show a high success rate in automatic user emotion recognition, improving on the results given by the two information channels (audio and visual) separately." @default.
- W2022964551 created "2016-06-24" @default.
- W2022964551 creator A5028526367 @default.
- W2022964551 creator A5045724042 @default.
- W2022964551 creator A5055889709 @default.
- W2022964551 creator A5060557478 @default.
- W2022964551 creator A5077065409 @default.
- W2022964551 date "2013-11-14" @default.
- W2022964551 modified "2023-10-10" @default.
- W2022964551 title "A Multimodal Emotion Detection System during Human–Robot Interaction" @default.
- W2022964551 cites W2003238582 @default.
- W2022964551 cites W2007450092 @default.
- W2022964551 cites W2012614016 @default.
- W2022964551 cites W2015463594 @default.
- W2022964551 cites W2018658329 @default.
- W2022964551 cites W2032254851 @default.
- W2022964551 cites W2038199522 @default.
- W2022964551 cites W2082026714 @default.
- W2022964551 cites W2085709640 @default.
- W2022964551 cites W2091425152 @default.
- W2022964551 cites W2096076356 @default.
- W2022964551 cites W2108445559 @default.
- W2022964551 cites W2137647199 @default.
- W2022964551 cites W2141660036 @default.
- W2022964551 cites W2145310492 @default.
- W2022964551 cites W2156503193 @default.
- W2022964551 cites W2159017231 @default.
- W2022964551 cites W2161233243 @default.
- W2022964551 cites W2167557160 @default.
- W2022964551 cites W3097096317 @default.
- W2022964551 cites W4249999851 @default.
- W2022964551 cites W4255371740 @default.
- W2022964551 cites W74190392 @default.
- W2022964551 doi "https://doi.org/10.3390/s131115549" @default.
- W2022964551 hasPubMedCentralId "https://www.ncbi.nlm.nih.gov/pmc/articles/3871074" @default.
- W2022964551 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/24240598" @default.
- W2022964551 hasPublicationYear "2013" @default.
- W2022964551 type Work @default.
- W2022964551 sameAs 2022964551 @default.
- W2022964551 citedByCount "74" @default.
- W2022964551 countsByYear W20229645512014 @default.
- W2022964551 countsByYear W20229645512015 @default.
- W2022964551 countsByYear W20229645512016 @default.
- W2022964551 countsByYear W20229645512017 @default.
- W2022964551 countsByYear W20229645512018 @default.
- W2022964551 countsByYear W20229645512019 @default.
- W2022964551 countsByYear W20229645512020 @default.
- W2022964551 countsByYear W20229645512021 @default.
- W2022964551 countsByYear W20229645512022 @default.
- W2022964551 countsByYear W20229645512023 @default.
- W2022964551 crossrefType "journal-article" @default.
- W2022964551 hasAuthorship W2022964551A5028526367 @default.
- W2022964551 hasAuthorship W2022964551A5045724042 @default.
- W2022964551 hasAuthorship W2022964551A5055889709 @default.
- W2022964551 hasAuthorship W2022964551A5060557478 @default.
- W2022964551 hasAuthorship W2022964551A5077065409 @default.
- W2022964551 hasBestOaLocation W20229645511 @default.
- W2022964551 hasConcept C107457646 @default.
- W2022964551 hasConcept C121332964 @default.
- W2022964551 hasConcept C136764020 @default.
- W2022964551 hasConcept C145460709 @default.
- W2022964551 hasConcept C154945302 @default.
- W2022964551 hasConcept C162947575 @default.
- W2022964551 hasConcept C168167062 @default.
- W2022964551 hasConcept C173853756 @default.
- W2022964551 hasConcept C190954187 @default.
- W2022964551 hasConcept C195704467 @default.
- W2022964551 hasConcept C199360897 @default.
- W2022964551 hasConcept C19966478 @default.
- W2022964551 hasConcept C2777655017 @default.
- W2022964551 hasConcept C40346341 @default.
- W2022964551 hasConcept C41008148 @default.
- W2022964551 hasConcept C65401140 @default.
- W2022964551 hasConcept C90509273 @default.
- W2022964551 hasConcept C97355855 @default.
- W2022964551 hasConceptScore W2022964551C107457646 @default.
- W2022964551 hasConceptScore W2022964551C121332964 @default.
- W2022964551 hasConceptScore W2022964551C136764020 @default.
- W2022964551 hasConceptScore W2022964551C145460709 @default.
- W2022964551 hasConceptScore W2022964551C154945302 @default.
- W2022964551 hasConceptScore W2022964551C162947575 @default.
- W2022964551 hasConceptScore W2022964551C168167062 @default.
- W2022964551 hasConceptScore W2022964551C173853756 @default.
- W2022964551 hasConceptScore W2022964551C190954187 @default.
- W2022964551 hasConceptScore W2022964551C195704467 @default.
- W2022964551 hasConceptScore W2022964551C199360897 @default.
- W2022964551 hasConceptScore W2022964551C19966478 @default.
- W2022964551 hasConceptScore W2022964551C2777655017 @default.
- W2022964551 hasConceptScore W2022964551C40346341 @default.
- W2022964551 hasConceptScore W2022964551C41008148 @default.
- W2022964551 hasConceptScore W2022964551C65401140 @default.
- W2022964551 hasConceptScore W2022964551C90509273 @default.
- W2022964551 hasConceptScore W2022964551C97355855 @default.
- W2022964551 hasIssue "11" @default.
- W2022964551 hasLocation W20229645511 @default.
- W2022964551 hasLocation W20229645512 @default.
- W2022964551 hasLocation W20229645513 @default.
- W2022964551 hasLocation W20229645514 @default.
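The listing above corresponds to the quad pattern in the header, { <https://semopenalex.org/work/W2022964551> ?p ?o ?g }. As a minimal sketch, the same matches could be retrieved programmatically; this assumes SemOpenAlex exposes a public SPARQL endpoint at https://semopenalex.org/sparql (an assumption, not stated in the listing itself):

```python
# Sketch: query SemOpenAlex for all quads about one work.
# ENDPOINT is an assumed URL; adjust to the actual SPARQL service.
import urllib.parse
import urllib.request

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint

def build_query(work_iri: str) -> str:
    """Build the quad-pattern query from the header: { <work> ?p ?o ?g }."""
    return (
        "SELECT ?p ?o ?g WHERE { "
        f"GRAPH ?g {{ <{work_iri}> ?p ?o }} }}"
    )

def fetch_results(work_iri: str) -> bytes:
    """Send the query over HTTP and return raw SPARQL JSON results."""
    params = urllib.parse.urlencode({
        "query": build_query(work_iri),
        "format": "application/sparql-results+json",
    })
    with urllib.request.urlopen(ENDPOINT + "?" + params) as resp:
        return resp.read()

work = "https://semopenalex.org/work/W2022964551"
print(build_query(work))
```

Each result row maps onto one line of the listing: ?p is the predicate (e.g. `cites`, `hasConcept`), ?o the object, and ?g the named graph (shown here as "default").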