Matches in SemOpenAlex for { <https://semopenalex.org/work/W4386703014> ?p ?o ?g. } (runnable sketches of this retrieval query and of the paper's decision-tree classification follow the listing)
- W4386703014 endingPage "e0290564" @default.
- W4386703014 startingPage "e0290564" @default.
- W4386703014 abstract "Emotion recognition is key to interpersonal communication and to human-machine interaction. Body expression may contribute to emotion recognition, but most past studies focused on a few motions, limiting accurate recognition. Moreover, emotions in most previous research were acted out, resulting in non-natural motion, which is inapplicable in reality. We present an approach for emotion recognition based on body motion in naturalistic settings, examining authentic emotions, natural movement, and a broad collection of motion parameters. A lab experiment with 24 participants manipulated emotions using pretested movies across five conditions: happiness, relaxation, fear, sadness, and emotionally-neutral. Emotion was manipulated within subjects, with fillers in between and a counterbalanced order. A motion capture system measured posture and motion during standing and walking; a force plate measured center of pressure location. Traditional statistics revealed nonsignificant effects of emotions on most motion parameters; only 7 of 229 parameters demonstrated significant effects. Most significant effects were in parameters representing postural control during standing, which is consistent with past studies. Yet, the few significant effects suggest that it is impossible to recognize emotions based on a single motion parameter. We therefore developed machine learning models to classify emotions using a collection of parameters, and examined six models: k-nearest neighbors, decision tree, logistic regression, and support vector machines with radial basis function, linear, and polynomial kernels. The decision tree using 25 parameters provided the highest average accuracy (45.8%), more than twice chance level for five conditions; this advances past studies that demonstrated comparable accuracies in acted rather than naturalistic settings. This research suggests that machine learning models are valuable for emotion recognition in reality and lays the foundation for further progress in emotion recognition models, informing the development of recognition devices (e.g., depth cameras) for use in home-setting human-machine interaction." @default.
- W4386703014 created "2023-09-14" @default.
- W4386703014 creator A5014859786 @default.
- W4386703014 creator A5022357481 @default.
- W4386703014 creator A5022664326 @default.
- W4386703014 creator A5058140164 @default.
- W4386703014 date "2023-09-13" @default.
- W4386703014 modified "2023-09-30" @default.
- W4386703014 title "Emotion and motion: Toward emotion recognition based on standing and walking" @default.
- W4386703014 cites W1536165553 @default.
- W4386703014 cites W1965306520 @default.
- W4386703014 cites W1969850025 @default.
- W4386703014 cites W1977841100 @default.
- W4386703014 cites W1984867139 @default.
- W4386703014 cites W1989486988 @default.
- W4386703014 cites W1996603663 @default.
- W4386703014 cites W2003556750 @default.
- W4386703014 cites W2015712584 @default.
- W4386703014 cites W2022625505 @default.
- W4386703014 cites W2022686119 @default.
- W4386703014 cites W2023397245 @default.
- W4386703014 cites W2024826018 @default.
- W4386703014 cites W2031174057 @default.
- W4386703014 cites W2051991137 @default.
- W4386703014 cites W2058518679 @default.
- W4386703014 cites W2060969870 @default.
- W4386703014 cites W2066725723 @default.
- W4386703014 cites W2069002413 @default.
- W4386703014 cites W2070560544 @default.
- W4386703014 cites W2078951341 @default.
- W4386703014 cites W2082981966 @default.
- W4386703014 cites W2087347434 @default.
- W4386703014 cites W2088421521 @default.
- W4386703014 cites W2088661631 @default.
- W4386703014 cites W2093049403 @default.
- W4386703014 cites W2102653827 @default.
- W4386703014 cites W2105496692 @default.
- W4386703014 cites W2107114452 @default.
- W4386703014 cites W2107256573 @default.
- W4386703014 cites W2110667507 @default.
- W4386703014 cites W2112295121 @default.
- W4386703014 cites W2112937514 @default.
- W4386703014 cites W2120146197 @default.
- W4386703014 cites W2122111042 @default.
- W4386703014 cites W2122210511 @default.
- W4386703014 cites W2130684172 @default.
- W4386703014 cites W2131123711 @default.
- W4386703014 cites W2132343319 @default.
- W4386703014 cites W2136119880 @default.
- W4386703014 cites W2138867073 @default.
- W4386703014 cites W2139981301 @default.
- W4386703014 cites W2141114152 @default.
- W4386703014 cites W2148905283 @default.
- W4386703014 cites W2149628368 @default.
- W4386703014 cites W2150433388 @default.
- W4386703014 cites W2164144795 @default.
- W4386703014 cites W2168031754 @default.
- W4386703014 cites W2168958809 @default.
- W4386703014 cites W2189399934 @default.
- W4386703014 cites W2201722321 @default.
- W4386703014 cites W2218725013 @default.
- W4386703014 cites W2253209761 @default.
- W4386703014 cites W2282131551 @default.
- W4386703014 cites W2292595953 @default.
- W4386703014 cites W2482658007 @default.
- W4386703014 cites W2544846410 @default.
- W4386703014 cites W2611932403 @default.
- W4386703014 cites W2686305746 @default.
- W4386703014 cites W2766038777 @default.
- W4386703014 cites W2783460438 @default.
- W4386703014 cites W2878990729 @default.
- W4386703014 cites W2891489434 @default.
- W4386703014 cites W2891965441 @default.
- W4386703014 cites W2913485918 @default.
- W4386703014 cites W293599430 @default.
- W4386703014 cites W2946775517 @default.
- W4386703014 cites W3085973397 @default.
- W4386703014 cites W3092746285 @default.
- W4386703014 cites W3179934335 @default.
- W4386703014 cites W3180095173 @default.
- W4386703014 cites W3188517539 @default.
- W4386703014 cites W4210993929 @default.
- W4386703014 cites W4236137412 @default.
- W4386703014 cites W4241355266 @default.
- W4386703014 cites W4256561644 @default.
- W4386703014 cites W4292994367 @default.
- W4386703014 doi "https://doi.org/10.1371/journal.pone.0290564" @default.
- W4386703014 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/37703239" @default.
- W4386703014 hasPublicationYear "2023" @default.
- W4386703014 type Work @default.
- W4386703014 citedByCount "0" @default.
- W4386703014 crossrefType "journal-article" @default.
- W4386703014 hasAuthorship W4386703014A5014859786 @default.
- W4386703014 hasAuthorship W4386703014A5022357481 @default.
- W4386703014 hasAuthorship W4386703014A5022664326 @default.
- W4386703014 hasAuthorship W4386703014A5058140164 @default.
- W4386703014 hasBestOaLocation W43867030141 @default.
- W4386703014 hasConcept C104114177 @default.
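The listing above can be reproduced programmatically. Below is a minimal retrieval sketch assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql and the SPARQLWrapper package; the quad pattern { ?p ?o ?g } from the header is expressed with a GRAPH clause. The endpoint URL is an assumption based on SemOpenAlex's documented service, not something stated in this listing.

```python
# A minimal sketch, assuming the SemOpenAlex SPARQL endpoint and SPARQLWrapper.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL

# The header's quad pattern { ?p ?o ?g } rendered as a GRAPH query.
query = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W4386703014> ?p ?o .
  }
}
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(query)
sparql.setReturnFormat(JSON)
results = sparql.query().convert()

# Print each match in the same "predicate object" shape as the listing above.
for row in results["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```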
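The abstract reports that a decision tree over 25 motion parameters classified the five emotion conditions with 45.8% mean accuracy, more than twice the 20% chance level. The sketch below shows that kind of pipeline with scikit-learn; the synthetic feature matrix, labels, and cross-validation setup are illustrative assumptions, not the authors' actual data or hyperparameters.

```python
# Illustrative sketch only: synthetic data stands in for the paper's 25 motion
# parameters; the model choice mirrors the abstract, the numbers do not.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(42)
n_trials, n_params = 120, 25                 # 25 parameters, per the abstract
X = rng.normal(size=(n_trials, n_params))    # stand-in motion features
y = rng.integers(0, 5, size=n_trials)        # 5 conditions: happiness,
                                             # relaxation, fear, sadness, neutral

clf = DecisionTreeClassifier(random_state=42)
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean accuracy: {scores.mean():.3f}")  # ~0.20 chance on random data;
                                              # the paper reports 0.458 on real data
```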