Matches in SemOpenAlex for { <https://semopenalex.org/work/W4200162263> ?p ?o ?g. }
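The listing below is the result of matching the triple pattern `{ <https://semopenalex.org/work/W4200162263> ?p ?o ?g. }` against the SemOpenAlex graph. As a minimal sketch of how such matches could be retrieved programmatically (the endpoint URL `https://semopenalex.org/sparql` and JSON results support are assumptions about the service, not confirmed by this dump):

```python
import json
import urllib.parse
import urllib.request

WORK = "https://semopenalex.org/work/W4200162263"
# Assumption: SemOpenAlex exposes a public SPARQL endpoint at this URL.
ENDPOINT = "https://semopenalex.org/sparql"

def build_query(work_iri: str) -> str:
    """SPARQL query matching every predicate/object pair for one work."""
    return f"SELECT ?p ?o WHERE {{ <{work_iri}> ?p ?o . }}"

def flatten_bindings(results: dict) -> list:
    """Flatten a SPARQL 1.1 JSON results document into (predicate, object) pairs."""
    return [
        (row["p"]["value"], row["o"]["value"])
        for row in results["results"]["bindings"]
    ]

def fetch_matches(work_iri: str) -> list:
    """Run the query against the endpoint (requires network access)."""
    url = ENDPOINT + "?" + urllib.parse.urlencode({"query": build_query(work_iri)})
    req = urllib.request.Request(
        url, headers={"Accept": "application/sparql-results+json"}
    )
    with urllib.request.urlopen(req) as resp:
        return flatten_bindings(json.load(resp))
```

`build_query` and `flatten_bindings` are pure functions, so the query construction and result handling can be exercised without touching the network.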
- W4200162263 abstract "Life in modern societies is fast-paced and full of stress-inducing demands. The development of stress monitoring methods is a growing area of research due to the personal and economic advantages that timely detection provides. Studies have shown that speech-based features can be utilised to robustly predict several physiological markers of stress, including emotional state, continuous heart rate, and the stress hormone, cortisol. In this contribution, we extend previous works by the authors, utilising three German language corpora including more than 100 subjects undergoing a Trier Social Stress Test protocol. We present cross-corpus and transfer learning results which explore the efficacy of the speech signal to predict three physiological markers of stress—sequentially measured saliva-based cortisol, continuous heart rate as beats per minute (BPM), and continuous respiration. For this, we extract several features from audio as well as video and apply various machine learning architectures, including a temporal context-based Long Short-Term Memory Recurrent Neural Network (LSTM-RNN). For the task of predicting cortisol levels from speech, deep learning improves on results obtained by conventional support vector regression—yielding a Spearman correlation coefficient (ρ) of 0.770 and 0.698 for cortisol measurements taken 10 and 20 min after the stress period for the two corpora applicable—showing that audio features alone are sufficient for predicting cortisol, with audiovisual fusion to an extent improving such results. We also obtain a Root Mean Square Error (RMSE) of 38 and 22 BPM for continuous heart rate prediction on the two corpora where this information is available, and a normalised RMSE (NRMSE) of 0.120 for respiration prediction (−10: 10). Both of these continuous physiological signals show to be highly effective markers of stress (based on cortisol grouping analysis), both when available as ground truth and when predicted using speech. This contribution opens up new avenues for future exploration of these signals as proxies for stress in naturalistic settings." @default.
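The abstract reports Spearman's ρ for cortisol prediction and (N)RMSE for heart rate and respiration. A minimal, dependency-free sketch of these evaluation metrics (the tie handling and range-based normalisation below are standard conventions, not necessarily the exact choices made by the paper's authors):

```python
import math

def ranks(xs):
    """1-based average ranks; tied values share the mean of their positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of 1-based positions i..j
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(y_true, y_pred):
    """Spearman rho: Pearson correlation computed on the rank vectors."""
    rt, rp = ranks(y_true), ranks(y_pred)
    n = len(rt)
    mt, mp = sum(rt) / n, sum(rp) / n
    cov = sum((a - mt) * (b - mp) for a, b in zip(rt, rp))
    st = math.sqrt(sum((a - mt) ** 2 for a in rt))
    sp = math.sqrt(sum((b - mp) ** 2 for b in rp))
    return cov / (st * sp)

def rmse(y_true, y_pred):
    """Root mean square error, e.g. in BPM for heart rate prediction."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true))

def nrmse(y_true, y_pred):
    """RMSE normalised by the range of the ground-truth signal."""
    return rmse(y_true, y_pred) / (max(y_true) - min(y_true))
```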
- W4200162263 created "2021-12-31" @default.
- W4200162263 creator A5012240826 @default.
- W4200162263 creator A5017889477 @default.
- W4200162263 creator A5023099133 @default.
- W4200162263 creator A5030568071 @default.
- W4200162263 creator A5030891676 @default.
- W4200162263 creator A5034102826 @default.
- W4200162263 creator A5041271073 @default.
- W4200162263 creator A5043060302 @default.
- W4200162263 creator A5045674246 @default.
- W4200162263 creator A5058905510 @default.
- W4200162263 creator A5082988844 @default.
- W4200162263 creator A5086359930 @default.
- W4200162263 date "2021-12-06" @default.
- W4200162263 modified "2023-10-11" @default.
- W4200162263 title "An Evaluation of Speech-Based Recognition of Emotional and Physiological Markers of Stress" @default.
- W4200162263 cites W1566577700 @default.
- W4200162263 cites W179777611 @default.
- W4200162263 cites W1834627138 @default.
- W4200162263 cites W1969798264 @default.
- W4200162263 cites W2011336106 @default.
- W4200162263 cites W2011604423 @default.
- W4200162263 cites W2027615005 @default.
- W4200162263 cites W2030956023 @default.
- W4200162263 cites W2049876602 @default.
- W4200162263 cites W2056962346 @default.
- W4200162263 cites W2091413411 @default.
- W4200162263 cites W2104094955 @default.
- W4200162263 cites W2112400936 @default.
- W4200162263 cites W2112812932 @default.
- W4200162263 cites W2113947213 @default.
- W4200162263 cites W2128521063 @default.
- W4200162263 cites W2144961120 @default.
- W4200162263 cites W2149628368 @default.
- W4200162263 cites W2162134198 @default.
- W4200162263 cites W2171801645 @default.
- W4200162263 cites W2239141610 @default.
- W4200162263 cites W2353776729 @default.
- W4200162263 cites W2486828091 @default.
- W4200162263 cites W2591583354 @default.
- W4200162263 cites W2731836491 @default.
- W4200162263 cites W2746419079 @default.
- W4200162263 cites W2777828547 @default.
- W4200162263 cites W2791010381 @default.
- W4200162263 cites W2885806496 @default.
- W4200162263 cites W2888956240 @default.
- W4200162263 cites W2890929258 @default.
- W4200162263 cites W2918153055 @default.
- W4200162263 cites W2972961215 @default.
- W4200162263 cites W2974802536 @default.
- W4200162263 cites W2980977859 @default.
- W4200162263 cites W2991435809 @default.
- W4200162263 cites W2995892183 @default.
- W4200162263 cites W3005270267 @default.
- W4200162263 cites W3009849862 @default.
- W4200162263 cites W3096310251 @default.
- W4200162263 cites W3099703892 @default.
- W4200162263 cites W3101998545 @default.
- W4200162263 cites W3119335825 @default.
- W4200162263 cites W3120424088 @default.
- W4200162263 cites W3124971900 @default.
- W4200162263 cites W3126247264 @default.
- W4200162263 cites W3154610942 @default.
- W4200162263 cites W3159245381 @default.
- W4200162263 cites W3196215173 @default.
- W4200162263 cites W3205733239 @default.
- W4200162263 cites W3206776536 @default.
- W4200162263 doi "https://doi.org/10.3389/fcomp.2021.750284" @default.
- W4200162263 hasPublicationYear "2021" @default.
- W4200162263 type Work @default.
- W4200162263 citedByCount "11" @default.
- W4200162263 countsByYear W42001622632022 @default.
- W4200162263 countsByYear W42001622632023 @default.
- W4200162263 crossrefType "journal-article" @default.
- W4200162263 hasAuthorship W4200162263A5012240826 @default.
- W4200162263 hasAuthorship W4200162263A5017889477 @default.
- W4200162263 hasAuthorship W4200162263A5023099133 @default.
- W4200162263 hasAuthorship W4200162263A5030568071 @default.
- W4200162263 hasAuthorship W4200162263A5030891676 @default.
- W4200162263 hasAuthorship W4200162263A5034102826 @default.
- W4200162263 hasAuthorship W4200162263A5041271073 @default.
- W4200162263 hasAuthorship W4200162263A5043060302 @default.
- W4200162263 hasAuthorship W4200162263A5045674246 @default.
- W4200162263 hasAuthorship W4200162263A5058905510 @default.
- W4200162263 hasAuthorship W4200162263A5082988844 @default.
- W4200162263 hasAuthorship W4200162263A5086359930 @default.
- W4200162263 hasBestOaLocation W42001622631 @default.
- W4200162263 hasConcept C104317684 @default.
- W4200162263 hasConcept C105795698 @default.
- W4200162263 hasConcept C119857082 @default.
- W4200162263 hasConcept C12267149 @default.
- W4200162263 hasConcept C126838900 @default.
- W4200162263 hasConcept C138885662 @default.
- W4200162263 hasConcept C139945424 @default.
- W4200162263 hasConcept C151730666 @default.
- W4200162263 hasConcept C154945302 @default.
- W4200162263 hasConcept C185592680 @default.
- W4200162263 hasConcept C21036866 @default.
- W4200162263 hasConcept C2777953023 @default.