Matches in SemOpenAlex for { <https://semopenalex.org/work/W3133793003> ?p ?o ?g. }
- W3133793003 endingPage "1888" @default.
- W3133793003 startingPage "1888" @default.
- W3133793003 abstract "Many speech emotion recognition systems have been designed using different features and classification methods. Still, there is a lack of knowledge and reasoning about the underlying speech characteristics and processing, i.e., how basic characteristics, methods, and settings affect the accuracy, and to what extent. This study extends the physical perspective on speech emotion recognition by analyzing basic speech characteristics and modeling methods, e.g., time characteristics (segmentation, window types, and classification region lengths and overlaps), frequency ranges, frequency scales, processing of whole-speech (spectrograms), vocal tract (filter banks, linear prediction coefficient (LPC) modeling), and excitation (inverse LPC filtering) signals, magnitude and phase manipulations, cepstral features, etc. In the evaluation phase, a state-of-the-art classification method and rigorous statistical tests were applied, namely N-fold cross-validation, the paired t-test, and rank and Pearson correlations. The results revealed several settings in a 75% accuracy range (seven emotions). The most successful methods were based on vocal tract features using psychoacoustic filter banks covering the 0–8 kHz frequency range. Spectrograms carrying vocal tract and excitation information also score well. It was found that even basic processing such as pre-emphasis, segmentation, and magnitude modifications can dramatically affect the results. Most findings are robust, exhibiting strong correlations across the tested databases." @default.
- W3133793003 created "2021-03-15" @default.
- W3133793003 creator A5009841250 @default.
- W3133793003 creator A5058393593 @default.
- W3133793003 creator A5072025174 @default.
- W3133793003 creator A5080968685 @default.
- W3133793003 date "2021-03-08" @default.
- W3133793003 modified "2023-09-25" @default.
- W3133793003 title "On the Speech Properties and Feature Extraction Methods in Speech Emotion Recognition" @default.
- W3133793003 cites W1523029986 @default.
- W3133793003 cites W1934410531 @default.
- W3133793003 cites W1972280480 @default.
- W3133793003 cites W2002311796 @default.
- W3133793003 cites W2060198345 @default.
- W3133793003 cites W2074788634 @default.
- W3133793003 cites W2076063813 @default.
- W3133793003 cites W2146334809 @default.
- W3133793003 cites W2165857685 @default.
- W3133793003 cites W2239141610 @default.
- W3133793003 cites W2289778796 @default.
- W3133793003 cites W2498293176 @default.
- W3133793003 cites W2564725949 @default.
- W3133793003 cites W2598207902 @default.
- W3133793003 cites W2602034649 @default.
- W3133793003 cites W2605176093 @default.
- W3133793003 cites W2766756589 @default.
- W3133793003 cites W2790854021 @default.
- W3133793003 cites W2793978228 @default.
- W3133793003 cites W2860968431 @default.
- W3133793003 cites W2883866324 @default.
- W3133793003 cites W2888650348 @default.
- W3133793003 cites W2888879454 @default.
- W3133793003 cites W2889866686 @default.
- W3133793003 cites W2899089627 @default.
- W3133793003 cites W2900550435 @default.
- W3133793003 cites W2946789944 @default.
- W3133793003 cites W2951975883 @default.
- W3133793003 cites W2981559298 @default.
- W3133793003 cites W2982395984 @default.
- W3133793003 cites W2991837863 @default.
- W3133793003 cites W3022013598 @default.
- W3133793003 cites W3084484668 @default.
- W3133793003 cites W3109961563 @default.
- W3133793003 doi "https://doi.org/10.3390/s21051888" @default.
- W3133793003 hasPubMedCentralId "https://www.ncbi.nlm.nih.gov/pmc/articles/7962835" @default.
- W3133793003 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/33800348" @default.
- W3133793003 hasPublicationYear "2021" @default.
- W3133793003 type Work @default.
- W3133793003 sameAs 3133793003 @default.
- W3133793003 citedByCount "11" @default.
- W3133793003 countsByYear W31337930032022 @default.
- W3133793003 countsByYear W31337930032023 @default.
- W3133793003 crossrefType "journal-article" @default.
- W3133793003 hasAuthorship W3133793003A5009841250 @default.
- W3133793003 hasAuthorship W3133793003A5058393593 @default.
- W3133793003 hasAuthorship W3133793003A5072025174 @default.
- W3133793003 hasAuthorship W3133793003A5080968685 @default.
- W3133793003 hasBestOaLocation W31337930031 @default.
- W3133793003 hasConcept C106131492 @default.
- W3133793003 hasConcept C138885662 @default.
- W3133793003 hasConcept C153180895 @default.
- W3133793003 hasConcept C154945302 @default.
- W3133793003 hasConcept C15744967 @default.
- W3133793003 hasConcept C159985019 @default.
- W3133793003 hasConcept C169760540 @default.
- W3133793003 hasConcept C192562407 @default.
- W3133793003 hasConcept C204323151 @default.
- W3133793003 hasConcept C26760741 @default.
- W3133793003 hasConcept C2776401178 @default.
- W3133793003 hasConcept C28490314 @default.
- W3133793003 hasConcept C31972630 @default.
- W3133793003 hasConcept C41008148 @default.
- W3133793003 hasConcept C41895202 @default.
- W3133793003 hasConcept C45273575 @default.
- W3133793003 hasConcept C47401133 @default.
- W3133793003 hasConcept C61328038 @default.
- W3133793003 hasConcept C88485024 @default.
- W3133793003 hasConcept C89600930 @default.
- W3133793003 hasConcept C9940772 @default.
- W3133793003 hasConceptScore W3133793003C106131492 @default.
- W3133793003 hasConceptScore W3133793003C138885662 @default.
- W3133793003 hasConceptScore W3133793003C153180895 @default.
- W3133793003 hasConceptScore W3133793003C154945302 @default.
- W3133793003 hasConceptScore W3133793003C15744967 @default.
- W3133793003 hasConceptScore W3133793003C159985019 @default.
- W3133793003 hasConceptScore W3133793003C169760540 @default.
- W3133793003 hasConceptScore W3133793003C192562407 @default.
- W3133793003 hasConceptScore W3133793003C204323151 @default.
- W3133793003 hasConceptScore W3133793003C26760741 @default.
- W3133793003 hasConceptScore W3133793003C2776401178 @default.
- W3133793003 hasConceptScore W3133793003C28490314 @default.
- W3133793003 hasConceptScore W3133793003C31972630 @default.
- W3133793003 hasConceptScore W3133793003C41008148 @default.
- W3133793003 hasConceptScore W3133793003C41895202 @default.
- W3133793003 hasConceptScore W3133793003C45273575 @default.
- W3133793003 hasConceptScore W3133793003C47401133 @default.
- W3133793003 hasConceptScore W3133793003C61328038 @default.
- W3133793003 hasConceptScore W3133793003C88485024 @default.
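
The listing above is the result of a SPARQL triple-pattern query against SemOpenAlex. A minimal sketch of reproducing it programmatically, assuming the public endpoint at https://semopenalex.org/sparql and the standard SPARQL JSON results format (both are assumptions, not stated in the listing):

```python
# Sketch: query SemOpenAlex for all (predicate, object) pairs of a work.
# Assumptions: the endpoint URL below and the application/sparql-results+json
# response shape; neither is given in the listing above. Stdlib only.
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint

def build_query(work_id: str) -> str:
    """Build the same triple pattern the listing was produced from."""
    return (
        "SELECT ?p ?o WHERE { "
        f"<https://semopenalex.org/work/{work_id}> ?p ?o . "
        "}"
    )

def fetch_triples(work_id: str):
    """POST the query and return (predicate, object) value pairs."""
    data = urllib.parse.urlencode({"query": build_query(work_id)}).encode()
    req = urllib.request.Request(
        ENDPOINT,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        bindings = json.load(resp)["results"]["bindings"]
    return [(b["p"]["value"], b["o"]["value"]) for b in bindings]

if __name__ == "__main__":
    print(build_query("W3133793003"))
```

Calling `fetch_triples("W3133793003")` would return pairs such as the `title`, `doi`, and `cites` statements shown above, with the object values as plain strings.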