Matches in SemOpenAlex for { <https://semopenalex.org/work/W2182474973> ?p ?o ?g. }
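The quad pattern above can be posed against SemOpenAlex's public SPARQL endpoint. A minimal sketch, assuming the endpoint lives at `https://semopenalex.org/sparql` and interpreting the fourth variable `?g` as the named graph (SPARQL has no four-element triple pattern, so the `?g` is expressed with a `GRAPH` clause):

```python
# Sketch: retrieving the triple listing below via SPARQL. The endpoint URL
# is an assumption based on SemOpenAlex's public service; the query mirrors
# the pattern { <work> ?p ?o ?g } using a GRAPH clause for ?g.
import json
import urllib.parse
import urllib.request

WORK_IRI = "https://semopenalex.org/work/W2182474973"
ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint

def build_query(work_iri: str) -> str:
    """Return a SPARQL SELECT enumerating every (p, o, g) for the work."""
    return (
        "SELECT ?p ?o ?g WHERE { "
        f"GRAPH ?g {{ <{work_iri}> ?p ?o . }} "
        "}"
    )

def fetch_matches(work_iri: str, endpoint: str = ENDPOINT) -> dict:
    """POST the query and parse the standard SPARQL JSON results format."""
    data = urllib.parse.urlencode({"query": build_query(work_iri)}).encode()
    req = urllib.request.Request(
        endpoint,
        data=data,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(build_query(WORK_IRI))
```

The `fetch_matches` helper uses the SPARQL 1.1 Protocol's form-encoded POST; a GET with a `query` parameter would work equally well for a query this small.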
- W2182474973 abstract "This work attempts to extract a viewer’s emotion from the three modalities of a movie: audio, visual and text. A major obstacle for emotion research has been the lack of appropriately annotated databases, limiting the potential of supervised algorithms. To that end we develop and present a database of movie affect, annotated in continuous time, on a continuous valence-arousal scale. Supervised learning methods are proposed to model the continuous affective response using hidden Markov Models and low-level audio-visual features and classify each video frame into one of seven discrete categories (in each dimension); the discrete-valued curves are then converted to continuous values via spline interpolation. A variety of audio-visual features are investigated and an optimal feature set is selected. The potential of the method is verified on twelve 30-minute movie clips with good precision at a macroscopic level. This method proves unsuitable for processing subtitle information, so we explore the creation of a textual affective model, starting with a fully automated algorithm for expanding an affective lexicon with new entries. Continuous valence ratings are estimated for unseen words under the assumption that semantic similarity implies affective similarity. Starting from a set of manually annotated words, a linear model is trained using the least mean squares algorithm. The semantic similarity between the selected features and the unseen words is computed with various similarity metrics, and used to compute the valence of unseen words. The proposed algorithm performs very well on reproducing the valence ratings of the Affective Norms for English Words (ANEW) and General Inquirer datasets. We then use three simple fusion schemes to combine lexical valence scores into sentence-level scores, producing state-of-the-art results on the sentence rating task of the SemEval 2007 corpus." @default.
- W2182474973 created "2016-06-24" @default.
- W2182474973 creator A5012722544 @default.
- W2182474973 date "2012-01-01" @default.
- W2182474973 modified "2023-09-24" @default.
- W2182474973 title "Affect extraction using aural, visual and linguistic features from multimedia documents" @default.
- W2182474973 cites W115645329 @default.
- W2182474973 cites W1261896931 @default.
- W2182474973 cites W13254153 @default.
- W2182474973 cites W1514944708 @default.
- W2182474973 cites W1524684328 @default.
- W2182474973 cites W154736990 @default.
- W2182474973 cites W1556327471 @default.
- W2182474973 cites W1569087642 @default.
- W2182474973 cites W1578007978 @default.
- W2182474973 cites W1965600549 @default.
- W2182474973 cites W1969769481 @default.
- W2182474973 cites W2011664673 @default.
- W2182474973 cites W2017337590 @default.
- W2182474973 cites W2019374767 @default.
- W2182474973 cites W2030996999 @default.
- W2182474973 cites W2038712753 @default.
- W2182474973 cites W2061644206 @default.
- W2182474973 cites W2066551267 @default.
- W2182474973 cites W2067544007 @default.
- W2182474973 cites W2074875579 @default.
- W2182474973 cites W2080100102 @default.
- W2182474973 cites W2084046180 @default.
- W2182474973 cites W2099104810 @default.
- W2182474973 cites W2102134623 @default.
- W2182474973 cites W2102953093 @default.
- W2182474973 cites W2105468141 @default.
- W2182474973 cites W2105552115 @default.
- W2182474973 cites W2109606373 @default.
- W2182474973 cites W2121407761 @default.
- W2182474973 cites W2124737236 @default.
- W2182474973 cites W2134422453 @default.
- W2182474973 cites W2136930489 @default.
- W2182474973 cites W2137093606 @default.
- W2182474973 cites W2137140231 @default.
- W2182474973 cites W2137639365 @default.
- W2182474973 cites W2139990098 @default.
- W2182474973 cites W2141790691 @default.
- W2182474973 cites W2144363854 @default.
- W2182474973 cites W2149393279 @default.
- W2182474973 cites W2156709807 @default.
- W2182474973 cites W2160408828 @default.
- W2182474973 cites W2160660844 @default.
- W2182474973 cites W2161233243 @default.
- W2182474973 cites W2162095731 @default.
- W2182474973 cites W2165897980 @default.
- W2182474973 cites W2167854178 @default.
- W2182474973 cites W2168625136 @default.
- W2182474973 cites W2189553161 @default.
- W2182474973 cites W2190850524 @default.
- W2182474973 cites W2199803028 @default.
- W2182474973 cites W2234490075 @default.
- W2182474973 cites W2252754650 @default.
- W2182474973 cites W2295279924 @default.
- W2182474973 cites W2404480901 @default.
- W2182474973 cites W2473046094 @default.
- W2182474973 cites W2759687317 @default.
- W2182474973 cites W3133994440 @default.
- W2182474973 cites W38739846 @default.
- W2182474973 cites W2151543699 @default.
- W2182474973 hasPublicationYear "2012" @default.
- W2182474973 type Work @default.
- W2182474973 sameAs 2182474973 @default.
- W2182474973 citedByCount "0" @default.
- W2182474973 crossrefType "journal-article" @default.
- W2182474973 hasAuthorship W2182474973A5012722544 @default.
- W2182474973 hasConcept C119857082 @default.
- W2182474973 hasConcept C121332964 @default.
- W2182474973 hasConcept C130318100 @default.
- W2182474973 hasConcept C153180895 @default.
- W2182474973 hasConcept C154945302 @default.
- W2182474973 hasConcept C168900304 @default.
- W2182474973 hasConcept C204321447 @default.
- W2182474973 hasConcept C28490314 @default.
- W2182474973 hasConcept C41008148 @default.
- W2182474973 hasConcept C62520636 @default.
- W2182474973 hasConceptScore W2182474973C119857082 @default.
- W2182474973 hasConceptScore W2182474973C121332964 @default.
- W2182474973 hasConceptScore W2182474973C130318100 @default.
- W2182474973 hasConceptScore W2182474973C153180895 @default.
- W2182474973 hasConceptScore W2182474973C154945302 @default.
- W2182474973 hasConceptScore W2182474973C168900304 @default.
- W2182474973 hasConceptScore W2182474973C204321447 @default.
- W2182474973 hasConceptScore W2182474973C28490314 @default.
- W2182474973 hasConceptScore W2182474973C41008148 @default.
- W2182474973 hasConceptScore W2182474973C62520636 @default.
- W2182474973 hasLocation W21824749731 @default.
- W2182474973 hasOpenAccess W2182474973 @default.
- W2182474973 hasPrimaryLocation W21824749731 @default.
- W2182474973 hasRelatedWork W2090708902 @default.
- W2182474973 hasRelatedWork W2109118612 @default.
- W2182474973 hasRelatedWork W2133399553 @default.
- W2182474973 hasRelatedWork W2143219999 @default.
- W2182474973 hasRelatedWork W2241118144 @default.
- W2182474973 hasRelatedWork W2252754650 @default.
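The abstract above describes estimating a word's valence as a linear combination of its semantic similarities to a set of seed words, with weights fit by least mean squares (LMS). A toy sketch of that idea follows; the seed words, similarity values, and valence ratings are invented illustrative numbers, and the work's actual feature selection and similarity metrics are not reproduced here:

```python
# Toy sketch of lexicon expansion: an unseen word's valence is a weighted
# sum of its similarities to seed words; weights are fit with the LMS
# (Widrow-Hoff) rule. All data below is synthetic, for illustration only.

def lms_fit(features, targets, lr=0.05, epochs=2000):
    """Fit weights w so that dot(w, x) ~ y via per-sample LMS updates."""
    w = [0.0] * len(features[0])
    for _ in range(epochs):
        for x, y in zip(features, targets):
            err = y - sum(wi * xi for wi, xi in zip(w, x))
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
    return w

def predict(w, x):
    """Valence estimate: weighted sum of similarities to the seed words."""
    return sum(wi * xi for wi, xi in zip(w, x))

# Hypothetical seed words and a hidden linear rule that generates the
# training valences, so convergence of the fit can be checked directly.
SEEDS = ["good", "bad", "calm"]
TRUE_W = [1.0, -1.0, 0.2]
SIMS = [
    [0.9, 0.1, 0.3],  # similarities of training word 1 to the seeds
    [0.2, 0.8, 0.1],
    [0.5, 0.5, 0.9],
    [0.7, 0.2, 0.6],
]
VALENCE = [predict(TRUE_W, s) for s in SIMS]

weights = lms_fit(SIMS, VALENCE)
```

After fitting, `predict(weights, sims_of_unseen_word)` gives the continuous valence estimate; the sentence-level fusion schemes the abstract mentions would then combine such per-word scores.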