Matches in SemOpenAlex for { <https://semopenalex.org/work/W4304080215> ?p ?o ?g. }
- W4304080215 endingPage "79" @default.
- W4304080215 startingPage "64" @default.
- W4304080215 abstract "Studies in affective audio–visual correspondence learning require ground-truth data to train, validate, and test models. The number of available datasets together with benchmarks, however, is still limited. In this paper, we create a collection of three datasets (called EmoMV) for affective correspondence learning between music and video modalities. The first two datasets (called EmoMV-A, and EmoMV-B, respectively) are constructed by making use of music video segments from other available datasets. The third one called EmoMV-C is created from music videos that we self-collected from YouTube. The music-video pairs in our datasets are annotated as matched or mismatched in terms of the emotions they are conveying. The emotions are annotated by humans in the EmoMV-A dataset, while in the EmoMV-B and EmoMV-C datasets they are predicted using a pretrained deep neural network. A user study is carried out to evaluate the accuracy of the “matched” and “mismatched” labels offered in the EmoMV dataset collection. In addition to creating three new datasets, a benchmark deep neural network model for binary affective music-video correspondence classification is also proposed. This proposed benchmark model is then modified to adapt to affective music-video retrieval. Extensive experiments are carried out on all three datasets of the EmoMV collection. Experimental results demonstrate that our proposed model outperforms state-of-the-art approaches on both the binary classification and retrieval tasks. We envision that our newly created dataset collection together with the proposed benchmark models will facilitate advances in affective computing research." @default.
- W4304080215 created "2022-10-10" @default.
- W4304080215 creator A5025034643 @default.
- W4304080215 creator A5069548004 @default.
- W4304080215 creator A5084445353 @default.
- W4304080215 date "2023-03-01" @default.
- W4304080215 modified "2023-10-18" @default.
- W4304080215 title "EmoMV: Affective music-video correspondence learning datasets for classification and retrieval" @default.
- W4304080215 cites W1994477474 @default.
- W4304080215 cites W2002055708 @default.
- W4304080215 cites W2021502766 @default.
- W4304080215 cites W2044807399 @default.
- W4304080215 cites W2062930223 @default.
- W4304080215 cites W2072691998 @default.
- W4304080215 cites W2084793887 @default.
- W4304080215 cites W2114025269 @default.
- W4304080215 cites W2127023913 @default.
- W4304080215 cites W2127236700 @default.
- W4304080215 cites W2137219016 @default.
- W4304080215 cites W2143197238 @default.
- W4304080215 cites W2147863532 @default.
- W4304080215 cites W2149628368 @default.
- W4304080215 cites W2164480306 @default.
- W4304080215 cites W2593116425 @default.
- W4304080215 cites W2738453428 @default.
- W4304080215 cites W2742409927 @default.
- W4304080215 cites W2752234108 @default.
- W4304080215 cites W2790495655 @default.
- W4304080215 cites W2883430806 @default.
- W4304080215 cites W2951975883 @default.
- W4304080215 cites W2962711930 @default.
- W4304080215 cites W2973909019 @default.
- W4304080215 cites W2990604978 @default.
- W4304080215 cites W3003908700 @default.
- W4304080215 cites W3041053424 @default.
- W4304080215 cites W3141144320 @default.
- W4304080215 cites W3154807520 @default.
- W4304080215 cites W3185593275 @default.
- W4304080215 cites W3202295776 @default.
- W4304080215 cites W3209710747 @default.
- W4304080215 cites W4200095085 @default.
- W4304080215 cites W4294877277 @default.
- W4304080215 doi "https://doi.org/10.1016/j.inffus.2022.10.002" @default.
- W4304080215 hasPublicationYear "2023" @default.
- W4304080215 type Work @default.
- W4304080215 citedByCount "1" @default.
- W4304080215 countsByYear W43040802152022 @default.
- W4304080215 crossrefType "journal-article" @default.
- W4304080215 hasAuthorship W4304080215A5025034643 @default.
- W4304080215 hasAuthorship W4304080215A5069548004 @default.
- W4304080215 hasAuthorship W4304080215A5084445353 @default.
- W4304080215 hasConcept C108583219 @default.
- W4304080215 hasConcept C119857082 @default.
- W4304080215 hasConcept C12267149 @default.
- W4304080215 hasConcept C13280743 @default.
- W4304080215 hasConcept C144024400 @default.
- W4304080215 hasConcept C144133560 @default.
- W4304080215 hasConcept C146849305 @default.
- W4304080215 hasConcept C153180895 @default.
- W4304080215 hasConcept C154945302 @default.
- W4304080215 hasConcept C162853370 @default.
- W4304080215 hasConcept C185798385 @default.
- W4304080215 hasConcept C205649164 @default.
- W4304080215 hasConcept C2779903281 @default.
- W4304080215 hasConcept C33923547 @default.
- W4304080215 hasConcept C36289849 @default.
- W4304080215 hasConcept C41008148 @default.
- W4304080215 hasConcept C48372109 @default.
- W4304080215 hasConcept C50644808 @default.
- W4304080215 hasConcept C66905080 @default.
- W4304080215 hasConcept C86251818 @default.
- W4304080215 hasConcept C94375191 @default.
- W4304080215 hasConceptScore W4304080215C108583219 @default.
- W4304080215 hasConceptScore W4304080215C119857082 @default.
- W4304080215 hasConceptScore W4304080215C12267149 @default.
- W4304080215 hasConceptScore W4304080215C13280743 @default.
- W4304080215 hasConceptScore W4304080215C144024400 @default.
- W4304080215 hasConceptScore W4304080215C144133560 @default.
- W4304080215 hasConceptScore W4304080215C146849305 @default.
- W4304080215 hasConceptScore W4304080215C153180895 @default.
- W4304080215 hasConceptScore W4304080215C154945302 @default.
- W4304080215 hasConceptScore W4304080215C162853370 @default.
- W4304080215 hasConceptScore W4304080215C185798385 @default.
- W4304080215 hasConceptScore W4304080215C205649164 @default.
- W4304080215 hasConceptScore W4304080215C2779903281 @default.
- W4304080215 hasConceptScore W4304080215C33923547 @default.
- W4304080215 hasConceptScore W4304080215C36289849 @default.
- W4304080215 hasConceptScore W4304080215C41008148 @default.
- W4304080215 hasConceptScore W4304080215C48372109 @default.
- W4304080215 hasConceptScore W4304080215C50644808 @default.
- W4304080215 hasConceptScore W4304080215C66905080 @default.
- W4304080215 hasConceptScore W4304080215C86251818 @default.
- W4304080215 hasConceptScore W4304080215C94375191 @default.
- W4304080215 hasFunder F4320320751 @default.
- W4304080215 hasLocation W43040802151 @default.
- W4304080215 hasOpenAccess W4304080215 @default.
- W4304080215 hasPrimaryLocation W43040802151 @default.
- W4304080215 hasRelatedWork W1490753184 @default.
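
The listing above corresponds to the triple pattern `{ <https://semopenalex.org/work/W4304080215> ?p ?o ?g. }` shown in the header. As a minimal sketch of how such a lookup could be reproduced, the snippet below builds a SPARQL SELECT query for this work's predicate/object pairs and the corresponding HTTP GET request URL; the endpoint URL and the `format=json` parameter are assumptions about the public SemOpenAlex SPARQL service, so adapt them to the actual deployment.

```python
# Sketch: fetch every predicate/object pair for one work from a
# SPARQL endpoint (SPARQL 1.1 Protocol, query via HTTP GET).
from urllib.parse import urlencode

WORK = "https://semopenalex.org/work/W4304080215"
ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint

# Same shape as the header pattern, restricted to the default graph.
query = f"""
SELECT ?p ?o WHERE {{
  <{WORK}> ?p ?o .
}}
"""

# Build the request URL; the actual network call (e.g. via
# urllib.request or the SPARQLWrapper library) is left to the reader.
request_url = f"{ENDPOINT}?{urlencode({'query': query, 'format': 'json'})}"
print(request_url[:60])
```

Executing the query against the live endpoint should return rows matching the `?p ?o` columns of the listing above (e.g. `title`, `doi`, the `cites` and `hasConcept` links).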