Matches in SemOpenAlex for { <https://semopenalex.org/work/W2960115000> ?p ?o ?g. }
Showing items 1 to 81 of 81, with 100 items per page.
- W2960115000 abstract "For the study of single-modal recognition, for example, the research on speech signals, ECG signals, facial expressions, body postures and other physiological signals have made some progress. However, the diversity of human brain information sources and the uncertainty of single-modal recognition determine that the accuracy of single-modal recognition is not high. Therefore, building a multimodal recognition framework in combination with multiple modalities has become an effective means of improving performance. With the rise of multi-modal machine learning, multi-modal information fusion has become a research hotspot, and audio-visual fusion is the most widely used direction. The audio-visual fusion method has been successfully applied to various problems, such as emotion recognition and multimedia event detection, biometric and speech recognition applications. This paper firstly introduces multimodal machine learning briefly, and then summarizes the development and current situation of audio-visual fusion technology in some major areas, and finally puts forward the prospect for the future." @default.
- W2960115000 created "2019-07-23" @default.
- W2960115000 creator A5001256961 @default.
- W2960115000 creator A5009718960 @default.
- W2960115000 creator A5035416128 @default.
- W2960115000 creator A5072315367 @default.
- W2960115000 creator A5082437433 @default.
- W2960115000 creator A5086873509 @default.
- W2960115000 date "2019-06-01" @default.
- W2960115000 modified "2023-10-16" @default.
- W2960115000 title "A Review of Audio-Visual Fusion with Machine Learning" @default.
- W2960115000 cites W2053101950 @default.
- W2960115000 cites W2077395415 @default.
- W2960115000 cites W2112555538 @default.
- W2960115000 cites W2123260696 @default.
- W2960115000 cites W2152239535 @default.
- W2960115000 cites W2474638510 @default.
- W2960115000 cites W2516412152 @default.
- W2960115000 doi "https://doi.org/10.1088/1742-6596/1237/2/022144" @default.
- W2960115000 hasPublicationYear "2019" @default.
- W2960115000 type Work @default.
- W2960115000 sameAs 2960115000 @default.
- W2960115000 citedByCount "6" @default.
- W2960115000 countsByYear W29601150002020 @default.
- W2960115000 countsByYear W29601150002021 @default.
- W2960115000 countsByYear W29601150002022 @default.
- W2960115000 crossrefType "journal-article" @default.
- W2960115000 hasAuthorship W2960115000A5001256961 @default.
- W2960115000 hasAuthorship W2960115000A5009718960 @default.
- W2960115000 hasAuthorship W2960115000A5035416128 @default.
- W2960115000 hasAuthorship W2960115000A5072315367 @default.
- W2960115000 hasAuthorship W2960115000A5082437433 @default.
- W2960115000 hasAuthorship W2960115000A5086873509 @default.
- W2960115000 hasBestOaLocation W29601150001 @default.
- W2960115000 hasConcept C107457646 @default.
- W2960115000 hasConcept C119857082 @default.
- W2960115000 hasConcept C144024400 @default.
- W2960115000 hasConcept C153180895 @default.
- W2960115000 hasConcept C154945302 @default.
- W2960115000 hasConcept C184297639 @default.
- W2960115000 hasConcept C185592680 @default.
- W2960115000 hasConcept C188027245 @default.
- W2960115000 hasConcept C2779903281 @default.
- W2960115000 hasConcept C28490314 @default.
- W2960115000 hasConcept C3017588708 @default.
- W2960115000 hasConcept C36289849 @default.
- W2960115000 hasConcept C41008148 @default.
- W2960115000 hasConcept C49774154 @default.
- W2960115000 hasConcept C71139939 @default.
- W2960115000 hasConceptScore W2960115000C107457646 @default.
- W2960115000 hasConceptScore W2960115000C119857082 @default.
- W2960115000 hasConceptScore W2960115000C144024400 @default.
- W2960115000 hasConceptScore W2960115000C153180895 @default.
- W2960115000 hasConceptScore W2960115000C154945302 @default.
- W2960115000 hasConceptScore W2960115000C184297639 @default.
- W2960115000 hasConceptScore W2960115000C185592680 @default.
- W2960115000 hasConceptScore W2960115000C188027245 @default.
- W2960115000 hasConceptScore W2960115000C2779903281 @default.
- W2960115000 hasConceptScore W2960115000C28490314 @default.
- W2960115000 hasConceptScore W2960115000C3017588708 @default.
- W2960115000 hasConceptScore W2960115000C36289849 @default.
- W2960115000 hasConceptScore W2960115000C41008148 @default.
- W2960115000 hasConceptScore W2960115000C49774154 @default.
- W2960115000 hasConceptScore W2960115000C71139939 @default.
- W2960115000 hasLocation W29601150001 @default.
- W2960115000 hasOpenAccess W2960115000 @default.
- W2960115000 hasPrimaryLocation W29601150001 @default.
- W2960115000 hasRelatedWork W10183233 @default.
- W2960115000 hasRelatedWork W11855449 @default.
- W2960115000 hasRelatedWork W12129214 @default.
- W2960115000 hasRelatedWork W12940689 @default.
- W2960115000 hasRelatedWork W13607926 @default.
- W2960115000 hasRelatedWork W13824394 @default.
- W2960115000 hasRelatedWork W2131697 @default.
- W2960115000 hasRelatedWork W2899263 @default.
- W2960115000 hasRelatedWork W8219677 @default.
- W2960115000 hasRelatedWork W8573541 @default.
- W2960115000 isParatext "false" @default.
- W2960115000 isRetracted "false" @default.
- W2960115000 magId "2960115000" @default.
- W2960115000 workType "article" @default.
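As a minimal sketch of how a listing like this could be reproduced programmatically, the snippet below re-runs the quad pattern from the header against a SemOpenAlex SPARQL endpoint. The endpoint URL (https://semopenalex.org/sparql) and the use of the SPARQLWrapper library are assumptions for illustration, not details taken from the listing itself.

```python
# Minimal sketch, assuming SemOpenAlex exposes a public SPARQL endpoint at
# https://semopenalex.org/sparql and that SPARQLWrapper is installed
# (pip install sparqlwrapper). Both are assumptions, not stated in the listing.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL

# Same quad pattern as in the header: every predicate/object/graph for the work.
QUERY = """
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W2960115000> ?p ?o .
  }
}
"""

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(QUERY)
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    # Each binding corresponds to one line of the listing above, e.g.
    # title -> "A Review of Audio-Visual Fusion with Machine Learning"
    print(row["p"]["value"], row["o"]["value"], row["g"]["value"])
```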