Matches in SemOpenAlex for { <https://semopenalex.org/work/W1967950650> ?p ?o ?g. }
- W1967950650 endingPage "403" @default.
- W1967950650 startingPage "383" @default.
- W1967950650 abstract "Just as eyes are often considered a gateway to the soul, the human voice offers a window through which we gain access to our fellow human beings' minds – their attitudes, intentions and feelings. Whether in talking or singing, crying or laughing, sighing or screaming, the sheer sound of a voice communicates a wealth of information that, in turn, may serve the observant listener as a valuable guidepost in social interaction. But how do human beings extract information from the tone of a voice? In an attempt to answer this question, the present article reviews empirical evidence detailing the cerebral processes that underlie our ability to decode emotional information from vocal signals. The review will focus primarily on two prominent classes of vocal emotion cues: laughter and speech prosody (i.e. the tone of voice while speaking). Following a brief introduction, behavioral as well as neuroimaging data will be summarized that allow us to outline cerebral mechanisms associated with the decoding of emotional voice cues, as well as the influence of various context variables (e.g. co-occurring facial and verbal emotional signals, attention focus, person-specific parameters such as gender and personality) on the respective processes. Building on the presented evidence, a cerebral network model will be introduced that proposes a differential contribution of various cortical and subcortical brain structures to the processing of emotional voice signals both in isolation and in the context of accompanying (facial and verbal) emotional cues." @default.
- W1967950650 created "2016-06-24" @default.
- W1967950650 creator A5003818170 @default.
- W1967950650 creator A5017386076 @default.
- W1967950650 creator A5082448584 @default.
- W1967950650 date "2011-12-01" @default.
- W1967950650 modified "2023-10-17" @default.
- W1967950650 title "Emotional voices in context: A neurobiological model of multimodal affective information processing" @default.
- W1967950650 cites W107303403 @default.
- W1967950650 cites W1480091083 @default.
- W1967950650 cites W1497204367 @default.
- W1967950650 cites W1516970678 @default.
- W1967950650 cites W1536113745 @default.
- W1967950650 cites W1598129572 @default.
- W1967950650 cites W1606700437 @default.
- W1967950650 cites W1963554665 @default.
- W1967950650 cites W1967069922 @default.
- W1967950650 cites W1968136110 @default.
- W1967950650 cites W1970437962 @default.
- W1967950650 cites W1973374608 @default.
- W1967950650 cites W1976009244 @default.
- W1967950650 cites W1976032980 @default.
- W1967950650 cites W1984321201 @default.
- W1967950650 cites W1985575781 @default.
- W1967950650 cites W1986849659 @default.
- W1967950650 cites W1986856036 @default.
- W1967950650 cites W1987494247 @default.
- W1967950650 cites W1988731588 @default.
- W1967950650 cites W1990254102 @default.
- W1967950650 cites W1990755917 @default.
- W1967950650 cites W1992424514 @default.
- W1967950650 cites W1996925265 @default.
- W1967950650 cites W1997953694 @default.
- W1967950650 cites W1999014668 @default.
- W1967950650 cites W2003547693 @default.
- W1967950650 cites W2005746628 @default.
- W1967950650 cites W2006647334 @default.
- W1967950650 cites W2006694500 @default.
- W1967950650 cites W2007462506 @default.
- W1967950650 cites W2008093325 @default.
- W1967950650 cites W2010931158 @default.
- W1967950650 cites W2014221158 @default.
- W1967950650 cites W2016660677 @default.
- W1967950650 cites W2016993843 @default.
- W1967950650 cites W2017161938 @default.
- W1967950650 cites W2018338387 @default.
- W1967950650 cites W2021891913 @default.
- W1967950650 cites W2023249809 @default.
- W1967950650 cites W2026014019 @default.
- W1967950650 cites W2026161711 @default.
- W1967950650 cites W2026720425 @default.
- W1967950650 cites W2029578773 @default.
- W1967950650 cites W2030271375 @default.
- W1967950650 cites W2032495162 @default.
- W1967950650 cites W2032512623 @default.
- W1967950650 cites W2038955527 @default.
- W1967950650 cites W2042286850 @default.
- W1967950650 cites W2046187867 @default.
- W1967950650 cites W2046188011 @default.
- W1967950650 cites W2046602920 @default.
- W1967950650 cites W2047240413 @default.
- W1967950650 cites W2048187196 @default.
- W1967950650 cites W2049126062 @default.
- W1967950650 cites W2052610531 @default.
- W1967950650 cites W2053653065 @default.
- W1967950650 cites W2054951946 @default.
- W1967950650 cites W2056239938 @default.
- W1967950650 cites W2056721047 @default.
- W1967950650 cites W2060623518 @default.
- W1967950650 cites W2062616464 @default.
- W1967950650 cites W2066458086 @default.
- W1967950650 cites W2067659395 @default.
- W1967950650 cites W2069924379 @default.
- W1967950650 cites W2070831514 @default.
- W1967950650 cites W2070968043 @default.
- W1967950650 cites W2071156237 @default.
- W1967950650 cites W2073414813 @default.
- W1967950650 cites W2073859134 @default.
- W1967950650 cites W2074886593 @default.
- W1967950650 cites W2075274068 @default.
- W1967950650 cites W2077128513 @default.
- W1967950650 cites W2077404990 @default.
- W1967950650 cites W2079564095 @default.
- W1967950650 cites W2081978839 @default.
- W1967950650 cites W2082325221 @default.
- W1967950650 cites W2082356336 @default.
- W1967950650 cites W2087889245 @default.
- W1967950650 cites W2088268442 @default.
- W1967950650 cites W2088677122 @default.
- W1967950650 cites W2088810043 @default.
- W1967950650 cites W2090469211 @default.
- W1967950650 cites W2090951183 @default.
- W1967950650 cites W2092576975 @default.
- W1967950650 cites W2093640073 @default.
- W1967950650 cites W2093798197 @default.
- W1967950650 cites W2094274886 @default.
- W1967950650 cites W2098706209 @default.
- W1967950650 cites W2103829528 @default.
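
The listing above can be reproduced with a plain SPARQL SELECT query over the work's URI. The sketch below (Python, using the `requests` library) is a minimal illustration under two assumptions: that the public SemOpenAlex SPARQL endpoint is reachable at https://semopenalex.org/sparql, and that selecting only `?p ?o` is sufficient here because every match above sits in the default graph (the trailing `@default`).

```python
# Minimal sketch: list all predicate/object pairs for work W1967950650
# via the standard SPARQL 1.1 protocol (GET + Accept header).
# The endpoint URL below is an assumption; substitute the endpoint you use.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed SemOpenAlex endpoint

QUERY = """
SELECT ?p ?o
WHERE {
  <https://semopenalex.org/work/W1967950650> ?p ?o .
}
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

# Each binding corresponds to one "- W1967950650 <predicate> <object>" line above.
for binding in response.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```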