Matches in SemOpenAlex for { <https://semopenalex.org/work/W2725930992> ?p ?o ?g. }
Showing items 1 to 93 of 93, with 100 items per page.
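The listing below can be reproduced programmatically. The following is a minimal sketch, assuming SemOpenAlex exposes a public SPARQL endpoint at https://semopenalex.org/sparql speaking the standard SPARQL 1.1 protocol with JSON results; the endpoint URL and the decision to drop the named-graph variable `?g` are assumptions, not stated on this page:

```python
import requests

# Assumed public SPARQL endpoint for SemOpenAlex (not stated on this page).
ENDPOINT = "https://semopenalex.org/sparql"

# Same triple pattern as the match above, minus the named-graph variable:
# every predicate/object pair for work W2725930992.
QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W2725930992> ?p ?o .
}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()

# Print one "predicate object" line per triple, mirroring the listing below.
for binding in resp.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```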
- W2725930992 endingPage "115" @default.
- W2725930992 startingPage "101" @default.
- W2725930992 abstract "Numerous studies have found evidence suggesting that facial expression identification is strongly influenced by the valence or type of facial expression shown (e.g., Taylor & Jose, 2014). For example, happy facial expressions are more easily categorized than most other expressions (Calvo & Lundqvist, 2008; Hugenberg & Sczesny, 2006; Leppänen & Hietanen, 2003). Moreover, recent studies have shown that dimensions such as the race or gender of the face can modulate our ability to react to these expressions (Hugenberg, 2005; Bijlstra, Holland, & Wigboldus, 2010; Bijlstra, Holland, Dotsch, Hugenberg, & Wigboldus, 2014; Palermo & Coltheart, 2004). Findings from these studies have highlighted the relevance of both invariant and variant characteristics of the face when processing emotions from facial expressions. This is important, as it challenges current theories of face processing, which have generally argued that separate systems exist for the processing of invariant features, such as face gender, and variant, changeable aspects of a face, such as expression (Bruce & Young, 1986; Haxby, Hoffman, & Gobbini, 2000). Recent behavioral findings provide support for an overlap between the dimensions of face gender and expression when identifying facial expressions. Hugenberg and Sczesny (2006) found that happy facial expressions were categorized faster and more accurately when the target face was female as opposed to male. Conversely, angry facial expressions were categorized more accurately in male target faces. In support of these findings, Becker, Kenrick, Neuberg, Blackwell, and Smith (2007) observed faster reaction times and greater accuracy for the categorization of male facial expressions across a series of experiments. The authors also reported that participants were more likely to imagine a male when instructed to generate an angry face. The apparent association of angry expressions with male faces in particular has been shown in other tasks, namely gender classification (Hess, Adams, Grammer, & Kleck, 2009) and emotion rating (Fabes & Martin, 1991; Plant, Hyde, Keltner, & Devine, 2000). Several explanations have been offered in an attempt to understand the link between male faces and angry expressions. One possibility is that individuals hold stereotypical expectations that determine the differences in judgments of male and female faces in relation to expression (Hess, Adams, & Kleck, 2004). A classic study by Condry and Condry (1976) found that when participants watched two separate video recordings of what they believed to be two different children (one boy, one girl) playing with a toy, they attributed a higher level of anger to the child labelled as a boy, even though it was in fact the same child in both videos. Similarly, it has been shown that children are more likely to think that a crying baby is angry if previously told that the baby is male as opposed to female (Haugh, Hoffman, & Cowan, 1980). These stereotypes, formed at a young age, may continue through rehearsal into adulthood, causing biased expectations when processing male and female faces. Moreover, there is a growing consensus that stereotypes can considerably impact the perception of social categories in general (Bijlstra et al., 2014; Freeman & Ambady, 2011). An alternative account draws upon an evolutionary explanation, in that angry features have evolved to mimic masculinity and, in contrast, happy features to mimic femininity (Le Gal & Bruce, 2002). Becker et al. (2007) showed that faces were rated as more masculine and angrier when the distance between the brow and eye was shortened. A similar result was found when the distance between the eyes and mouth was manipulated in another study (Neth & Martinez, 2009). These findings converge with earlier research which claimed that the perception of certain facial features creates associations with certain emotional and gender-related constructs (Brown & Perrett, 1993; Zebrowitz, 1997). …" @default.
- W2725930992 created "2017-07-14" @default.
- W2725930992 creator A5042630339 @default.
- W2725930992 date "2017-06-29" @default.
- W2725930992 modified "2023-10-18" @default.
- W2725930992 title "The Role of Fixations and Face Gender in Facial Expression Categorization" @default.
- W2725930992 cites W1963945289 @default.
- W2725930992 cites W1976840597 @default.
- W2725930992 cites W1984849553 @default.
- W2725930992 cites W1987708005 @default.
- W2725930992 cites W1993389465 @default.
- W2725930992 cites W1993715381 @default.
- W2725930992 cites W2004126537 @default.
- W2725930992 cites W2004961568 @default.
- W2725930992 cites W2006257522 @default.
- W2725930992 cites W2012945211 @default.
- W2725930992 cites W2019111214 @default.
- W2725930992 cites W2040773809 @default.
- W2725930992 cites W2051976214 @default.
- W2725930992 cites W2074709111 @default.
- W2725930992 cites W2082343263 @default.
- W2725930992 cites W2082464308 @default.
- W2725930992 cites W2087976114 @default.
- W2725930992 cites W2094171186 @default.
- W2725930992 cites W2099905961 @default.
- W2725930992 cites W2101790396 @default.
- W2725930992 cites W2109308568 @default.
- W2725930992 cites W2123026107 @default.
- W2725930992 cites W2133790671 @default.
- W2725930992 cites W2147318340 @default.
- W2725930992 cites W2148291855 @default.
- W2725930992 cites W2150283722 @default.
- W2725930992 cites W2163481013 @default.
- W2725930992 cites W3022115000 @default.
- W2725930992 doi "https://doi.org/10.24193/cbb.2017.21.07" @default.
- W2725930992 hasPublicationYear "2017" @default.
- W2725930992 type Work @default.
- W2725930992 sameAs 2725930992 @default.
- W2725930992 citedByCount "1" @default.
- W2725930992 countsByYear W27259309922020 @default.
- W2725930992 crossrefType "journal-article" @default.
- W2725930992 hasAuthorship W2725930992A5042630339 @default.
- W2725930992 hasConcept C138885662 @default.
- W2725930992 hasConcept C154945302 @default.
- W2725930992 hasConcept C15744967 @default.
- W2725930992 hasConcept C161657702 @default.
- W2725930992 hasConcept C169760540 @default.
- W2725930992 hasConcept C180747234 @default.
- W2725930992 hasConcept C195704467 @default.
- W2725930992 hasConcept C199360897 @default.
- W2725930992 hasConcept C26760741 @default.
- W2725930992 hasConcept C2779304628 @default.
- W2725930992 hasConcept C41008148 @default.
- W2725930992 hasConcept C41895202 @default.
- W2725930992 hasConcept C46312422 @default.
- W2725930992 hasConcept C90559484 @default.
- W2725930992 hasConcept C94124525 @default.
- W2725930992 hasConceptScore W2725930992C138885662 @default.
- W2725930992 hasConceptScore W2725930992C154945302 @default.
- W2725930992 hasConceptScore W2725930992C15744967 @default.
- W2725930992 hasConceptScore W2725930992C161657702 @default.
- W2725930992 hasConceptScore W2725930992C169760540 @default.
- W2725930992 hasConceptScore W2725930992C180747234 @default.
- W2725930992 hasConceptScore W2725930992C195704467 @default.
- W2725930992 hasConceptScore W2725930992C199360897 @default.
- W2725930992 hasConceptScore W2725930992C26760741 @default.
- W2725930992 hasConceptScore W2725930992C2779304628 @default.
- W2725930992 hasConceptScore W2725930992C41008148 @default.
- W2725930992 hasConceptScore W2725930992C41895202 @default.
- W2725930992 hasConceptScore W2725930992C46312422 @default.
- W2725930992 hasConceptScore W2725930992C90559484 @default.
- W2725930992 hasConceptScore W2725930992C94124525 @default.
- W2725930992 hasIssue "2" @default.
- W2725930992 hasLocation W27259309921 @default.
- W2725930992 hasOpenAccess W2725930992 @default.
- W2725930992 hasPrimaryLocation W27259309921 @default.
- W2725930992 hasRelatedWork W2002003891 @default.
- W2725930992 hasRelatedWork W2161216869 @default.
- W2725930992 hasRelatedWork W2313536284 @default.
- W2725930992 hasRelatedWork W2520810188 @default.
- W2725930992 hasRelatedWork W2612875851 @default.
- W2725930992 hasRelatedWork W2753659043 @default.
- W2725930992 hasRelatedWork W2790444018 @default.
- W2725930992 hasRelatedWork W2991602548 @default.
- W2725930992 hasRelatedWork W3158267739 @default.
- W2725930992 hasRelatedWork W4205495841 @default.
- W2725930992 hasVolume "21" @default.
- W2725930992 isParatext "false" @default.
- W2725930992 isRetracted "false" @default.
- W2725930992 magId "2725930992" @default.
- W2725930992 workType "article" @default.
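Individual fields in the listing above can then be read out by predicate. The sketch below continues from the earlier one (it reuses `resp`); grouping by each predicate's local name is an illustrative convenience of this sketch, since this page shows only local names such as doi, cites, and citedByCount rather than the full predicate IRIs:

```python
import re
from collections import defaultdict

def local_name(iri: str) -> str:
    # Local name = the part of the IRI after the last '/' or '#'.
    return re.split(r"[/#]", iri)[-1]

# Group object values under the local name of their predicate.
by_predicate = defaultdict(list)
for binding in resp.json()["results"]["bindings"]:
    by_predicate[local_name(binding["p"]["value"])].append(binding["o"]["value"])

print("DOI:", by_predicate["doi"])                # the doi triple above
print("Cited by:", by_predicate["citedByCount"])  # "1", per the listing
print("References:", len(by_predicate["cites"]))  # 28 cites triples above
```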