Matches in SemOpenAlex for { <https://semopenalex.org/work/W4386247218> ?p ?o ?g. }
Showing items 1 to 59 of 59, with 100 items per page.
- W4386247218 endingPage "4717" @default.
- W4386247218 startingPage "4717" @default.
- W4386247218 abstract "[Introduction] People typically feel uneasy when observing robots and computer-graphics characters that resemble humans but are not perfectly human-like, an effect known as the “uncanny valley.” Several empirical studies of affective responses to images morphed between human and non-human categories suggest that visual cues from the two categories elicit conflicting inferences about the entity, producing feelings of eeriness. However, the detailed relationship between visual representations and emotional responses remains unclear. Artificial neural networks (ANNs) that can predict a relevant text description for a given image are promising models for probing the processes underlying human cognition, because their training spans vast instances of human affective responses to visual concepts. In this study, we investigated how an ANN rates the match between morphed images and the affective words used to describe uncanny valley effects in previous studies. [Methods] We created stimulus images by morphing between human faces and non-human objects at five morph levels and assessed how well each image matched a set of words using CLIP (Contrastive Language–Image Pre-training), a state-of-the-art ANN that estimates the semantic match between an image and a caption. Ho and MacDorman proposed indices of humanness, eeriness, and attractiveness, measured on a semantic differential scale, for evaluating the affective responses of human observers in uncanny valley studies. We calculated CLIP scores for the adjectives comprising the three indices and examined how the indices changed across morph levels. [Results and Conclusions] The eeriness index was highest at the midpoint of the morph continuum, where the conflict between visual cues was maximal. This result indicates that CLIP, through training on an enormous dataset covering everyday visual experience, associates visual cue conflicts in images with eerie impressions. The current study thus explored how visual representations relate to human observers' sentiment using an ANN." @default.
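The abstract's method (score each morphed image against the adjectives of each Ho–MacDorman index, then average per index) can be sketched as follows. This is a minimal stdlib-only illustration, not the authors' pipeline: the adjective lists below are hypothetical placeholders (the paper uses Ho and MacDorman's actual semantic-differential items), and in the real study the embeddings would come from CLIP's image and text encoders rather than being supplied directly.

```python
import math

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical adjective sets standing in for Ho and MacDorman's scale items.
INDEX_ADJECTIVES = {
    "humanness": ["human-like", "natural"],
    "eeriness": ["eerie", "creepy"],
    "attractiveness": ["attractive", "pleasant"],
}

def index_scores(image_emb, text_embs):
    """Average the image-text cosine similarities of each index's adjectives.

    image_emb: embedding of one morphed image.
    text_embs: dict mapping each adjective to its text embedding.
    """
    return {
        index: sum(cosine(image_emb, text_embs[w]) for w in words) / len(words)
        for index, words in INDEX_ADJECTIVES.items()
    }
```

With CLIP, one would compute `image_emb` and the per-adjective `text_embs` with the model's encoders, call `index_scores` for each of the five morph levels, and plot the three indices against morph level to look for the mid-continuum eeriness peak the abstract reports.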
- W4386247218 created "2023-08-30" @default.
- W4386247218 creator A5039624832 @default.
- W4386247218 creator A5054137505 @default.
- W4386247218 date "2023-08-01" @default.
- W4386247218 modified "2023-10-02" @default.
- W4386247218 title "In Silico Approach for Understanding the Associations Between Vision and Emotions Underlying the Uncanny Valley Effect" @default.
- W4386247218 doi "https://doi.org/10.1167/jov.23.9.4717" @default.
- W4386247218 hasPublicationYear "2023" @default.
- W4386247218 type Work @default.
- W4386247218 citedByCount "0" @default.
- W4386247218 crossrefType "journal-article" @default.
- W4386247218 hasAuthorship W4386247218A5039624832 @default.
- W4386247218 hasAuthorship W4386247218A5054137505 @default.
- W4386247218 hasBestOaLocation W43862472181 @default.
- W4386247218 hasConcept C11171543 @default.
- W4386247218 hasConcept C122980154 @default.
- W4386247218 hasConcept C154945302 @default.
- W4386247218 hasConcept C15744967 @default.
- W4386247218 hasConcept C180747234 @default.
- W4386247218 hasConcept C188147891 @default.
- W4386247218 hasConcept C2365568 @default.
- W4386247218 hasConcept C2780362631 @default.
- W4386247218 hasConcept C41008148 @default.
- W4386247218 hasConcept C50637493 @default.
- W4386247218 hasConcept C77805123 @default.
- W4386247218 hasConcept C90509273 @default.
- W4386247218 hasConceptScore W4386247218C11171543 @default.
- W4386247218 hasConceptScore W4386247218C122980154 @default.
- W4386247218 hasConceptScore W4386247218C154945302 @default.
- W4386247218 hasConceptScore W4386247218C15744967 @default.
- W4386247218 hasConceptScore W4386247218C180747234 @default.
- W4386247218 hasConceptScore W4386247218C188147891 @default.
- W4386247218 hasConceptScore W4386247218C2365568 @default.
- W4386247218 hasConceptScore W4386247218C2780362631 @default.
- W4386247218 hasConceptScore W4386247218C41008148 @default.
- W4386247218 hasConceptScore W4386247218C50637493 @default.
- W4386247218 hasConceptScore W4386247218C77805123 @default.
- W4386247218 hasConceptScore W4386247218C90509273 @default.
- W4386247218 hasIssue "9" @default.
- W4386247218 hasLocation W43862472181 @default.
- W4386247218 hasOpenAccess W4386247218 @default.
- W4386247218 hasPrimaryLocation W43862472181 @default.
- W4386247218 hasRelatedWork W1574121318 @default.
- W4386247218 hasRelatedWork W2165113252 @default.
- W4386247218 hasRelatedWork W2320037119 @default.
- W4386247218 hasRelatedWork W2774267688 @default.
- W4386247218 hasRelatedWork W2901365133 @default.
- W4386247218 hasRelatedWork W2902219462 @default.
- W4386247218 hasRelatedWork W2932195978 @default.
- W4386247218 hasRelatedWork W4246793583 @default.
- W4386247218 hasRelatedWork W4248849028 @default.
- W4386247218 hasRelatedWork W4382619041 @default.
- W4386247218 hasVolume "23" @default.
- W4386247218 isParatext "false" @default.
- W4386247218 isRetracted "false" @default.
- W4386247218 workType "article" @default.