Matches in SemOpenAlex for { <https://semopenalex.org/work/W4312594087> ?p ?o ?g. }
Showing items 1 to 83 of 83, with 100 items per page.
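The listing below can be reproduced programmatically. A minimal sketch, assuming SemOpenAlex exposes a public SPARQL endpoint at https://semopenalex.org/sparql and returns standard SPARQL JSON results (the endpoint URL, the GRAPH form of the quad pattern, and the result handling are assumptions, not part of this record):

```python
# Minimal sketch: fetch every (?p, ?o, ?g) match for work W4312594087
# from the SemOpenAlex SPARQL endpoint. The endpoint URL is an assumption.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint
QUERY = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g { <https://semopenalex.org/work/W4312594087> ?p ?o . }
}
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

# Print each predicate / object / graph triple, mirroring the listing below.
for binding in response.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"], binding["g"]["value"])
```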
- W4312594087 abstract "Visual affordance studies what kinds of interaction are possible, and whether an interaction is reasonable in the current environment, from an image or video. When inferring the affordances of objects, the semantics and relations of objects in the environment should be considered, and a graph is usually used to model the environment context of an object. Considering that the edge weights in such a graph describe how much information objects contribute to each other during affordance reasoning, this paper proposes VAR-Net (Visual Affordance Reasoning Network), which models the weights as graph attention coefficients and learns them from objects’ semantic and visual features, which imply their affordances. VAR-Net achieves higher accuracy on the COCO-Tasks and ADE-Affordance datasets. Experiments also explain the meaning of the edge weights in VAR-Net: for a given affordance, the more an object affords it, the larger the weights of the edges linking it to other objects, and vice versa, which makes objects’ features distinguishable for inferring affordances." @default.
- W4312594087 created "2023-01-05" @default.
- W4312594087 creator A5030632917 @default.
- W4312594087 creator A5031436368 @default.
- W4312594087 creator A5046193794 @default.
- W4312594087 creator A5050137000 @default.
- W4312594087 creator A5060451771 @default.
- W4312594087 creator A5083450478 @default.
- W4312594087 date "2022-10-01" @default.
- W4312594087 modified "2023-09-27" @default.
- W4312594087 title "A Visual Affordance Reasoning Network Based on Graph Attention" @default.
- W4312594087 cites W1524405667 @default.
- W4312594087 cites W1891689858 @default.
- W4312594087 cites W1900424585 @default.
- W4312594087 cites W2032165333 @default.
- W4312594087 cites W2157331557 @default.
- W4312594087 cites W2194775991 @default.
- W4312594087 cites W2250539671 @default.
- W4312594087 cites W2561523096 @default.
- W4312594087 cites W2737258237 @default.
- W4312594087 cites W2773765248 @default.
- W4312594087 cites W2798897443 @default.
- W4312594087 cites W2908404712 @default.
- W4312594087 cites W2910308495 @default.
- W4312594087 cites W2945623882 @default.
- W4312594087 cites W2962984928 @default.
- W4312594087 cites W2963049618 @default.
- W4312594087 cites W2963635127 @default.
- W4312594087 cites W2967870022 @default.
- W4312594087 cites W2979791938 @default.
- W4312594087 cites W2983465317 @default.
- W4312594087 cites W3120221531 @default.
- W4312594087 doi "https://doi.org/10.1109/icdh57206.2022.00051" @default.
- W4312594087 hasPublicationYear "2022" @default.
- W4312594087 type Work @default.
- W4312594087 citedByCount "0" @default.
- W4312594087 crossrefType "proceedings-article" @default.
- W4312594087 hasAuthorship W4312594087A5030632917 @default.
- W4312594087 hasAuthorship W4312594087A5031436368 @default.
- W4312594087 hasAuthorship W4312594087A5046193794 @default.
- W4312594087 hasAuthorship W4312594087A5050137000 @default.
- W4312594087 hasAuthorship W4312594087A5060451771 @default.
- W4312594087 hasAuthorship W4312594087A5083450478 @default.
- W4312594087 hasConcept C107457646 @default.
- W4312594087 hasConcept C132525143 @default.
- W4312594087 hasConcept C151730666 @default.
- W4312594087 hasConcept C154945302 @default.
- W4312594087 hasConcept C184337299 @default.
- W4312594087 hasConcept C194995250 @default.
- W4312594087 hasConcept C199360897 @default.
- W4312594087 hasConcept C2779343474 @default.
- W4312594087 hasConcept C2781238097 @default.
- W4312594087 hasConcept C41008148 @default.
- W4312594087 hasConcept C80444323 @default.
- W4312594087 hasConcept C86803240 @default.
- W4312594087 hasConceptScore W4312594087C107457646 @default.
- W4312594087 hasConceptScore W4312594087C132525143 @default.
- W4312594087 hasConceptScore W4312594087C151730666 @default.
- W4312594087 hasConceptScore W4312594087C154945302 @default.
- W4312594087 hasConceptScore W4312594087C184337299 @default.
- W4312594087 hasConceptScore W4312594087C194995250 @default.
- W4312594087 hasConceptScore W4312594087C199360897 @default.
- W4312594087 hasConceptScore W4312594087C2779343474 @default.
- W4312594087 hasConceptScore W4312594087C2781238097 @default.
- W4312594087 hasConceptScore W4312594087C41008148 @default.
- W4312594087 hasConceptScore W4312594087C80444323 @default.
- W4312594087 hasConceptScore W4312594087C86803240 @default.
- W4312594087 hasLocation W43125940871 @default.
- W4312594087 hasOpenAccess W4312594087 @default.
- W4312594087 hasPrimaryLocation W43125940871 @default.
- W4312594087 hasRelatedWork W2012870653 @default.
- W4312594087 hasRelatedWork W2039197230 @default.
- W4312594087 hasRelatedWork W2156209078 @default.
- W4312594087 hasRelatedWork W2293509402 @default.
- W4312594087 hasRelatedWork W2417025891 @default.
- W4312594087 hasRelatedWork W2795727972 @default.
- W4312594087 hasRelatedWork W2974905926 @default.
- W4312594087 hasRelatedWork W3014080459 @default.
- W4312594087 hasRelatedWork W3024439360 @default.
- W4312594087 hasRelatedWork W4361269247 @default.
- W4312594087 isParatext "false" @default.
- W4312594087 isRetracted "false" @default.
- W4312594087 workType "article" @default.
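The abstract in this record says VAR-Net models edge weights as graph attention coefficients learned from objects' semantic and visual features. The paper's exact architecture is not given here; purely as an illustration of that idea, below is a minimal GAT-style attention-coefficient sketch in NumPy (the feature dimensions, the single-layer attention form, and all variable names are assumptions, not VAR-Net's actual formulation):

```python
# Illustrative GAT-style attention coefficients (not VAR-Net's exact method):
# alpha_ij = softmax_j( LeakyReLU( a^T [W h_i || W h_j] ) ),
# where h_i stands for an object's combined semantic and visual features.
import numpy as np

rng = np.random.default_rng(0)

num_objects, feat_dim, hidden_dim = 4, 16, 8      # assumed sizes
H = rng.normal(size=(num_objects, feat_dim))      # per-object semantic+visual features
W = rng.normal(size=(feat_dim, hidden_dim))       # shared linear projection
a = rng.normal(size=(2 * hidden_dim,))            # attention vector

Z = H @ W                                         # projected features
# Unnormalized scores e_ij for every ordered object pair (i, j).
e = np.array([[a @ np.concatenate([Z[i], Z[j]]) for j in range(num_objects)]
              for i in range(num_objects)])
e = np.where(e > 0, e, 0.2 * e)                   # LeakyReLU with slope 0.2

# Row-wise softmax gives the edge weights (attention coefficients) alpha_ij.
alpha = np.exp(e - e.max(axis=1, keepdims=True))
alpha /= alpha.sum(axis=1, keepdims=True)
print(alpha)                                      # each row sums to 1
```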