Matches in SemOpenAlex for { <https://semopenalex.org/work/W4281489607> ?p ?o ?g. }
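The listing below was produced by matching all triples whose subject is the work IRI. A minimal sketch of how such a query could be built and run is shown here; the public endpoint URL (`https://semopenalex.org/sparql`) and the use of the `requests` library are assumptions, not confirmed by this dump.

```python
def build_work_query(work_iri: str) -> str:
    """Build a SPARQL query matching all predicate/object pairs of a work,
    analogous to the { <work> ?p ?o ?g. } pattern in the dump above."""
    return f"SELECT ?p ?o WHERE {{ <{work_iri}> ?p ?o . }}"

query = build_work_query("https://semopenalex.org/work/W4281489607")
print(query)

# To run it against the (assumed) public endpoint, network access is required:
# import requests
# r = requests.get("https://semopenalex.org/sparql",
#                  params={"query": query, "format": "json"})
# for binding in r.json()["results"]["bindings"]:
#     print(binding["p"]["value"], binding["o"]["value"])
```

Each result row then corresponds to one `predicate object` line in the listing that follows.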
- W4281489607 endingPage "40" @default.
- W4281489607 startingPage "29" @default.
- W4281489607 abstract "Abstract Augmented reality applications have rapidly spread across online retail platforms and social media, allowing consumers to virtually try on a large variety of products, such as makeup, hair dyeing, or shoes. However, parametrizing a renderer to synthesize realistic images of a given product remains a challenging task that requires expert knowledge. While recent work has introduced neural rendering methods for virtual try‐on from example images, current approaches are based on large generative models that cannot be used in real time on mobile devices. This calls for a hybrid method that combines the advantages of computer graphics and neural rendering approaches. In this paper, we propose a novel framework based on deep learning to build a real‐time inverse graphics encoder that learns to map a single example image into the parameter space of a given augmented reality rendering engine. Our method leverages self‐supervised learning and does not require labeled training data, which makes it extendable to many virtual try‐on applications. Furthermore, most augmented reality renderers are not differentiable in practice due to algorithmic choices or implementation constraints to reach real‐time performance on portable devices. To relax the need for a graphics‐based differentiable renderer in inverse graphics problems, we introduce a trainable imitator module. Our imitator is a generative network that learns to accurately reproduce the behavior of a given non‐differentiable renderer. We propose a novel rendering sensitivity loss to train the imitator, which ensures that the network learns an accurate and continuous representation for each rendering parameter. Automatically learning a differentiable renderer, as proposed here, could be beneficial for various inverse graphics tasks. Our framework enables novel applications where consumers can virtually try on a novel unknown product from an inspirational reference image on social media. It can also be used by computer graphics artists to automatically create realistic renderings from a reference product image." @default.
- W4281489607 created "2022-05-26" @default.
- W4281489607 creator A5034598217 @default.
- W4281489607 creator A5037326211 @default.
- W4281489607 creator A5065481911 @default.
- W4281489607 creator A5067711103 @default.
- W4281489607 creator A5072791873 @default.
- W4281489607 creator A5088407901 @default.
- W4281489607 creator A5090555662 @default.
- W4281489607 date "2022-05-01" @default.
- W4281489607 modified "2023-10-03" @default.
- W4281489607 title "Real‐time Virtual‐Try‐On from a Single Example Image through Deep Inverse Graphics and Learned Differentiable Renderers" @default.
- W4281489607 cites W2087681821 @default.
- W4281489607 cites W2242218935 @default.
- W4281489607 cites W2482311797 @default.
- W4281489607 cites W2769666294 @default.
- W4281489607 cites W2811490555 @default.
- W4281489607 cites W2896240508 @default.
- W4281489607 cites W2902812770 @default.
- W4281489607 cites W2950021214 @default.
- W4281489607 cites W2962770929 @default.
- W4281489607 cites W2962785568 @default.
- W4281489607 cites W2963073614 @default.
- W4281489607 cites W2963460133 @default.
- W4281489607 cites W2963767194 @default.
- W4281489607 cites W2964094136 @default.
- W4281489607 cites W2964166015 @default.
- W4281489607 cites W2981915309 @default.
- W4281489607 cites W2990251202 @default.
- W4281489607 cites W2997502326 @default.
- W4281489607 cites W3034463304 @default.
- W4281489607 cites W3043139608 @default.
- W4281489607 cites W3048765086 @default.
- W4281489607 cites W3094927600 @default.
- W4281489607 cites W3096831136 @default.
- W4281489607 cites W3106672182 @default.
- W4281489607 cites W3123632567 @default.
- W4281489607 cites W3130859581 @default.
- W4281489607 cites W3164455552 @default.
- W4281489607 cites W3172869956 @default.
- W4281489607 cites W3173855468 @default.
- W4281489607 cites W3173998882 @default.
- W4281489607 cites W3174541782 @default.
- W4281489607 cites W3175035459 @default.
- W4281489607 cites W3176825610 @default.
- W4281489607 cites W3186501579 @default.
- W4281489607 cites W4214541123 @default.
- W4281489607 cites W4214926101 @default.
- W4281489607 doi "https://doi.org/10.1111/cgf.14456" @default.
- W4281489607 hasPublicationYear "2022" @default.
- W4281489607 type Work @default.
- W4281489607 citedByCount "1" @default.
- W4281489607 countsByYear W42814896072023 @default.
- W4281489607 crossrefType "journal-article" @default.
- W4281489607 hasAuthorship W4281489607A5034598217 @default.
- W4281489607 hasAuthorship W4281489607A5037326211 @default.
- W4281489607 hasAuthorship W4281489607A5065481911 @default.
- W4281489607 hasAuthorship W4281489607A5067711103 @default.
- W4281489607 hasAuthorship W4281489607A5072791873 @default.
- W4281489607 hasAuthorship W4281489607A5088407901 @default.
- W4281489607 hasAuthorship W4281489607A5090555662 @default.
- W4281489607 hasBestOaLocation W42814896073 @default.
- W4281489607 hasConcept C109772839 @default.
- W4281489607 hasConcept C121684516 @default.
- W4281489607 hasConcept C134306372 @default.
- W4281489607 hasConcept C153715457 @default.
- W4281489607 hasConcept C154945302 @default.
- W4281489607 hasConcept C18945957 @default.
- W4281489607 hasConcept C202615002 @default.
- W4281489607 hasConcept C205711294 @default.
- W4281489607 hasConcept C21442007 @default.
- W4281489607 hasConcept C33923547 @default.
- W4281489607 hasConcept C41008148 @default.
- W4281489607 hasConcept C66629338 @default.
- W4281489607 hasConcept C77660652 @default.
- W4281489607 hasConceptScore W4281489607C109772839 @default.
- W4281489607 hasConceptScore W4281489607C121684516 @default.
- W4281489607 hasConceptScore W4281489607C134306372 @default.
- W4281489607 hasConceptScore W4281489607C153715457 @default.
- W4281489607 hasConceptScore W4281489607C154945302 @default.
- W4281489607 hasConceptScore W4281489607C18945957 @default.
- W4281489607 hasConceptScore W4281489607C202615002 @default.
- W4281489607 hasConceptScore W4281489607C205711294 @default.
- W4281489607 hasConceptScore W4281489607C21442007 @default.
- W4281489607 hasConceptScore W4281489607C33923547 @default.
- W4281489607 hasConceptScore W4281489607C41008148 @default.
- W4281489607 hasConceptScore W4281489607C66629338 @default.
- W4281489607 hasConceptScore W4281489607C77660652 @default.
- W4281489607 hasIssue "2" @default.
- W4281489607 hasLocation W42814896071 @default.
- W4281489607 hasLocation W42814896072 @default.
- W4281489607 hasLocation W42814896073 @default.
- W4281489607 hasOpenAccess W4281489607 @default.
- W4281489607 hasPrimaryLocation W42814896071 @default.
- W4281489607 hasRelatedWork W116721848 @default.
- W4281489607 hasRelatedWork W1502614013 @default.
- W4281489607 hasRelatedWork W1554690203 @default.
- W4281489607 hasRelatedWork W2080528351 @default.