Matches in SemOpenAlex for { <https://semopenalex.org/work/W3166081780> ?p ?o ?g. }
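The pattern above can be run as a standalone SPARQL query. A minimal sketch, assuming SemOpenAlex's public SPARQL endpoint at https://semopenalex.org/sparql (the endpoint URL and the GRAPH handling are assumptions, not part of the listing; stores that expose only a default graph may need the plain triple pattern without the GRAPH clause):

    # Enumerate every predicate (?p) / object (?o) pair recorded for this
    # work, along with the named graph (?g) each quad belongs to; the
    # listing below renders that graph position as @default.
    SELECT ?p ?o ?g
    WHERE {
      GRAPH ?g {
        <https://semopenalex.org/work/W3166081780> ?p ?o .
      }
    }

Each result row corresponds to one "- W3166081780 ..." line below.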
- W3166081780 abstract "Unsupervised representation learning has recently received a lot of interest due to its powerful generalizability through effectively leveraging large-scale unlabeled data. There are two prevalent approaches for this: contrastive learning and generative pre-training, where the former learns representations from instance-wise discrimination tasks and the latter learns them by estimating the likelihood. These seemingly orthogonal approaches have their own strengths and weaknesses. Contrastive learning tends to extract semantic information and discard details irrelevant for classifying objects, making the representations effective for discriminative tasks while degrading robustness to out-of-distribution data. On the other hand, generative pre-training directly estimates the data distribution, so the representations tend to be robust but not optimal for discriminative tasks. In this paper, we show that we can achieve the best of both worlds with a hybrid training scheme. Specifically, we demonstrate that a transformer-based encoder-decoder architecture trained with both contrastive and generative losses can learn highly discriminative and robust representations without hurting the generative performance. We extensively validate our approach on various tasks." @default.
- W3166081780 created "2021-06-22" @default.
- W3166081780 creator A5039532323 @default.
- W3166081780 creator A5084833958 @default.
- W3166081780 creator A5087294413 @default.
- W3166081780 date "2021-06-11" @default.
- W3166081780 modified "2023-09-27" @default.
- W3166081780 title "Hybrid Generative-Contrastive Representation Learning." @default.
- W3166081780 cites W2099471712 @default.
- W3166081780 cites W2108598243 @default.
- W3166081780 cites W2153939756 @default.
- W3166081780 cites W2187089797 @default.
- W3166081780 cites W2194775991 @default.
- W3166081780 cites W2267126114 @default.
- W3166081780 cites W2547875792 @default.
- W3166081780 cites W2770567114 @default.
- W3166081780 cites W2842511635 @default.
- W3166081780 cites W2867167548 @default.
- W3166081780 cites W2887997457 @default.
- W3166081780 cites W2896457183 @default.
- W3166081780 cites W2908510526 @default.
- W3166081780 cites W2940744433 @default.
- W3166081780 cites W2962753370 @default.
- W3166081780 cites W2963263347 @default.
- W3166081780 cites W2963403868 @default.
- W3166081780 cites W2963981733 @default.
- W3166081780 cites W2964137095 @default.
- W3166081780 cites W2964212410 @default.
- W3166081780 cites W2965373594 @default.
- W3166081780 cites W2970241862 @default.
- W3166081780 cites W2970607325 @default.
- W3166081780 cites W2970641149 @default.
- W3166081780 cites W2975059944 @default.
- W3166081780 cites W2980360762 @default.
- W3166081780 cites W2982399380 @default.
- W3166081780 cites W2995085126 @default.
- W3166081780 cites W2996035354 @default.
- W3166081780 cites W2998108143 @default.
- W3166081780 cites W3009561768 @default.
- W3166081780 cites W3030163527 @default.
- W3166081780 cites W3034445277 @default.
- W3166081780 cites W3034781633 @default.
- W3166081780 cites W3034978746 @default.
- W3166081780 cites W3035524453 @default.
- W3166081780 cites W3037932933 @default.
- W3166081780 cites W3041609598 @default.
- W3166081780 cites W3043462782 @default.
- W3166081780 cites W3045878360 @default.
- W3166081780 cites W3093157486 @default.
- W3166081780 cites W3095121901 @default.
- W3166081780 cites W3101821705 @default.
- W3166081780 cites W3106428938 @default.
- W3166081780 cites W3106646540 @default.
- W3166081780 cites W3114951884 @default.
- W3166081780 cites W3118481432 @default.
- W3166081780 cites W3136810184 @default.
- W3166081780 cites W3141023492 @default.
- W3166081780 cites W3173547984 @default.
- W3166081780 hasPublicationYear "2021" @default.
- W3166081780 type Work @default.
- W3166081780 sameAs 3166081780 @default.
- W3166081780 citedByCount "0" @default.
- W3166081780 crossrefType "posted-content" @default.
- W3166081780 hasAuthorship W3166081780A5039532323 @default.
- W3166081780 hasAuthorship W3166081780A5084833958 @default.
- W3166081780 hasAuthorship W3166081780A5087294413 @default.
- W3166081780 hasConcept C104317684 @default.
- W3166081780 hasConcept C105795698 @default.
- W3166081780 hasConcept C119857082 @default.
- W3166081780 hasConcept C121332964 @default.
- W3166081780 hasConcept C153180895 @default.
- W3166081780 hasConcept C154945302 @default.
- W3166081780 hasConcept C165801399 @default.
- W3166081780 hasConcept C167966045 @default.
- W3166081780 hasConcept C17744445 @default.
- W3166081780 hasConcept C185592680 @default.
- W3166081780 hasConcept C199539241 @default.
- W3166081780 hasConcept C204321447 @default.
- W3166081780 hasConcept C27158222 @default.
- W3166081780 hasConcept C2776359362 @default.
- W3166081780 hasConcept C33923547 @default.
- W3166081780 hasConcept C39890363 @default.
- W3166081780 hasConcept C41008148 @default.
- W3166081780 hasConcept C55493867 @default.
- W3166081780 hasConcept C59404180 @default.
- W3166081780 hasConcept C62520636 @default.
- W3166081780 hasConcept C63479239 @default.
- W3166081780 hasConcept C66322947 @default.
- W3166081780 hasConcept C94625758 @default.
- W3166081780 hasConcept C97931131 @default.
- W3166081780 hasConceptScore W3166081780C104317684 @default.
- W3166081780 hasConceptScore W3166081780C105795698 @default.
- W3166081780 hasConceptScore W3166081780C119857082 @default.
- W3166081780 hasConceptScore W3166081780C121332964 @default.
- W3166081780 hasConceptScore W3166081780C153180895 @default.
- W3166081780 hasConceptScore W3166081780C154945302 @default.
- W3166081780 hasConceptScore W3166081780C165801399 @default.
- W3166081780 hasConceptScore W3166081780C167966045 @default.
- W3166081780 hasConceptScore W3166081780C17744445 @default.
- W3166081780 hasConceptScore W3166081780C185592680 @default.