Matches in SemOpenAlex for { <https://semopenalex.org/work/W3090114880> ?p ?o ?g. }
- W3090114880 abstract "Contrastive learning has become a key component of self-supervised learning approaches for computer vision. By learning to embed two augmented versions of the same image close to each other and to push the embeddings of different images apart, one can train highly transferable visual representations. As revealed by recent studies, heavy data augmentation and large sets of negatives are both crucial in learning such representations. At the same time, data mixing strategies either at the image or the feature level improve both supervised and semi-supervised learning by synthesizing novel examples, forcing networks to learn more robust features. In this paper, we argue that an important aspect of contrastive learning, i.e., the effect of hard negatives, has so far been neglected. To get more meaningful negative samples, current top contrastive self-supervised learning approaches either substantially increase the batch sizes, or keep very large memory banks; increasing the memory size, however, leads to diminishing returns in terms of performance. We therefore start by delving deeper into a top-performing framework and show evidence that harder negatives are needed to facilitate better and faster learning. Based on these observations, and motivated by the success of data mixing, we propose hard negative mixing strategies at the feature level, that can be computed on-the-fly with a minimal computational overhead. We exhaustively ablate our approach on linear classification, object detection and instance segmentation and show that employing our hard negative mixing procedure improves the quality of visual representations learned by a state-of-the-art self-supervised learning method." @default.
- W3090114880 created "2020-10-08" @default.
- W3090114880 creator A5036113570 @default.
- W3090114880 creator A5042126127 @default.
- W3090114880 creator A5054497181 @default.
- W3090114880 creator A5055724288 @default.
- W3090114880 creator A5059863828 @default.
- W3090114880 date "2020-10-02" @default.
- W3090114880 modified "2023-10-01" @default.
- W3090114880 title "Hard Negative Mixing for Contrastive Learning" @default.
- W3090114880 cites W1861492603 @default.
- W3090114880 cites W1869500417 @default.
- W3090114880 cites W2031489346 @default.
- W3090114880 cites W2116435618 @default.
- W3090114880 cites W2117539524 @default.
- W3090114880 cites W2148349024 @default.
- W3090114880 cites W2187089797 @default.
- W3090114880 cites W2321533354 @default.
- W3090114880 cites W2487442924 @default.
- W3090114880 cites W2550462002 @default.
- W3090114880 cites W2606611007 @default.
- W3090114880 cites W2613718673 @default.
- W3090114880 cites W2798303923 @default.
- W3090114880 cites W2798991696 @default.
- W3090114880 cites W2799087757 @default.
- W3090114880 cites W2842511635 @default.
- W3090114880 cites W2883725317 @default.
- W3090114880 cites W2887997457 @default.
- W3090114880 cites W2913939497 @default.
- W3090114880 cites W2917551568 @default.
- W3090114880 cites W2921861056 @default.
- W3090114880 cites W2948012107 @default.
- W3090114880 cites W2948242301 @default.
- W3090114880 cites W2949517790 @default.
- W3090114880 cites W2962742544 @default.
- W3090114880 cites W2962869940 @default.
- W3090114880 cites W2963150697 @default.
- W3090114880 cites W2963157250 @default.
- W3090114880 cites W2963350250 @default.
- W3090114880 cites W2963399829 @default.
- W3090114880 cites W2963740830 @default.
- W3090114880 cites W2963785020 @default.
- W3090114880 cites W2964037671 @default.
- W3090114880 cites W2964048159 @default.
- W3090114880 cites W2975357369 @default.
- W3090114880 cites W2981851019 @default.
- W3090114880 cites W2987741655 @default.
- W3090114880 cites W2990583358 @default.
- W3090114880 cites W2992308087 @default.
- W3090114880 cites W2995181141 @default.
- W3090114880 cites W2995480165 @default.
- W3090114880 cites W2995489995 @default.
- W3090114880 cites W2998388430 @default.
- W3090114880 cites W3007630669 @default.
- W3090114880 cites W3009561768 @default.
- W3090114880 cites W3010094231 @default.
- W3090114880 cites W3012410440 @default.
- W3090114880 cites W3015233197 @default.
- W3090114880 cites W3022061250 @default.
- W3090114880 cites W3029860052 @default.
- W3090114880 cites W3034345981 @default.
- W3090114880 cites W3034576826 @default.
- W3090114880 cites W3034748927 @default.
- W3090114880 cites W3034774681 @default.
- W3090114880 cites W3034781633 @default.
- W3090114880 cites W3034831986 @default.
- W3090114880 cites W3034978746 @default.
- W3090114880 cites W3035524453 @default.
- W3090114880 cites W3035635319 @default.
- W3090114880 cites W3036426227 @default.
- W3090114880 cites W3036982689 @default.
- W3090114880 cites W3037744186 @default.
- W3090114880 cites W3039465721 @default.
- W3090114880 cites W3043462782 @default.
- W3090114880 cites W3044032347 @default.
- W3090114880 cites W3045266435 @default.
- W3090114880 cites W3046882683 @default.
- W3090114880 cites W3047868264 @default.
- W3090114880 cites W3048918001 @default.
- W3090114880 cites W3080420240 @default.
- W3090114880 cites W3082701951 @default.
- W3090114880 cites W3089824566 @default.
- W3090114880 cites W3092113703 @default.
- W3090114880 cites W3092603779 @default.
- W3090114880 cites W3095121901 @default.
- W3090114880 cites W3098628719 @default.
- W3090114880 cites W3099638501 @default.
- W3090114880 cites W3100345210 @default.
- W3090114880 cites W3101280563 @default.
- W3090114880 cites W3101537861 @default.
- W3090114880 cites W3101821705 @default.
- W3090114880 cites W3101864923 @default.
- W3090114880 cites W3102229140 @default.
- W3090114880 cites W3102363610 @default.
- W3090114880 cites W3104683525 @default.
- W3090114880 cites W3105236818 @default.
- W3090114880 cites W3105422445 @default.
- W3090114880 cites W3106005682 @default.
- W3090114880 cites W3106310609 @default.
- W3090114880 cites W3106428938 @default.
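
The listing above is the result of the basic graph pattern shown in the header. A minimal sketch of how such a listing might be reproduced programmatically is given below; it assumes SemOpenAlex exposes a public SPARQL endpoint at https://semopenalex.org/sparql and that the third-party SPARQLWrapper package is available (pip install sparqlwrapper).

```python
# Minimal sketch: fetch all (?p, ?o, ?g) triples for the work above.
# Assumptions (not stated in the listing): endpoint URL and result layout.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL
WORK = "https://semopenalex.org/work/W3090114880"

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(f"""
    SELECT ?p ?o ?g
    WHERE {{
      GRAPH ?g {{ <{WORK}> ?p ?o . }}
    }}
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for row in results["results"]["bindings"]:
    # Each row corresponds to one "- W3090114880 <p> <o> @<g>." line above.
    print(row["p"]["value"], row["o"]["value"], row["g"]["value"])
```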
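
The abstract above describes mixing hard negatives at the feature level, computed on the fly with minimal overhead. The sketch below is an illustrative reconstruction of that idea based only on the abstract, not the authors' reference implementation; the function name, tensor shapes, and the choice of how many negatives to mix are assumptions.

```python
# Hedged sketch of feature-level hard negative mixing (MoCHi-style),
# assuming a MoCo-like setup with L2-normalised embeddings and a queue
# of negatives. Illustrative only; hyperparameters are placeholders.
import torch
import torch.nn.functional as F

def mix_hard_negatives(q, queue, n_hard=64, n_synth=32):
    """q: (D,) L2-normalised query embedding.
    queue: (K, D) L2-normalised negative embeddings (memory bank).
    Returns (n_synth, D) synthetic negatives built by convex mixing
    of the hardest negatives for this query."""
    # Rank negatives by similarity to the query; the most similar are "hardest".
    sims = queue @ q                       # (K,)
    hard_idx = sims.topk(n_hard).indices   # indices of the n_hard hardest negatives
    hard = queue[hard_idx]                 # (n_hard, D)

    # Randomly pair hard negatives and mix them with random convex coefficients.
    i = torch.randint(0, n_hard, (n_synth,))
    j = torch.randint(0, n_hard, (n_synth,))
    lam = torch.rand(n_synth, 1)
    synth = lam * hard[i] + (1.0 - lam) * hard[j]

    # Re-normalise so the synthetic points live on the unit hypersphere,
    # like the real embeddings.
    return F.normalize(synth, dim=1)
```

In such a setup, the returned synthetic features would simply be appended to the real negatives before computing the contrastive (InfoNCE) logits, which is consistent with the abstract's claim that the mixing adds only minimal computational overhead.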