Matches in SemOpenAlex for { <https://semopenalex.org/work/W3158375352> ?p ?o ?g. }
- W3158375352 endingPage "660" @default.
- W3158375352 startingPage "652" @default.
- W3158375352 abstract "Large-scale pre-training has recently revolutionized vision-and-language (VL) research. Models such as LXMERT and UNITER have significantly lifted the state of the art over a wide range of VL tasks. However, the large number of parameters in such models hinders their application in practice. In parallel, work on the lottery ticket hypothesis (LTH) has shown that deep neural networks contain small matching subnetworks that can achieve on par or even better performance than the dense networks when trained in isolation. In this work, we perform the first empirical study to assess whether such trainable subnetworks also exist in pre-trained VL models. We use UNITER as the main testbed (also test on LXMERT and ViLT), and consolidate 7 representative VL tasks for experiments, including visual question answering, visual commonsense reasoning, visual entailment, referring expression comprehension, image-text retrieval, GQA, and NLVR2. Through comprehensive analysis, we summarize our main findings as follows. (i) It is difficult to find subnetworks that strictly match the performance of the full model. However, we can find relaxed winning tickets at 50%-70% sparsity that maintain 99% of the full accuracy. (ii) Subnetworks found by task-specific pruning transfer reasonably well to the other tasks, while those found on the pre-training tasks at 60%/70% sparsity transfer universally, matching 98%/96% of the full accuracy on average over all the tasks. (iii) Besides UNITER, other models such as LXMERT and ViLT can also play lottery tickets. However, the highest sparsity we can achieve for ViLT is far lower than LXMERT and UNITER (30% vs. 70%). (iv) LTH also remains relevant when using other training methods (e.g., adversarial training)." @default.
- W3158375352 created "2021-05-10" @default.
- W3158375352 creator A5026746295 @default.
- W3158375352 creator A5028783832 @default.
- W3158375352 creator A5034826937 @default.
- W3158375352 creator A5037467245 @default.
- W3158375352 creator A5058379057 @default.
- W3158375352 creator A5066666034 @default.
- W3158375352 creator A5077322975 @default.
- W3158375352 date "2022-06-28" @default.
- W3158375352 modified "2023-10-17" @default.
- W3158375352 title "Playing Lottery Tickets with Vision and Language" @default.
- W3158375352 cites W1861492603 @default.
- W3158375352 cites W1933349210 @default.
- W3158375352 cites W2109586012 @default.
- W3158375352 cites W2277195237 @default.
- W3158375352 cites W2489434015 @default.
- W3158375352 cites W2560730294 @default.
- W3158375352 cites W2626778328 @default.
- W3158375352 cites W2640329709 @default.
- W3158375352 cites W2745461083 @default.
- W3158375352 cites W2886641317 @default.
- W3158375352 cites W2896409484 @default.
- W3158375352 cites W2899335602 @default.
- W3158375352 cites W2912371042 @default.
- W3158375352 cites W2915589364 @default.
- W3158375352 cites W2949178656 @default.
- W3158375352 cites W2949474740 @default.
- W3158375352 cites W2953488952 @default.
- W3158375352 cites W2962964995 @default.
- W3158375352 cites W2963115613 @default.
- W3158375352 cites W2963310665 @default.
- W3158375352 cites W2963341956 @default.
- W3158375352 cites W2963518342 @default.
- W3158375352 cites W2963813662 @default.
- W3158375352 cites W2964299589 @default.
- W3158375352 cites W2968124245 @default.
- W3158375352 cites W2968880719 @default.
- W3158375352 cites W2970231061 @default.
- W3158375352 cites W2970608575 @default.
- W3158375352 cites W2975357369 @default.
- W3158375352 cites W2981851019 @default.
- W3158375352 cites W2993313557 @default.
- W3158375352 cites W2994914025 @default.
- W3158375352 cites W2995197005 @default.
- W3158375352 cites W2995460200 @default.
- W3158375352 cites W2995492258 @default.
- W3158375352 cites W2995816250 @default.
- W3158375352 cites W2996309822 @default.
- W3158375352 cites W2997591391 @default.
- W3158375352 cites W3006647218 @default.
- W3158375352 cites W3008374555 @default.
- W3158375352 cites W3008604580 @default.
- W3158375352 cites W3014611590 @default.
- W3158375352 cites W3016923549 @default.
- W3158375352 cites W3022969335 @default.
- W3158375352 cites W3023074479 @default.
- W3158375352 cites W3023306062 @default.
- W3158375352 cites W3023633125 @default.
- W3158375352 cites W3034665240 @default.
- W3158375352 cites W3034733718 @default.
- W3158375352 cites W3034837210 @default.
- W3158375352 cites W3035081900 @default.
- W3158375352 cites W3035204084 @default.
- W3158375352 cites W3035265375 @default.
- W3158375352 cites W3035615218 @default.
- W3158375352 cites W3035688398 @default.
- W3158375352 cites W3036267641 @default.
- W3158375352 cites W3038476992 @default.
- W3158375352 cites W3044511083 @default.
- W3158375352 cites W3090449556 @default.
- W3158375352 cites W3091177855 @default.
- W3158375352 cites W3092574632 @default.
- W3158375352 cites W3104263050 @default.
- W3158375352 cites W3105249545 @default.
- W3158375352 cites W3106784008 @default.
- W3158375352 cites W3108144224 @default.
- W3158375352 cites W3110662498 @default.
- W3158375352 cites W3111265704 @default.
- W3158375352 cites W3111921445 @default.
- W3158375352 cites W3112156821 @default.
- W3158375352 cites W3113067643 @default.
- W3158375352 cites W3116651605 @default.
- W3158375352 cites W3118492869 @default.
- W3158375352 cites W3119795462 @default.
- W3158375352 cites W3120237956 @default.
- W3158375352 cites W3120519792 @default.
- W3158375352 cites W3126337491 @default.
- W3158375352 cites W3126464137 @default.
- W3158375352 cites W3126792443 @default.
- W3158375352 cites W3126996274 @default.
- W3158375352 cites W3127384563 @default.
- W3158375352 cites W3129197243 @default.
- W3158375352 cites W3129576130 @default.
- W3158375352 cites W3134249459 @default.
- W3158375352 cites W3135285736 @default.
- W3158375352 cites W3135367836 @default.
- W3158375352 cites W3155860693 @default.
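
The abstract in this record centers on the lottery ticket hypothesis, i.e., finding sparse subnetworks via iterative magnitude pruning with weight rewinding. Below is a minimal, hypothetical sketch of that generic procedure, assuming a toy PyTorch MLP, random data, and an arbitrary 20%-per-round pruning schedule; it is not the paper's UNITER/LXMERT/ViLT setup or code, only an illustration of the technique the abstract refers to.

```python
# Hypothetical sketch of iterative magnitude pruning with weight rewinding
# (the generic lottery-ticket procedure). Model, data, and schedule are
# placeholders, not the setup used in the paper described above.
import copy
import torch
import torch.nn as nn

def magnitude_masks(model, masks, prune_fraction):
    """Prune the smallest-magnitude surviving weights in each weight matrix."""
    new_masks = {}
    for name, param in model.named_parameters():
        if not name.endswith("weight"):
            continue
        mask = masks.get(name, torch.ones_like(param))
        surviving = param[mask.bool()].abs()
        k = int(prune_fraction * surviving.numel())
        threshold = surviving.sort().values[k] if k > 0 else 0.0
        new_masks[name] = mask * (param.abs() > threshold).float()
    return new_masks

def apply_masks(model, masks):
    """Zero out pruned weights in place."""
    with torch.no_grad():
        for name, param in model.named_parameters():
            if name in masks:
                param.mul_(masks[name])

def train(model, masks, steps=100):
    """Placeholder training loop on random data; masks are re-applied after each step."""
    opt = torch.optim.SGD(model.parameters(), lr=0.1)
    for _ in range(steps):
        x, y = torch.randn(32, 16), torch.randint(0, 2, (32,))
        loss = nn.functional.cross_entropy(model(x), y)
        opt.zero_grad()
        loss.backward()
        opt.step()
        apply_masks(model, masks)  # keep pruned weights at zero

model = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 2))
init_state = copy.deepcopy(model.state_dict())  # weights to rewind to
masks = {}

# Each round: train, prune 20% of the surviving weights, rewind, re-mask.
for round_idx in range(5):
    train(model, masks)
    masks = magnitude_masks(model, masks, prune_fraction=0.2)
    model.load_state_dict(init_state)  # rewind to the initial weights
    apply_masks(model, masks)          # the masked, rewound network is the "ticket"

kept = float(sum(m.sum() for m in masks.values()))
total = sum(m.numel() for m in masks.values())
print(f"final sparsity of pruned layers: {1 - kept / total:.2f}")
```

The sparsity levels quoted in the abstract (e.g., 50%-70%) refer to the fraction of weights removed by such masks; in the paper's setting the pruning and rewinding would be applied to a pre-trained vision-and-language model under task-specific or pre-training objectives rather than to a toy classifier as above.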