Matches in SemOpenAlex for { <https://semopenalex.org/work/W3169515113> ?p ?o ?g. }
- W3169515113 abstract "Neural Transfer Learning (TL) is becoming ubiquitous in Natural Language Processing (NLP), thanks to its high performance on many tasks, especially in low-resource scenarios. Notably, TL is widely used for neural domain adaptation to transfer valuable knowledge from high-resource to low-resource domains. In the standard fine-tuning scheme of TL, a model is initially pre-trained on a source domain and subsequently fine-tuned on a target domain; source and target domains are therefore trained using the same architecture. In this paper, we show through interpretation methods that such a scheme, despite its efficiency, suffers from a major limitation. Indeed, although capable of adapting to new domains, pre-trained neurons struggle to learn certain patterns that are specific to the target domain. Moreover, we shed light on the hidden negative transfer occurring despite the high relatedness between source and target domains, which may reduce the final gain brought by transfer learning. To address these problems, we propose to augment the pre-trained model with normalised, weighted and randomly initialised units that foster better adaptation while maintaining the valuable source knowledge. We show that our approach yields significant improvements over the standard fine-tuning scheme for neural domain adaptation from the news domain to the social media domain on four NLP tasks: part-of-speech tagging, chunking, named entity recognition and morphosyntactic tagging." @default.
- W3169515113 created "2021-06-22" @default.
- W3169515113 creator A5030866856 @default.
- W3169515113 creator A5032727833 @default.
- W3169515113 creator A5046279251 @default.
- W3169515113 creator A5052965014 @default.
- W3169515113 creator A5083589171 @default.
- W3169515113 date "2021-06-09" @default.
- W3169515113 modified "2023-10-16" @default.
- W3169515113 title "Neural Supervised Domain Adaptation by Augmenting Pre-trained Models with Random Units" @default.
- W3169515113 cites W1534477342 @default.
- W3169515113 cites W1594551768 @default.
- W3169515113 cites W1614298861 @default.
- W3169515113 cites W1632114991 @default.
- W3169515113 cites W1731081199 @default.
- W3169515113 cites W1817277359 @default.
- W3169515113 cites W1919803322 @default.
- W3169515113 cites W1951216520 @default.
- W3169515113 cites W2005708641 @default.
- W3169515113 cites W2025341678 @default.
- W3169515113 cites W2081621443 @default.
- W3169515113 cites W2098921539 @default.
- W3169515113 cites W2102605133 @default.
- W3169515113 cites W2106869737 @default.
- W3169515113 cites W2132339004 @default.
- W3169515113 cites W2138738738 @default.
- W3169515113 cites W2147880316 @default.
- W3169515113 cites W2153848201 @default.
- W3169515113 cites W2163074454 @default.
- W3169515113 cites W2165698076 @default.
- W3169515113 cites W2250195077 @default.
- W3169515113 cites W2250539671 @default.
- W3169515113 cites W2292919134 @default.
- W3169515113 cites W2493916176 @default.
- W3169515113 cites W2555428947 @default.
- W3169515113 cites W2563574619 @default.
- W3169515113 cites W2587265250 @default.
- W3169515113 cites W2605058246 @default.
- W3169515113 cites W2605409611 @default.
- W3169515113 cites W2606347107 @default.
- W3169515113 cites W2746059512 @default.
- W3169515113 cites W2756154119 @default.
- W3169515113 cites W2756655895 @default.
- W3169515113 cites W2760505947 @default.
- W3169515113 cites W2761988601 @default.
- W3169515113 cites W2763323349 @default.
- W3169515113 cites W2765462701 @default.
- W3169515113 cites W2766572840 @default.
- W3169515113 cites W2767204723 @default.
- W3169515113 cites W2798819017 @default.
- W3169515113 cites W2806956262 @default.
- W3169515113 cites W2880875857 @default.
- W3169515113 cites W2890494294 @default.
- W3169515113 cites W2891177506 @default.
- W3169515113 cites W2899771611 @default.
- W3169515113 cites W2906152891 @default.
- W3169515113 cites W2911307508 @default.
- W3169515113 cites W2911367093 @default.
- W3169515113 cites W2916881227 @default.
- W3169515113 cites W2922523190 @default.
- W3169515113 cites W2925907129 @default.
- W3169515113 cites W2936069362 @default.
- W3169515113 cites W2946542219 @default.
- W3169515113 cites W2946558277 @default.
- W3169515113 cites W2948073512 @default.
- W3169515113 cites W2949176808 @default.
- W3169515113 cites W2949917295 @default.
- W3169515113 cites W2954038941 @default.
- W3169515113 cites W2962739339 @default.
- W3169515113 cites W2962776659 @default.
- W3169515113 cites W2962902328 @default.
- W3169515113 cites W2963081790 @default.
- W3169515113 cites W2963090765 @default.
- W3169515113 cites W2963104543 @default.
- W3169515113 cites W2963118869 @default.
- W3169515113 cites W2963211188 @default.
- W3169515113 cites W2963341956 @default.
- W3169515113 cites W2963403868 @default.
- W3169515113 cites W2963488798 @default.
- W3169515113 cites W2963503967 @default.
- W3169515113 cites W2963563735 @default.
- W3169515113 cites W2963641259 @default.
- W3169515113 cites W2963729324 @default.
- W3169515113 cites W2963756346 @default.
- W3169515113 cites W2963759780 @default.
- W3169515113 cites W2963918774 @default.
- W3169515113 cites W2963996492 @default.
- W3169515113 cites W2964034111 @default.
- W3169515113 cites W2964072872 @default.
- W3169515113 cites W2964078775 @default.
- W3169515113 cites W2964090065 @default.
- W3169515113 cites W2964093505 @default.
- W3169515113 cites W2964109570 @default.
- W3169515113 cites W2964204621 @default.
- W3169515113 cites W2964303116 @default.
- W3169515113 cites W2964303773 @default.
- W3169515113 cites W2964352358 @default.
- W3169515113 cites W2965373594 @default.
- W3169515113 cites W2970352191 @default.
- W3169515113 cites W2970597249 @default.
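The abstract at the top of this listing describes augmenting a pre-trained model with normalised, weighted and randomly initialised units for target-domain adaptation. Below is a minimal, hypothetical PyTorch sketch of that general idea, not the authors' released implementation; the module name `AugmentedHead`, the LayerNorm-plus-sigmoid-gate combination, and all dimensions are assumptions made purely for illustration.

```python
# Hypothetical sketch (not the paper's code): combine pre-trained features with
# normalised, weighted, randomly initialised units for target-domain adaptation.
import torch
import torch.nn as nn


class AugmentedHead(nn.Module):
    """Augment pre-trained token features with randomly initialised target-domain units."""

    def __init__(self, pretrained_dim: int, random_units: int, num_labels: int):
        super().__init__()
        # Randomly initialised branch intended to capture target-domain-specific patterns.
        self.random_branch = nn.Linear(pretrained_dim, random_units)
        # Normalisation keeps the new units on a scale comparable to the pre-trained features.
        self.norm = nn.LayerNorm(random_units)
        # Learnable scalar weight controlling how much the random units contribute (assumed gating).
        self.gate = nn.Parameter(torch.zeros(1))
        # Token-level classifier over the concatenated representation.
        self.classifier = nn.Linear(pretrained_dim + random_units, num_labels)

    def forward(self, pretrained_features: torch.Tensor) -> torch.Tensor:
        # pretrained_features: (batch, seq_len, pretrained_dim) from the source-domain model.
        new_units = torch.sigmoid(self.gate) * self.norm(self.random_branch(pretrained_features))
        combined = torch.cat([pretrained_features, new_units], dim=-1)
        return self.classifier(combined)  # per-token logits, e.g. for POS tagging or NER


if __name__ == "__main__":
    # Dummy usage with random token representations standing in for a pre-trained encoder.
    head = AugmentedHead(pretrained_dim=768, random_units=128, num_labels=17)
    features = torch.randn(2, 10, 768)   # (batch, seq_len, hidden)
    logits = head(features)              # (2, 10, 17)
    print(logits.shape)
```

The design choice sketched here, concatenating a gated, normalised random branch with the pre-trained representation, is one plausible reading of "normalised, weighted and randomly initialised units"; the paper itself should be consulted for the exact formulation.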