Matches in SemOpenAlex for { <https://semopenalex.org/work/W3035522059> ?p ?o ?g. }
Showing items 1 to 95 of 95, with 100 items per page.
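The listing below can be reproduced programmatically against the SemOpenAlex SPARQL endpoint. A minimal sketch, assuming the public endpoint URL https://semopenalex.org/sparql and the SPARQLWrapper package; the quad pattern above is simplified to a plain triple pattern over the default graph (the ?g variable is dropped):

```python
# Sketch: fetch all properties of the work W3035522059 from SemOpenAlex.
# Assumes the public SPARQL endpoint at https://semopenalex.org/sparql
# and SPARQLWrapper (pip install sparqlwrapper).
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery("""
    SELECT ?p ?o
    WHERE {
        <https://semopenalex.org/work/W3035522059> ?p ?o .
    }
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    # Each row corresponds to one predicate/object pair in the listing below.
    print(binding["p"]["value"], binding["o"]["value"])
```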
- W3035522059 abstract "In recent years, transformer models have achieved great success in natural language processing (NLP) tasks. Most of the current state-of-the-art NLP results are achieved by using monolingual transformer models, where the model is pre-trained using a single language unlabelled text corpus. Then, the model is fine-tuned to the specific downstream task. However, the cost of pre-training a new transformer model is high for most languages. In this work, we propose a cost-effective transfer learning method to adopt a strong source language model, trained from a large monolingual corpus to a low-resource language. Thus, using XLNet language model, we demonstrate competitive performance with mBERT and a pre-trained target language model on the cross-lingual sentiment (CLS) dataset and on a new sentiment analysis dataset for low-resourced language Tigrinya. With only 10k examples of the given Tigrinya sentiment analysis dataset, English XLNet has achieved 78.88% F1-Score outperforming BERT and mBERT by 10% and 7%, respectively. More interestingly, fine-tuning (English) XLNet model on the CLS dataset has promising results compared to mBERT and even outperformed mBERT for one dataset of the Japanese language." @default.
- W3035522059 created "2020-06-19" @default.
- W3035522059 creator A5037259225 @default.
- W3035522059 creator A5072131328 @default.
- W3035522059 creator A5073024862 @default.
- W3035522059 date "2020-06-13" @default.
- W3035522059 modified "2023-09-26" @default.
- W3035522059 title "Transferring Monolingual Model to Low-Resource Language: The Case of Tigrinya" @default.
- W3035522059 cites W1521626219 @default.
- W3035522059 cites W2108646579 @default.
- W3035522059 cites W2171068337 @default.
- W3035522059 cites W2250311666 @default.
- W3035522059 cites W2473679434 @default.
- W3035522059 cites W2525778437 @default.
- W3035522059 cites W2800288211 @default.
- W3035522059 cites W2885185669 @default.
- W3035522059 cites W2936497627 @default.
- W3035522059 cites W2950577311 @default.
- W3035522059 cites W2963341956 @default.
- W3035522059 cites W2963403868 @default.
- W3035522059 cites W2963748441 @default.
- W3035522059 cites W2964583233 @default.
- W3035522059 cites W2970049541 @default.
- W3035522059 cites W2970597249 @default.
- W3035522059 cites W2971296908 @default.
- W3035522059 cites W2982180741 @default.
- W3035522059 cites W2983040767 @default.
- W3035522059 cites W2984500026 @default.
- W3035522059 cites W2985094609 @default.
- W3035522059 cites W2995230342 @default.
- W3035522059 cites W2995647371 @default.
- W3035522059 cites W2996580882 @default.
- W3035522059 cites W2947150611 @default.
- W3035522059 hasPublicationYear "2020" @default.
- W3035522059 type Work @default.
- W3035522059 sameAs 3035522059 @default.
- W3035522059 citedByCount "1" @default.
- W3035522059 countsByYear W30355220592021 @default.
- W3035522059 crossrefType "posted-content" @default.
- W3035522059 hasAuthorship W3035522059A5037259225 @default.
- W3035522059 hasAuthorship W3035522059A5072131328 @default.
- W3035522059 hasAuthorship W3035522059A5073024862 @default.
- W3035522059 hasConcept C119599485 @default.
- W3035522059 hasConcept C119767625 @default.
- W3035522059 hasConcept C127413603 @default.
- W3035522059 hasConcept C137293760 @default.
- W3035522059 hasConcept C150899416 @default.
- W3035522059 hasConcept C154945302 @default.
- W3035522059 hasConcept C165801399 @default.
- W3035522059 hasConcept C190729725 @default.
- W3035522059 hasConcept C204321447 @default.
- W3035522059 hasConcept C41008148 @default.
- W3035522059 hasConcept C66322947 @default.
- W3035522059 hasConcept C66402592 @default.
- W3035522059 hasConcept C71924100 @default.
- W3035522059 hasConceptScore W3035522059C119599485 @default.
- W3035522059 hasConceptScore W3035522059C119767625 @default.
- W3035522059 hasConceptScore W3035522059C127413603 @default.
- W3035522059 hasConceptScore W3035522059C137293760 @default.
- W3035522059 hasConceptScore W3035522059C150899416 @default.
- W3035522059 hasConceptScore W3035522059C154945302 @default.
- W3035522059 hasConceptScore W3035522059C165801399 @default.
- W3035522059 hasConceptScore W3035522059C190729725 @default.
- W3035522059 hasConceptScore W3035522059C204321447 @default.
- W3035522059 hasConceptScore W3035522059C41008148 @default.
- W3035522059 hasConceptScore W3035522059C66322947 @default.
- W3035522059 hasConceptScore W3035522059C66402592 @default.
- W3035522059 hasConceptScore W3035522059C71924100 @default.
- W3035522059 hasLocation W30355220591 @default.
- W3035522059 hasOpenAccess W3035522059 @default.
- W3035522059 hasPrimaryLocation W30355220591 @default.
- W3035522059 hasRelatedWork W2001283107 @default.
- W3035522059 hasRelatedWork W20484969 @default.
- W3035522059 hasRelatedWork W2121538771 @default.
- W3035522059 hasRelatedWork W2122455551 @default.
- W3035522059 hasRelatedWork W2250773991 @default.
- W3035522059 hasRelatedWork W2774617188 @default.
- W3035522059 hasRelatedWork W2963165536 @default.
- W3035522059 hasRelatedWork W2992787485 @default.
- W3035522059 hasRelatedWork W3007955273 @default.
- W3035522059 hasRelatedWork W3035390927 @default.
- W3035522059 hasRelatedWork W3088631993 @default.
- W3035522059 hasRelatedWork W3089959042 @default.
- W3035522059 hasRelatedWork W3111789658 @default.
- W3035522059 hasRelatedWork W3118106810 @default.
- W3035522059 hasRelatedWork W3125826128 @default.
- W3035522059 hasRelatedWork W3157444148 @default.
- W3035522059 hasRelatedWork W3169879368 @default.
- W3035522059 hasRelatedWork W3170815021 @default.
- W3035522059 hasRelatedWork W3195013837 @default.
- W3035522059 hasRelatedWork W800621058 @default.
- W3035522059 isParatext "false" @default.
- W3035522059 isRetracted "false" @default.
- W3035522059 magId "3035522059" @default.
- W3035522059 workType "article" @default.
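The listing shows only local names for predicates and objects. To work with the full IRIs, the work URI can also be dereferenced directly. A minimal sketch using rdflib, assuming the resource supports RDF content negotiation (the serialization actually returned by the server is an assumption):

```python
# Sketch: dereference the work URI and summarize its properties with rdflib
# (pip install rdflib). Assumes the SemOpenAlex resource returns RDF when
# an RDF media type is requested.
from rdflib import Graph, URIRef

WORK = URIRef("https://semopenalex.org/work/W3035522059")

g = Graph()
g.parse(str(WORK))  # fetch and parse the RDF description of the work

# Count objects per predicate, mirroring the grouped rows above
# (creator, cites, hasConcept, hasRelatedWork, ...).
for p in sorted(set(g.predicates(subject=WORK))):
    count = len(list(g.objects(subject=WORK, predicate=p)))
    print(count, p)
```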