Matches in SemOpenAlex for { <https://semopenalex.org/work/W3155609600> ?p ?o ?g. }
- W3155609600 abstract "Word alignment over parallel corpora has a wide variety of applications, including learning translation lexicons, cross-lingual transfer of language processing tools, and automatic evaluation or analysis of translation outputs. The great majority of past work on word alignment has worked by performing unsupervised learning on parallel text. Recently, however, other work has demonstrated that pre-trained contextualized word embeddings derived from multilingually trained language models (LMs) prove an attractive alternative, achieving competitive results on the word alignment task even in the absence of explicit training on parallel data. In this paper, we examine methods to marry the two approaches: leveraging pre-trained LMs but fine-tuning them on parallel text with objectives designed to improve alignment quality, and proposing methods to effectively extract alignments from these fine-tuned models. We perform experiments on five language pairs and demonstrate that our model can consistently outperform previous state-of-the-art models of all varieties. In addition, we demonstrate that we are able to train multilingual word aligners that can obtain robust performance on different language pairs." @default.
- W3155609600 created "2021-04-26" @default.
- W3155609600 creator A5038315768 @default.
- W3155609600 creator A5068811427 @default.
- W3155609600 date "2021-01-01" @default.
- W3155609600 modified "2023-09-23" @default.
- W3155609600 title "Word Alignment by Fine-tuning Embeddings on Parallel Corpora" @default.
- W3155609600 cites W1411230545 @default.
- W3155609600 cites W1595430628 @default.
- W3155609600 cites W1902237438 @default.
- W3155609600 cites W1973923101 @default.
- W3155609600 cites W1989348531 @default.
- W3155609600 cites W2003447360 @default.
- W3155609600 cites W2006969979 @default.
- W3155609600 cites W2016630033 @default.
- W3155609600 cites W2038698865 @default.
- W3155609600 cites W2057069782 @default.
- W3155609600 cites W2080373976 @default.
- W3155609600 cites W2086202918 @default.
- W3155609600 cites W2121495183 @default.
- W3155609600 cites W2127863960 @default.
- W3155609600 cites W2141532438 @default.
- W3155609600 cites W2144578941 @default.
- W3155609600 cites W2148708890 @default.
- W3155609600 cites W2156985047 @default.
- W3155609600 cites W2158131535 @default.
- W3155609600 cites W2167207791 @default.
- W3155609600 cites W2169724380 @default.
- W3155609600 cites W2185606683 @default.
- W3155609600 cites W22168010 @default.
- W3155609600 cites W2250545560 @default.
- W3155609600 cites W2251149243 @default.
- W3155609600 cites W2251199281 @default.
- W3155609600 cites W2270364989 @default.
- W3155609600 cites W2401082558 @default.
- W3155609600 cites W249200060 @default.
- W3155609600 cites W2538358357 @default.
- W3155609600 cites W2574252446 @default.
- W3155609600 cites W2757931423 @default.
- W3155609600 cites W2760424551 @default.
- W3155609600 cites W2803172920 @default.
- W3155609600 cites W2886776719 @default.
- W3155609600 cites W2899879954 @default.
- W3155609600 cites W2908510526 @default.
- W3155609600 cites W2912070261 @default.
- W3155609600 cites W2915429162 @default.
- W3155609600 cites W2922349260 @default.
- W3155609600 cites W2949840430 @default.
- W3155609600 cites W2950858167 @default.
- W3155609600 cites W2950940239 @default.
- W3155609600 cites W2952328691 @default.
- W3155609600 cites W2952682849 @default.
- W3155609600 cites W2962687637 @default.
- W3155609600 cites W2962708992 @default.
- W3155609600 cites W2962739339 @default.
- W3155609600 cites W2962784628 @default.
- W3155609600 cites W2962834107 @default.
- W3155609600 cites W2962844668 @default.
- W3155609600 cites W2963250244 @default.
- W3155609600 cites W2963333747 @default.
- W3155609600 cites W2963341956 @default.
- W3155609600 cites W2963403868 @default.
- W3155609600 cites W2963499882 @default.
- W3155609600 cites W2963503967 @default.
- W3155609600 cites W2964161178 @default.
- W3155609600 cites W2964308564 @default.
- W3155609600 cites W2965373594 @default.
- W3155609600 cites W2970045405 @default.
- W3155609600 cites W2970757000 @default.
- W3155609600 cites W2987270981 @default.
- W3155609600 cites W2995118574 @default.
- W3155609600 cites W3030163527 @default.
- W3155609600 cites W3034238904 @default.
- W3155609600 cites W3035072529 @default.
- W3155609600 cites W3035087583 @default.
- W3155609600 cites W3035376412 @default.
- W3155609600 cites W3035390927 @default.
- W3155609600 cites W3035463087 @default.
- W3155609600 cites W3105813095 @default.
- W3155609600 cites W3169425228 @default.
- W3155609600 cites W565549431 @default.
- W3155609600 cites W658020064 @default.
- W3155609600 cites W1997157296 @default.
- W3155609600 doi "https://doi.org/10.18653/v1/2021.eacl-main.181" @default.
- W3155609600 hasPublicationYear "2021" @default.
- W3155609600 type Work @default.
- W3155609600 sameAs 3155609600 @default.
- W3155609600 citedByCount "41" @default.
- W3155609600 countsByYear W31556096002020 @default.
- W3155609600 countsByYear W31556096002021 @default.
- W3155609600 countsByYear W31556096002022 @default.
- W3155609600 countsByYear W31556096002023 @default.
- W3155609600 crossrefType "proceedings-article" @default.
- W3155609600 hasAuthorship W3155609600A5038315768 @default.
- W3155609600 hasAuthorship W3155609600A5068811427 @default.
- W3155609600 hasBestOaLocation W31556096001 @default.
- W3155609600 hasConcept C104317684 @default.
- W3155609600 hasConcept C105580179 @default.
- W3155609600 hasConcept C136197465 @default.
- W3155609600 hasConcept C137293760 @default.
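A listing like the one above can be reproduced programmatically. The sketch below builds the quad pattern from the header as a standard SPARQL `SELECT` (using `GRAPH ?g { … }` for the fourth position) and fetches the rows over HTTP. The endpoint URL `https://semopenalex.org/sparql` and the JSON results format are assumptions here; check the SemOpenAlex service documentation before relying on them.

```python
# Minimal sketch: retrieve all (predicate, object, graph) matches for a
# SemOpenAlex work, mirroring the { <work> ?p ?o ?g . } pattern above.
import json
import urllib.parse
import urllib.request

# Assumed public SPARQL endpoint; verify against the SemOpenAlex docs.
SEMOPENALEX_SPARQL = "https://semopenalex.org/sparql"


def build_query(work_id: str) -> str:
    """Return a SPARQL query matching every (predicate, object, graph)
    triple for the given OpenAlex work ID."""
    uri = f"https://semopenalex.org/work/{work_id}"
    return f"SELECT ?p ?o ?g WHERE {{ GRAPH ?g {{ <{uri}> ?p ?o . }} }}"


def fetch_triples(work_id: str):
    """Execute the query via HTTP GET and yield (p, o, g) value tuples.
    Requires network access to the (assumed) endpoint."""
    params = urllib.parse.urlencode({
        "query": build_query(work_id),
        "format": "application/sparql-results+json",  # assumed format flag
    })
    req = urllib.request.Request(f"{SEMOPENALEX_SPARQL}?{params}")
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # Standard SPARQL JSON results layout: results -> bindings -> var -> value.
    for row in data["results"]["bindings"]:
        yield (row["p"]["value"],
               row["o"]["value"],
               row.get("g", {}).get("value"))


if __name__ == "__main__":
    print(build_query("W3155609600"))
```

Each yielded row corresponds to one line of the listing, e.g. a `cites` predicate paired with a cited work URI, with the graph value playing the role of the `@default` marker.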