Matches in SemOpenAlex for { <https://semopenalex.org/work/W2890220768> ?p ?o ?g. }
Showing items 1 to 88 of 88, with 100 items per page.
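The pattern above can be reproduced programmatically. A minimal sketch of how one might build the equivalent SPARQL query for a given work ID; the quad pattern `?p ?o ?g` is expressed here with an explicit `GRAPH` clause, and the helper name `describe_query` is hypothetical (SemOpenAlex does publish a public SPARQL endpoint, but the exact query shape below is an assumption, not the one the browser page uses):

```python
def describe_query(work_id: str) -> str:
    """Build a SPARQL query listing all (predicate, object, graph)
    triples for a SemOpenAlex work, mirroring the { <work> ?p ?o ?g. }
    pattern shown in the page header."""
    uri = f"https://semopenalex.org/work/{work_id}"
    # GRAPH ?g binds the named graph each triple comes from,
    # which corresponds to the ?g column of the quad pattern.
    return f"SELECT ?p ?o ?g WHERE {{ GRAPH ?g {{ <{uri}> ?p ?o }} }}"

print(describe_query("W2890220768"))
```

Sending this to a SPARQL endpoint would return the 88 property/value rows listed below.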
- W2890220768 abstract "Neural machine translation (NMT) models are usually trained with the word-level loss using the teacher forcing algorithm, which not only evaluates the translation improperly but also suffers from exposure bias. Sequence-level training under the reinforcement framework can mitigate the problems of the word-level loss, but its performance is unstable due to the high variance of the gradient estimation. On these grounds, we present a method with a differentiable sequence-level training objective based on probabilistic n-gram matching which can avoid the reinforcement framework. In addition, this method performs greedy search in the training which uses the predicted words as context just as at inference to alleviate the problem of exposure bias. Experiment results on the NIST Chinese-to-English translation tasks show that our method significantly outperforms the reinforcement-based algorithms and achieves an improvement of 1.5 BLEU points on average over a strong baseline system." @default.
- W2890220768 created "2018-09-27" @default.
- W2890220768 creator A5000232528 @default.
- W2890220768 creator A5056500539 @default.
- W2890220768 creator A5083420537 @default.
- W2890220768 date "2018-01-01" @default.
- W2890220768 modified "2023-10-11" @default.
- W2890220768 title "Greedy Search with Probabilistic N-gram Matching for Neural Machine Translation" @default.
- W2890220768 cites W1753482797 @default.
- W2890220768 cites W2016589492 @default.
- W2890220768 cites W2078861931 @default.
- W2890220768 cites W2101105183 @default.
- W2890220768 cites W2130942839 @default.
- W2890220768 cites W2149327368 @default.
- W2890220768 cites W2155027007 @default.
- W2890220768 cites W2157331557 @default.
- W2890220768 cites W2268617045 @default.
- W2890220768 cites W2487501366 @default.
- W2890220768 cites W2542835211 @default.
- W2890220768 cites W2546938941 @default.
- W2890220768 cites W2601324753 @default.
- W2890220768 cites W2896060389 @default.
- W2890220768 cites W2963141266 @default.
- W2890220768 cites W2963163972 @default.
- W2890220768 cites W2963206679 @default.
- W2890220768 cites W2963246629 @default.
- W2890220768 cites W2963403868 @default.
- W2890220768 cites W2963463964 @default.
- W2890220768 cites W2964265128 @default.
- W2890220768 cites W2964308564 @default.
- W2890220768 cites W648786980 @default.
- W2890220768 cites W6908809 @default.
- W2890220768 doi "https://doi.org/10.18653/v1/d18-1510" @default.
- W2890220768 hasPublicationYear "2018" @default.
- W2890220768 type Work @default.
- W2890220768 sameAs 2890220768 @default.
- W2890220768 citedByCount "23" @default.
- W2890220768 countsByYear W28902207682019 @default.
- W2890220768 countsByYear W28902207682020 @default.
- W2890220768 countsByYear W28902207682021 @default.
- W2890220768 countsByYear W28902207682023 @default.
- W2890220768 crossrefType "proceedings-article" @default.
- W2890220768 hasAuthorship W2890220768A5000232528 @default.
- W2890220768 hasAuthorship W2890220768A5056500539 @default.
- W2890220768 hasAuthorship W2890220768A5083420537 @default.
- W2890220768 hasBestOaLocation W28902207681 @default.
- W2890220768 hasConcept C105795698 @default.
- W2890220768 hasConcept C11413529 @default.
- W2890220768 hasConcept C117884012 @default.
- W2890220768 hasConcept C137293760 @default.
- W2890220768 hasConcept C154945302 @default.
- W2890220768 hasConcept C165064840 @default.
- W2890220768 hasConcept C203005215 @default.
- W2890220768 hasConcept C33923547 @default.
- W2890220768 hasConcept C41008148 @default.
- W2890220768 hasConcept C49937458 @default.
- W2890220768 hasConcept C51823790 @default.
- W2890220768 hasConcept C68859911 @default.
- W2890220768 hasConceptScore W2890220768C105795698 @default.
- W2890220768 hasConceptScore W2890220768C11413529 @default.
- W2890220768 hasConceptScore W2890220768C117884012 @default.
- W2890220768 hasConceptScore W2890220768C137293760 @default.
- W2890220768 hasConceptScore W2890220768C154945302 @default.
- W2890220768 hasConceptScore W2890220768C165064840 @default.
- W2890220768 hasConceptScore W2890220768C203005215 @default.
- W2890220768 hasConceptScore W2890220768C33923547 @default.
- W2890220768 hasConceptScore W2890220768C41008148 @default.
- W2890220768 hasConceptScore W2890220768C49937458 @default.
- W2890220768 hasConceptScore W2890220768C51823790 @default.
- W2890220768 hasConceptScore W2890220768C68859911 @default.
- W2890220768 hasLocation W28902207681 @default.
- W2890220768 hasLocation W28902207682 @default.
- W2890220768 hasOpenAccess W2890220768 @default.
- W2890220768 hasPrimaryLocation W28902207681 @default.
- W2890220768 hasRelatedWork W167758659 @default.
- W2890220768 hasRelatedWork W1975136746 @default.
- W2890220768 hasRelatedWork W2028872578 @default.
- W2890220768 hasRelatedWork W2251166476 @default.
- W2890220768 hasRelatedWork W2277058918 @default.
- W2890220768 hasRelatedWork W2348466612 @default.
- W2890220768 hasRelatedWork W2375924459 @default.
- W2890220768 hasRelatedWork W2534637008 @default.
- W2890220768 hasRelatedWork W3092662895 @default.
- W2890220768 hasRelatedWork W69519455 @default.
- W2890220768 isParatext "false" @default.
- W2890220768 isRetracted "false" @default.
- W2890220768 magId "2890220768" @default.
- W2890220768 workType "article" @default.
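The abstract stored above describes a differentiable sequence-level objective built on probabilistic n-gram matching. A rough illustrative sketch of the core idea, not the authors' implementation: each n-gram occurrence in the greedily decoded hypothesis is counted not as 1 but as the product of the model probabilities of its tokens, and matches against the reference are clipped as in BLEU's modified precision. The function name and exact scoring details are assumptions for illustration:

```python
from collections import Counter


def ngrams(seq, n):
    """All contiguous n-grams of a token sequence."""
    return [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]


def probabilistic_ngram_match(hyp, probs, ref, n=2):
    """Expected clipped n-gram precision of a greedily decoded hypothesis.

    hyp   : token ids chosen by greedy search
    probs : model probability assigned to each chosen token
    ref   : reference token ids

    Each hypothesis n-gram contributes a *probabilistic* count equal to
    the product of its token probabilities, so the score is smooth in
    the model outputs; matches are clipped by the reference n-gram
    count, mirroring BLEU's modified precision.
    """
    ref_counts = Counter(ngrams(ref, n))
    prob_counts = {}
    for i in range(len(hyp) - n + 1):
        gram = tuple(hyp[i:i + n])
        p = 1.0
        for q in probs[i:i + n]:
            p *= q
        prob_counts[gram] = prob_counts.get(gram, 0.0) + p
    matched = sum(min(c, ref_counts[g])
                  for g, c in prob_counts.items() if g in ref_counts)
    total = sum(prob_counts.values())
    return matched / total if total else 0.0
```

In a real NMT training loop the probabilities would be differentiable tensor entries, so one could maximize this score (or an n-gram-F variant of it) directly by gradient descent instead of resorting to a high-variance reinforcement-learning estimator.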