Matches in SemOpenAlex for { <https://semopenalex.org/work/W4384161799> ?p ?o ?g. }
Showing items 1 to 96 of 96, with 100 items per page.
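For reference, a listing like the one below can be reproduced programmatically. The sketch assumes SemOpenAlex exposes a public SPARQL endpoint at https://semopenalex.org/sparql (an assumption; verify the endpoint URL and result format before relying on it) and simplifies the quad pattern from the header to a plain predicate–object query:

```python
# Hypothetical retrieval of the triples for W4384161799 from a SemOpenAlex
# SPARQL endpoint. Endpoint URL and availability are assumptions, not taken
# from this listing.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL
QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W4384161799> ?p ?o .
}
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

# Print each predicate/object pair, mirroring the bullet list below.
for binding in response.json()["results"]["bindings"]:
    print(binding["p"]["value"], binding["o"]["value"])
```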
- W4384161799 endingPage "72383" @default.
- W4384161799 startingPage "72374" @default.
- W4384161799 abstract "AI has introduced a new reform direction for traditional education, such as automating Grammatical Error Correction (GEC) to reduce teachers’ workload and improve efficiency. However, current GEC models still have flaws because human language is highly variable, and the available labeled datasets are often too small to learn everything automatically. One of the key principles of GEC is to preserve correct parts of the input text while correcting grammatical errors. However, previous sequence-to-sequence (Seq2Seq) models may be prone to over-correction as they generate corrections from scratch. Over-correction is a phenomenon where a grammatically correct sentence is incorrectly flagged as containing errors that require correction, leading to incorrect corrections that can change the meaning or structure of the original sentence. This can significantly reduce the accuracy and usefulness of GEC systems, highlighting the need for improved approaches that can reduce over-correction and ensure more accurate and natural corrections. Recently, sequence tagging-based models have been used to mitigate this issue by only predicting edit operations that convert the source sentence to a corrected one. Despite their good performance on datasets with minimal edits, they struggle to restore texts with drastic changes. This issue artificially restricts the types of changes that can be made to a sentence and does not reflect those required for native speakers to find sentences fluent or natural sounding. Moreover, sequence tagging-based models are usually conditioned on human-designed language-specific tagging labels, hindering generalization and failing to reflect the real error distribution generated by diverse learners from different nationalities. In this work, we introduce a novel Seq2Seq-based approach that can handle a wide variety of grammatical errors on a low-fluency dataset. Our approach enhances the Seq2Seq architecture with a novel copy mechanism based on supervised attention. Instead of merely predicting the next token in context, the model predicts additional correctness-related information for each token. This auxiliary objective propagates into the weights of the model during training without requiring extra labels at testing time. Experimental results on benchmark datasets show that our model achieves competitive performance compared to state-of-the-art (SOTA) models." @default.
- W4384161799 created "2023-07-14" @default.
- W4384161799 creator A5000738330 @default.
- W4384161799 creator A5029344704 @default.
- W4384161799 date "2023-01-01" @default.
- W4384161799 modified "2023-09-26" @default.
- W4384161799 title "Supervised Copy Mechanism for Grammatical Error Correction" @default.
- W4384161799 cites W2098297786 @default.
- W4384161799 cites W2589277916 @default.
- W4384161799 cites W2606974598 @default.
- W4384161799 cites W2800965800 @default.
- W4384161799 cites W2902293828 @default.
- W4384161799 cites W2911857455 @default.
- W4384161799 cites W2936597270 @default.
- W4384161799 cites W2949161734 @default.
- W4384161799 cites W2962784628 @default.
- W4384161799 cites W2962870549 @default.
- W4384161799 cites W2963260202 @default.
- W4384161799 cites W2963705779 @default.
- W4384161799 cites W2963881719 @default.
- W4384161799 cites W2964165364 @default.
- W4384161799 cites W2970429618 @default.
- W4384161799 cites W2970868759 @default.
- W4384161799 cites W2985694911 @default.
- W4384161799 cites W3012064928 @default.
- W4384161799 cites W3012727582 @default.
- W4384161799 cites W3028827502 @default.
- W4384161799 cites W3034682120 @default.
- W4384161799 cites W3035010485 @default.
- W4384161799 cites W3035368332 @default.
- W4384161799 cites W3037162118 @default.
- W4384161799 cites W3096648221 @default.
- W4384161799 cites W3104564752 @default.
- W4384161799 cites W3104814493 @default.
- W4384161799 cites W3105306115 @default.
- W4384161799 cites W3141797743 @default.
- W4384161799 cites W3174851730 @default.
- W4384161799 cites W3175441946 @default.
- W4384161799 cites W4224916519 @default.
- W4384161799 cites W4281483006 @default.
- W4384161799 cites W4361006502 @default.
- W4384161799 doi "https://doi.org/10.1109/access.2023.3294979" @default.
- W4384161799 hasPublicationYear "2023" @default.
- W4384161799 type Work @default.
- W4384161799 citedByCount "0" @default.
- W4384161799 crossrefType "journal-article" @default.
- W4384161799 hasAuthorship W4384161799A5000738330 @default.
- W4384161799 hasAuthorship W4384161799A5029344704 @default.
- W4384161799 hasBestOaLocation W43841617991 @default.
- W4384161799 hasConcept C103088060 @default.
- W4384161799 hasConcept C11413529 @default.
- W4384161799 hasConcept C134306372 @default.
- W4384161799 hasConcept C154945302 @default.
- W4384161799 hasConcept C177148314 @default.
- W4384161799 hasConcept C195324797 @default.
- W4384161799 hasConcept C204321447 @default.
- W4384161799 hasConcept C2777530160 @default.
- W4384161799 hasConcept C2778112365 @default.
- W4384161799 hasConcept C28490314 @default.
- W4384161799 hasConcept C33923547 @default.
- W4384161799 hasConcept C41008148 @default.
- W4384161799 hasConcept C54355233 @default.
- W4384161799 hasConcept C86803240 @default.
- W4384161799 hasConceptScore W4384161799C103088060 @default.
- W4384161799 hasConceptScore W4384161799C11413529 @default.
- W4384161799 hasConceptScore W4384161799C134306372 @default.
- W4384161799 hasConceptScore W4384161799C154945302 @default.
- W4384161799 hasConceptScore W4384161799C177148314 @default.
- W4384161799 hasConceptScore W4384161799C195324797 @default.
- W4384161799 hasConceptScore W4384161799C204321447 @default.
- W4384161799 hasConceptScore W4384161799C2777530160 @default.
- W4384161799 hasConceptScore W4384161799C2778112365 @default.
- W4384161799 hasConceptScore W4384161799C28490314 @default.
- W4384161799 hasConceptScore W4384161799C33923547 @default.
- W4384161799 hasConceptScore W4384161799C41008148 @default.
- W4384161799 hasConceptScore W4384161799C54355233 @default.
- W4384161799 hasConceptScore W4384161799C86803240 @default.
- W4384161799 hasLocation W43841617991 @default.
- W4384161799 hasOpenAccess W4384161799 @default.
- W4384161799 hasPrimaryLocation W43841617991 @default.
- W4384161799 hasRelatedWork W159132833 @default.
- W4384161799 hasRelatedWork W2033261979 @default.
- W4384161799 hasRelatedWork W2293457016 @default.
- W4384161799 hasRelatedWork W2411652523 @default.
- W4384161799 hasRelatedWork W2502722637 @default.
- W4384161799 hasRelatedWork W2567044968 @default.
- W4384161799 hasRelatedWork W2977842567 @default.
- W4384161799 hasRelatedWork W4297803820 @default.
- W4384161799 hasRelatedWork W4322096459 @default.
- W4384161799 hasRelatedWork W1872130062 @default.
- W4384161799 hasVolume "11" @default.
- W4384161799 isParatext "false" @default.
- W4384161799 isRetracted "false" @default.
- W4384161799 workType "article" @default.
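The abstract listed above (property `abstract`) outlines the paper's approach: a Seq2Seq model whose copy mechanism is driven by supervised attention, with an auxiliary per-token correctness prediction used only during training. The paper's actual formulation is not included in this listing, so the following is only a minimal illustrative sketch; all module names, shapes, and the pointer-generator-style mixing of copy and generation distributions are assumptions, not the authors' implementation:

```python
# Minimal sketch of one decoding step with a copy mechanism plus an auxiliary
# token-correctness head, loosely following the ideas in the abstract above.
# Architecture details and losses are illustrative assumptions only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class CopyDecoderStep(nn.Module):
    def __init__(self, hidden_size: int, vocab_size: int):
        super().__init__()
        self.attn = nn.Linear(hidden_size * 2, 1)               # simplified additive-style attention scorer
        self.gen_head = nn.Linear(hidden_size * 2, vocab_size)  # generation distribution over the vocabulary
        self.copy_gate = nn.Linear(hidden_size * 2, 1)          # mixing weight between copying and generating
        self.correct_head = nn.Linear(hidden_size, 2)           # auxiliary: is each source token correct?

    def forward(self, dec_state, enc_states, src_token_ids):
        # dec_state: (batch, hidden); enc_states: (batch, src_len, hidden)
        # src_token_ids: (batch, src_len) int64 vocabulary ids of the source tokens
        batch, src_len, hidden = enc_states.shape
        expanded = dec_state.unsqueeze(1).expand(-1, src_len, -1)
        scores = self.attn(torch.cat([enc_states, expanded], dim=-1)).squeeze(-1)
        attn_dist = F.softmax(scores, dim=-1)                    # copy distribution over source positions

        context = torch.bmm(attn_dist.unsqueeze(1), enc_states).squeeze(1)
        features = torch.cat([dec_state, context], dim=-1)

        gen_dist = F.softmax(self.gen_head(features), dim=-1)
        p_copy = torch.sigmoid(self.copy_gate(features))         # probability of copying rather than generating

        # Scatter the copy distribution into vocabulary space and mix it with
        # the generation distribution (pointer-generator-style combination).
        copy_dist = torch.zeros_like(gen_dist).scatter_add(1, src_token_ids, attn_dist)
        token_dist = p_copy * copy_dist + (1.0 - p_copy) * gen_dist

        # Auxiliary prediction of per-token correctness; it is supervised only
        # during training, so no extra labels are required at test time.
        correctness_logits = self.correct_head(enc_states)       # (batch, src_len, 2)
        return token_dist, attn_dist, correctness_logits
```

In such a setup, training would typically combine the usual generation loss on `token_dist` with a cross-entropy term on `correctness_logits` against token-level correctness labels (e.g. total loss = NLL + λ · auxiliary loss), which matches the abstract's statement that the auxiliary objective influences the model's weights during training without requiring extra labels at inference.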