Matches in SemOpenAlex for { <https://semopenalex.org/work/W3213043232> ?p ?o ?g. }
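The pattern above asks for every predicate, object, and named graph attached to the work URI. The Python sketch below shows one way an equivalent SPARQL query could be run to reproduce the listing; the endpoint URL (https://semopenalex.org/sparql) and the output format are assumptions for illustration, not part of the result dump.

```python
# Minimal sketch: fetch all quads whose subject is the work
# <https://semopenalex.org/work/W3213043232>, mirroring the { <work> ?p ?o ?g. } pattern.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public SPARQL endpoint
WORK = "https://semopenalex.org/work/W3213043232"

query = f"""
SELECT ?p ?o ?g
WHERE {{
  GRAPH ?g {{ <{WORK}> ?p ?o . }}
}}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": query},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()

# Print each match roughly in the "subject predicate object graph" shape used below.
for row in resp.json()["results"]["bindings"]:
    print(WORK, row["p"]["value"], row["o"]["value"], row["g"]["value"])
```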
- W3213043232 abstract "Domain adaption for word segmentation and POS tagging is a challenging problem for Chinese lexical processing. Self-training is one promising solution for it, which struggles to construct a set of high-quality pseudo training instances for the target domain. Previous work usually assumes a universal source-to-target adaption to collect such pseudo corpus, ignoring the different gaps from the target sentences to the source domain. In this work, we start from joint word segmentation and POS tagging, presenting a fine-grained domain adaption method to model the gaps accurately. We measure the gaps by one simple and intuitive metric, and adopt it to develop a pseudo target domain corpus based on fine-grained subdomains incrementally. A novel domain-mixed representation learning model is proposed accordingly to encode the multiple subdomains effectively. The whole process is performed progressively for both corpus construction and model training. Experimental results on a benchmark dataset show that our method can gain significant improvements over a variety of baselines. Extensive analyses are performed to show the advantages of our final domain adaption model as well." @default.
- W3213043232 created "2021-11-22" @default.
- W3213043232 creator A5004953265 @default.
- W3213043232 creator A5005091713 @default.
- W3213043232 creator A5005535444 @default.
- W3213043232 creator A5037746050 @default.
- W3213043232 creator A5049484480 @default.
- W3213043232 creator A5091877711 @default.
- W3213043232 date "2021-01-01" @default.
- W3213043232 modified "2023-09-25" @default.
- W3213043232 title "A Fine-Grained Domain Adaption Model for Joint Word Segmentation and POS Tagging" @default.
- W3213043232 cites W1575907248 @default.
- W3213043232 cites W1731081199 @default.
- W3213043232 cites W1940872118 @default.
- W3213043232 cites W2096204319 @default.
- W3213043232 cites W2099153428 @default.
- W3213043232 cites W2104820387 @default.
- W3213043232 cites W2110974939 @default.
- W3213043232 cites W2118045473 @default.
- W3213043232 cites W2120354757 @default.
- W3213043232 cites W2120661206 @default.
- W3213043232 cites W2127626780 @default.
- W3213043232 cites W2131953535 @default.
- W3213043232 cites W2137281231 @default.
- W3213043232 cites W2138382875 @default.
- W3213043232 cites W2143026224 @default.
- W3213043232 cites W2143612262 @default.
- W3213043232 cites W2145905222 @default.
- W3213043232 cites W2155251704 @default.
- W3213043232 cites W2159406587 @default.
- W3213043232 cites W2160097208 @default.
- W3213043232 cites W2250542158 @default.
- W3213043232 cites W2251811146 @default.
- W3213043232 cites W22861983 @default.
- W3213043232 cites W25062297 @default.
- W3213043232 cites W2593768305 @default.
- W3213043232 cites W2756655895 @default.
- W3213043232 cites W2757350179 @default.
- W3213043232 cites W2774687429 @default.
- W3213043232 cites W2786559811 @default.
- W3213043232 cites W2801805865 @default.
- W3213043232 cites W2896649846 @default.
- W3213043232 cites W2912280829 @default.
- W3213043232 cites W2962808524 @default.
- W3213043232 cites W2963223306 @default.
- W3213043232 cites W2963341956 @default.
- W3213043232 cites W2963826681 @default.
- W3213043232 cites W2963917673 @default.
- W3213043232 cites W2964278684 @default.
- W3213043232 cites W2964303773 @default.
- W3213043232 cites W2985406498 @default.
- W3213043232 cites W3001197829 @default.
- W3213043232 cites W3009033475 @default.
- W3213043232 cites W3035516800 @default.
- W3213043232 cites W3035594424 @default.
- W3213043232 cites W3088186095 @default.
- W3213043232 cites W3098065087 @default.
- W3213043232 cites W3105421296 @default.
- W3213043232 cites W3116348602 @default.
- W3213043232 cites W44506435 @default.
- W3213043232 cites W879220392 @default.
- W3213043232 doi "https://doi.org/10.18653/v1/2021.emnlp-main.291" @default.
- W3213043232 hasPublicationYear "2021" @default.
- W3213043232 type Work @default.
- W3213043232 sameAs 3213043232 @default.
- W3213043232 citedByCount "1" @default.
- W3213043232 countsByYear W32130432322023 @default.
- W3213043232 crossrefType "proceedings-article" @default.
- W3213043232 hasAuthorship W3213043232A5004953265 @default.
- W3213043232 hasAuthorship W3213043232A5005091713 @default.
- W3213043232 hasAuthorship W3213043232A5005535444 @default.
- W3213043232 hasAuthorship W3213043232A5037746050 @default.
- W3213043232 hasAuthorship W3213043232A5049484480 @default.
- W3213043232 hasAuthorship W3213043232A5091877711 @default.
- W3213043232 hasBestOaLocation W32130432321 @default.
- W3213043232 hasConcept C104317684 @default.
- W3213043232 hasConcept C111919701 @default.
- W3213043232 hasConcept C127413603 @default.
- W3213043232 hasConcept C13280743 @default.
- W3213043232 hasConcept C134306372 @default.
- W3213043232 hasConcept C138885662 @default.
- W3213043232 hasConcept C153180895 @default.
- W3213043232 hasConcept C154945302 @default.
- W3213043232 hasConcept C162324750 @default.
- W3213043232 hasConcept C170154142 @default.
- W3213043232 hasConcept C176217482 @default.
- W3213043232 hasConcept C177264268 @default.
- W3213043232 hasConcept C17744445 @default.
- W3213043232 hasConcept C18555067 @default.
- W3213043232 hasConcept C185592680 @default.
- W3213043232 hasConcept C185798385 @default.
- W3213043232 hasConcept C199360897 @default.
- W3213043232 hasConcept C199539241 @default.
- W3213043232 hasConcept C204321447 @default.
- W3213043232 hasConcept C205649164 @default.
- W3213043232 hasConcept C21547014 @default.
- W3213043232 hasConcept C2776359362 @default.
- W3213043232 hasConcept C2780801425 @default.
- W3213043232 hasConcept C28490314 @default.
- W3213043232 hasConcept C33923547 @default.