Matches in SemOpenAlex for { <https://semopenalex.org/work/W4377700710> ?p ?o ?g. }
Showing items 1 to 77 of 77, with 100 items per page.
- W4377700710 abstract "The term Code Mixed refers to the use of more than one language in the same text. This phenomenon is predominantly observed on social media platforms, with an increasing amount of adaptation as time goes on. It is critical to detect foreign elements in a language and process them correctly, as a considerable number of individuals are using code-mixed languages that could not be comprehended by understanding one of those languages. In this work, we focus on low-resource Hindi-English code-mixed language and enhancing the performance of different code-mixed natural language processing tasks such as sentiment analysis, emotion recognition, and hate speech identification. We perform a comparative analysis of different Transformer-based language Models pre-trained using unsupervised approaches. We have included the code-mixed models like HingBERT, HingRoBERTa, HingRoBERTa-Mixed, mBERT, and non-code-mixed models like AlBERT, BERT, and RoBERTa for comparative analysis of code-mixed Hindi-English downstream tasks. We report state-of-the-art results on respective datasets using HingBERT-based models which are specifically pre-trained on real code-mixed text. Our HingBERT-based models provide significant improvements thus highlighting the poor performance of vanilla BERT models on code-mixed text." @default.
- W4377700710 created "2023-05-24" @default.
- W4377700710 creator A5007114970 @default.
- W4377700710 creator A5009725385 @default.
- W4377700710 creator A5025466638 @default.
- W4377700710 creator A5052585024 @default.
- W4377700710 creator A5092001025 @default.
- W4377700710 date "2023-04-07" @default.
- W4377700710 modified "2023-09-28" @default.
- W4377700710 title "Comparative Study of Pre-Trained BERT Models for Code-Mixed Hindi-English Data" @default.
- W4377700710 cites W2805807672 @default.
- W4377700710 cites W2949454212 @default.
- W4377700710 cites W3019581400 @default.
- W4377700710 cites W3032985518 @default.
- W4377700710 cites W3108387573 @default.
- W4377700710 doi "https://doi.org/10.1109/i2ct57861.2023.10126273" @default.
- W4377700710 hasPublicationYear "2023" @default.
- W4377700710 type Work @default.
- W4377700710 citedByCount "0" @default.
- W4377700710 crossrefType "proceedings-article" @default.
- W4377700710 hasAuthorship W4377700710A5007114970 @default.
- W4377700710 hasAuthorship W4377700710A5009725385 @default.
- W4377700710 hasAuthorship W4377700710A5025466638 @default.
- W4377700710 hasAuthorship W4377700710A5052585024 @default.
- W4377700710 hasAuthorship W4377700710A5092001025 @default.
- W4377700710 hasBestOaLocation W43777007102 @default.
- W4377700710 hasConcept C121332964 @default.
- W4377700710 hasConcept C129792486 @default.
- W4377700710 hasConcept C137293760 @default.
- W4377700710 hasConcept C138885662 @default.
- W4377700710 hasConcept C154945302 @default.
- W4377700710 hasConcept C165801399 @default.
- W4377700710 hasConcept C177264268 @default.
- W4377700710 hasConcept C18552078 @default.
- W4377700710 hasConcept C195324797 @default.
- W4377700710 hasConcept C199360897 @default.
- W4377700710 hasConcept C204321447 @default.
- W4377700710 hasConcept C2776760102 @default.
- W4377700710 hasConcept C41008148 @default.
- W4377700710 hasConcept C41895202 @default.
- W4377700710 hasConcept C519982507 @default.
- W4377700710 hasConcept C62520636 @default.
- W4377700710 hasConcept C66322947 @default.
- W4377700710 hasConceptScore W4377700710C121332964 @default.
- W4377700710 hasConceptScore W4377700710C129792486 @default.
- W4377700710 hasConceptScore W4377700710C137293760 @default.
- W4377700710 hasConceptScore W4377700710C138885662 @default.
- W4377700710 hasConceptScore W4377700710C154945302 @default.
- W4377700710 hasConceptScore W4377700710C165801399 @default.
- W4377700710 hasConceptScore W4377700710C177264268 @default.
- W4377700710 hasConceptScore W4377700710C18552078 @default.
- W4377700710 hasConceptScore W4377700710C195324797 @default.
- W4377700710 hasConceptScore W4377700710C199360897 @default.
- W4377700710 hasConceptScore W4377700710C204321447 @default.
- W4377700710 hasConceptScore W4377700710C2776760102 @default.
- W4377700710 hasConceptScore W4377700710C41008148 @default.
- W4377700710 hasConceptScore W4377700710C41895202 @default.
- W4377700710 hasConceptScore W4377700710C519982507 @default.
- W4377700710 hasConceptScore W4377700710C62520636 @default.
- W4377700710 hasConceptScore W4377700710C66322947 @default.
- W4377700710 hasLocation W43777007101 @default.
- W4377700710 hasLocation W43777007102 @default.
- W4377700710 hasOpenAccess W4377700710 @default.
- W4377700710 hasPrimaryLocation W43777007101 @default.
- W4377700710 hasRelatedWork W1997241840 @default.
- W4377700710 hasRelatedWork W2252156092 @default.
- W4377700710 hasRelatedWork W226586525 @default.
- W4377700710 hasRelatedWork W2964084554 @default.
- W4377700710 hasRelatedWork W3181003422 @default.
- W4377700710 hasRelatedWork W3184128664 @default.
- W4377700710 hasRelatedWork W3214226059 @default.
- W4377700710 hasRelatedWork W4224267299 @default.
- W4377700710 hasRelatedWork W4287070715 @default.
- W4377700710 hasRelatedWork W4287102199 @default.
- W4377700710 isParatext "false" @default.
- W4377700710 isRetracted "false" @default.
- W4377700710 workType "article" @default.
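The listing above is the result of the triple pattern shown in the header. A minimal sketch of fetching the same predicate/object pairs programmatically, assuming SemOpenAlex exposes a public SPARQL endpoint at `https://semopenalex.org/sparql` and returns standard SPARQL 1.1 JSON results (both the endpoint URL and the helper names here are assumptions, not part of the original listing):

```python
import json
import urllib.parse
import urllib.request

# Assumed endpoint URL; verify against the SemOpenAlex documentation before use.
SPARQL_ENDPOINT = "https://semopenalex.org/sparql"


def build_query(work_iri: str) -> str:
    """Build a SELECT form of the triple pattern from the listing header."""
    return f"SELECT ?p ?o WHERE {{ <{work_iri}> ?p ?o . }}"


def fetch_triples(work_iri: str, endpoint: str = SPARQL_ENDPOINT):
    """Send the query via HTTP GET and return (predicate, object) value pairs
    parsed from a SPARQL 1.1 JSON results document."""
    params = urllib.parse.urlencode({"query": build_query(work_iri)})
    req = urllib.request.Request(
        endpoint + "?" + params,
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return [(b["p"]["value"], b["o"]["value"])
            for b in data["results"]["bindings"]]


# Example (requires network access):
# for p, o in fetch_triples("https://semopenalex.org/work/W4377700710"):
#     print(p, o)
```

With the work above, `fetch_triples` would return the 77 pairs shown in this listing, e.g. the `title` and `doi` values, as plain strings.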