Matches in SemOpenAlex for { <https://semopenalex.org/work/W3189447831> ?p ?o ?g. } (query and model sketches follow the listing below)
- W3189447831 endingPage "393" @default.
- W3189447831 startingPage "380" @default.
- W3189447831 abstract "Sentiment classification is a form of data analytics where people's feelings and attitudes toward a topic are mined from data. This tantalizing power to predict the zeitgeist means that sentiment classification has long attracted interest, but with mixed results. However, the recent development of the BERT framework and its pretrained neural language models is seeing new-found success for sentiment classification. BERT models are trained to capture word-level information via masked language modeling and sentence-level contexts via next sentence prediction tasks. Out of the box, they are adequate models for some natural language processing tasks. However, most models are further fine-tuned with domain-specific information to increase accuracy and usefulness. Motivated by the idea that a further fine-tuning step would improve performance on downstream sentiment classification tasks, we developed TopicBERT, a BERT model fine-tuned to recognize topics at the corpus level in addition to the word and sentence levels. TopicBERT comprises two variants: TopicBERT-ATP (aspect topic prediction), which captures topic information via an auxiliary training task, and TopicBERT-TA, where a topic representation is directly injected into a topic augmentation layer for sentiment classification. With TopicBERT-ATP, the topics are predetermined by an LDA topic model with collapsed Gibbs sampling. With TopicBERT-TA, the topics can change dynamically during training. Experimental results show that both approaches deliver state-of-the-art performance in two different domains of SemEval 2014 Task 4. However, in a direct comparison of the two methods, direct augmentation (TopicBERT-TA) outperforms further training (TopicBERT-ATP). Comprehensive analyses in the form of ablation, parameter, and complexity studies accompany the results." @default.
- W3189447831 created "2021-08-16" @default.
- W3189447831 creator A5004393324 @default.
- W3189447831 creator A5044196762 @default.
- W3189447831 creator A5052181996 @default.
- W3189447831 creator A5087631670 @default.
- W3189447831 creator A5088680073 @default.
- W3189447831 date "2023-01-01" @default.
- W3189447831 modified "2023-09-30" @default.
- W3189447831 title "TopicBERT: A Topic-Enhanced Neural Language Model Fine-Tuned for Sentiment Classification" @default.
- W3189447831 cites W1566289585 @default.
- W3189447831 cites W1964613733 @default.
- W3189447831 cites W1985139989 @default.
- W3189447831 cites W2001082470 @default.
- W3189447831 cites W2048679005 @default.
- W3189447831 cites W2085750684 @default.
- W3189447831 cites W2109872889 @default.
- W3189447831 cites W2164672510 @default.
- W3189447831 cites W2165855670 @default.
- W3189447831 cites W2170414372 @default.
- W3189447831 cites W2238728730 @default.
- W3189447831 cites W2251124635 @default.
- W3189447831 cites W2270070752 @default.
- W3189447831 cites W2427312199 @default.
- W3189447831 cites W2465978385 @default.
- W3189447831 cites W2509019445 @default.
- W3189447831 cites W2562607067 @default.
- W3189447831 cites W2612769033 @default.
- W3189447831 cites W2740567223 @default.
- W3189447831 cites W2757541972 @default.
- W3189447831 cites W2788610610 @default.
- W3189447831 cites W2788810909 @default.
- W3189447831 cites W2789190634 @default.
- W3189447831 cites W2790855988 @default.
- W3189447831 cites W2799044502 @default.
- W3189447831 cites W2887856105 @default.
- W3189447831 cites W2891602716 @default.
- W3189447831 cites W2891778157 @default.
- W3189447831 cites W2900641211 @default.
- W3189447831 cites W2903712410 @default.
- W3189447831 cites W2911955591 @default.
- W3189447831 cites W2936967793 @default.
- W3189447831 cites W2949161734 @default.
- W3189447831 cites W2950404230 @default.
- W3189447831 cites W2962808042 @default.
- W3189447831 cites W2963104701 @default.
- W3189447831 cites W2963168371 @default.
- W3189447831 cites W2963240575 @default.
- W3189447831 cites W2963378656 @default.
- W3189447831 cites W2964098749 @default.
- W3189447831 cites W2964164368 @default.
- W3189447831 cites W2970402837 @default.
- W3189447831 cites W2970583420 @default.
- W3189447831 cites W2971088231 @default.
- W3189447831 cites W2971220558 @default.
- W3189447831 cites W2998385486 @default.
- W3189447831 cites W3049122368 @default.
- W3189447831 cites W3094173182 @default.
- W3189447831 cites W3101850416 @default.
- W3189447831 cites W3103817618 @default.
- W3189447831 cites W3105174597 @default.
- W3189447831 cites W3163767300 @default.
- W3189447831 cites W4205184193 @default.
- W3189447831 cites W4254724182 @default.
- W3189447831 cites W4297971002 @default.
- W3189447831 doi "https://doi.org/10.1109/tnnls.2021.3094987" @default.
- W3189447831 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/34357867" @default.
- W3189447831 hasPublicationYear "2023" @default.
- W3189447831 type Work @default.
- W3189447831 sameAs 3189447831 @default.
- W3189447831 citedByCount "6" @default.
- W3189447831 countsByYear W31894478312021 @default.
- W3189447831 countsByYear W31894478312022 @default.
- W3189447831 countsByYear W31894478312023 @default.
- W3189447831 crossrefType "journal-article" @default.
- W3189447831 hasAuthorship W3189447831A5004393324 @default.
- W3189447831 hasAuthorship W3189447831A5044196762 @default.
- W3189447831 hasAuthorship W3189447831A5052181996 @default.
- W3189447831 hasAuthorship W3189447831A5087631670 @default.
- W3189447831 hasAuthorship W3189447831A5088680073 @default.
- W3189447831 hasConcept C119857082 @default.
- W3189447831 hasConcept C137293760 @default.
- W3189447831 hasConcept C138885662 @default.
- W3189447831 hasConcept C154945302 @default.
- W3189447831 hasConcept C162324750 @default.
- W3189447831 hasConcept C17744445 @default.
- W3189447831 hasConcept C187736073 @default.
- W3189447831 hasConcept C199539241 @default.
- W3189447831 hasConcept C204321447 @default.
- W3189447831 hasConcept C2776359362 @default.
- W3189447831 hasConcept C2777530160 @default.
- W3189447831 hasConcept C2780451532 @default.
- W3189447831 hasConcept C41008148 @default.
- W3189447831 hasConcept C41895202 @default.
- W3189447831 hasConcept C44572571 @default.
- W3189447831 hasConcept C66402592 @default.
- W3189447831 hasConcept C90805587 @default.
- W3189447831 hasConcept C94625758 @default.
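
The listing above is the result of the triple pattern named in the header. A minimal Python sketch of how such a query could be issued over the standard SPARQL protocol is shown below; the endpoint URL is an assumption (SemOpenAlex advertises a public SPARQL endpoint, but verify the exact address), and the dropped `?g` variable simply queries across all graphs.

```python
import requests

# Assumed endpoint address -- check SemOpenAlex documentation before relying on it.
ENDPOINT = "https://semopenalex.org/sparql"

QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W3189447831> ?p ?o .
}
"""

def fetch_triples(endpoint: str, query: str) -> list:
    """Run a SPARQL SELECT query and return the JSON result bindings."""
    response = requests.get(
        endpoint,
        params={"query": query},
        headers={"Accept": "application/sparql-results+json"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["results"]["bindings"]

if __name__ == "__main__":
    # Prints one predicate/object pair per line, mirroring the listing above.
    for row in fetch_triples(ENDPOINT, QUERY):
        print(row["p"]["value"], row["o"]["value"])
```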
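The abstract describes two ways of fusing topic information with BERT: an auxiliary aspect-topic-prediction task (TopicBERT-ATP) and direct injection of a topic representation into a topic augmentation layer (TopicBERT-TA). The sketch below is purely illustrative of the second idea, assuming a PyTorch/HuggingFace setup and a precomputed document-topic vector; the class name, dimensions, and concatenation-based fusion are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn
from transformers import BertModel

class TopicAugmentedClassifier(nn.Module):
    """Hypothetical topic-augmentation head: concatenates a topic vector with
    BERT's pooled [CLS] representation before the sentiment classifier."""

    def __init__(self, num_topics: int, num_labels: int = 3,
                 bert_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size
        self.classifier = nn.Linear(hidden + num_topics, num_labels)

    def forward(self, input_ids, attention_mask, topic_vector):
        # topic_vector: (batch, num_topics), e.g. an LDA document-topic
        # distribution or a learned topic embedding updated during training.
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        cls = outputs.pooler_output                      # (batch, hidden)
        fused = torch.cat([cls, topic_vector], dim=-1)   # topic augmentation
        return self.classifier(fused)                    # sentiment logits
```

Keeping the topic vector as a forward-pass input, rather than baking it into the encoder, is what lets the topic signal change dynamically during training, as the abstract notes for TopicBERT-TA.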