Matches in SemOpenAlex for { <https://semopenalex.org/work/W4386805215> ?p ?o ?g. }
Showing items 1 to 94 of 94, with 100 items per page.
- W4386805215 endingPage "104486" @default.
- W4386805215 startingPage "104486" @default.
- W4386805215 abstract "Large neural-based Pre-trained Language Models (PLMs) have recently gained much attention due to their noteworthy performance in many downstream Information Retrieval (IR) and Natural Language Processing (NLP) tasks. PLMs can be categorized as either general-purpose, trained on resources such as large-scale Web corpora, or domain-specific, trained on in-domain or mixed-domain corpora. While domain-specific PLMs have shown promising performance on domain-specific tasks, they are significantly more computationally expensive than general-purpose PLMs, as they have to be either retrained or trained from scratch. The objective of our work in this paper is to explore whether general-purpose PLMs can be leveraged to achieve performance competitive with domain-specific PLMs without the need for expensive retraining on domain-specific tasks. Focusing specifically on the recent BioASQ Biomedical Question Answering task, we show that different general-purpose PLMs exhibit synergistic behaviour in terms of performance, which can lead to notable overall improvement when they are used in tandem. More concretely, given a set of general-purpose PLMs, we propose a self-supervised method for training a classifier that systematically selects, on a per-input basis, the PLM most likely to answer the question correctly. We show that through such a selection strategy, the performance of general-purpose PLMs can become competitive with domain-specific PLMs while remaining computationally light, since there is no need to retrain the large language model itself. We run experiments on the BioASQ dataset, a large-scale biomedical question-answering benchmark. We show that our proposed selection strategy yields statistically significant performance improvements over general-purpose language models: an average of 16.7% when using only lighter models such as DistilBERT and DistilRoBERTa, and 14.2% when using relatively larger models such as BERT and RoBERTa, making their performance competitive with domain-specific large language models such as PubMedBERT." @default.
- W4386805215 created "2023-09-17" @default.
- W4386805215 creator A5049487742 @default.
- W4386805215 creator A5064660738 @default.
- W4386805215 date "2023-10-01" @default.
- W4386805215 modified "2023-10-14" @default.
- W4386805215 title "A self-supervised language model selection strategy for biomedical question answering" @default.
- W4386805215 cites W1972860770 @default.
- W4386805215 cites W1981208470 @default.
- W4386805215 cites W1994863898 @default.
- W4386805215 cites W2000577984 @default.
- W4386805215 cites W2054399842 @default.
- W4386805215 cites W2156187881 @default.
- W4386805215 cites W2162059449 @default.
- W4386805215 cites W2395579298 @default.
- W4386805215 cites W2396881363 @default.
- W4386805215 cites W2594902929 @default.
- W4386805215 cites W2911489562 @default.
- W4386805215 cites W2990887822 @default.
- W4386805215 cites W3103089229 @default.
- W4386805215 cites W3115267989 @default.
- W4386805215 cites W3134665270 @default.
- W4386805215 cites W3135939397 @default.
- W4386805215 cites W3149839747 @default.
- W4386805215 cites W3156000544 @default.
- W4386805215 cites W3190730109 @default.
- W4386805215 cites W3198431451 @default.
- W4386805215 cites W3210436917 @default.
- W4386805215 cites W4224442790 @default.
- W4386805215 cites W4293248102 @default.
- W4386805215 cites W4293393880 @default.
- W4386805215 cites W4310568840 @default.
- W4386805215 doi "https://doi.org/10.1016/j.jbi.2023.104486" @default.
- W4386805215 hasPubMedId "https://pubmed.ncbi.nlm.nih.gov/37722445" @default.
- W4386805215 hasPublicationYear "2023" @default.
- W4386805215 type Work @default.
- W4386805215 citedByCount "0" @default.
- W4386805215 crossrefType "journal-article" @default.
- W4386805215 hasAuthorship W4386805215A5049487742 @default.
- W4386805215 hasAuthorship W4386805215A5064660738 @default.
- W4386805215 hasConcept C119857082 @default.
- W4386805215 hasConcept C134306372 @default.
- W4386805215 hasConcept C137293760 @default.
- W4386805215 hasConcept C144133560 @default.
- W4386805215 hasConcept C153083717 @default.
- W4386805215 hasConcept C154945302 @default.
- W4386805215 hasConcept C155202549 @default.
- W4386805215 hasConcept C162324750 @default.
- W4386805215 hasConcept C187736073 @default.
- W4386805215 hasConcept C204321447 @default.
- W4386805215 hasConcept C2778712577 @default.
- W4386805215 hasConcept C2780451532 @default.
- W4386805215 hasConcept C33923547 @default.
- W4386805215 hasConcept C36503486 @default.
- W4386805215 hasConcept C41008148 @default.
- W4386805215 hasConcept C44291984 @default.
- W4386805215 hasConcept C95623464 @default.
- W4386805215 hasConceptScore W4386805215C119857082 @default.
- W4386805215 hasConceptScore W4386805215C134306372 @default.
- W4386805215 hasConceptScore W4386805215C137293760 @default.
- W4386805215 hasConceptScore W4386805215C144133560 @default.
- W4386805215 hasConceptScore W4386805215C153083717 @default.
- W4386805215 hasConceptScore W4386805215C154945302 @default.
- W4386805215 hasConceptScore W4386805215C155202549 @default.
- W4386805215 hasConceptScore W4386805215C162324750 @default.
- W4386805215 hasConceptScore W4386805215C187736073 @default.
- W4386805215 hasConceptScore W4386805215C204321447 @default.
- W4386805215 hasConceptScore W4386805215C2778712577 @default.
- W4386805215 hasConceptScore W4386805215C2780451532 @default.
- W4386805215 hasConceptScore W4386805215C33923547 @default.
- W4386805215 hasConceptScore W4386805215C36503486 @default.
- W4386805215 hasConceptScore W4386805215C41008148 @default.
- W4386805215 hasConceptScore W4386805215C44291984 @default.
- W4386805215 hasConceptScore W4386805215C95623464 @default.
- W4386805215 hasLocation W43868052151 @default.
- W4386805215 hasLocation W43868052152 @default.
- W4386805215 hasOpenAccess W4386805215 @default.
- W4386805215 hasPrimaryLocation W43868052151 @default.
- W4386805215 hasRelatedWork W2006651773 @default.
- W4386805215 hasRelatedWork W2014369232 @default.
- W4386805215 hasRelatedWork W2027050655 @default.
- W4386805215 hasRelatedWork W2050078012 @default.
- W4386805215 hasRelatedWork W2964413124 @default.
- W4386805215 hasRelatedWork W3028244590 @default.
- W4386805215 hasRelatedWork W3113264705 @default.
- W4386805215 hasRelatedWork W3122042562 @default.
- W4386805215 hasRelatedWork W3204607391 @default.
- W4386805215 hasRelatedWork W4254349500 @default.
- W4386805215 hasVolume "146" @default.
- W4386805215 isParatext "false" @default.
- W4386805215 isRetracted "false" @default.
- W4386805215 workType "article" @default.
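The abstract above describes a self-supervised, per-input model-selection strategy: a lightweight classifier learns, from which frozen PLMs answered past questions correctly, to route each new question to the PLM most likely to answer it correctly. The toy sketch below illustrates only that routing idea; it is not the paper's implementation, and all model names, features, and the keyword-based scorer are hypothetical stand-ins for real PLMs and a trained classifier.

```python
# Illustrative sketch (NOT the paper's implementation) of per-input model
# selection: a lightweight "router" picks, for each question, which of
# several frozen QA models is most likely to answer correctly.
# All models and features here are hypothetical toy stand-ins.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class QAModel:
    name: str
    answer: Callable[[str], str]  # frozen model: question -> answer


def build_selection_labels(models: List[QAModel],
                           questions: List[str],
                           gold: List[str]) -> List[Dict[str, bool]]:
    """Self-supervised labelling: for each question, record which models
    answered it correctly. These labels would train the router classifier."""
    return [{m.name: (m.answer(q) == g) for m in models}
            for q, g in zip(questions, gold)]


def route(question: str, models: List[QAModel], scorer) -> QAModel:
    """Pick the model the (trained) scorer deems most likely to be correct."""
    return max(models, key=lambda m: scorer(question, m.name))


# --- toy demo: two fake "models" answering by keyword lookup ---
bio = QAModel("bio-ish", lambda q: "protein" if "gene" in q else "unknown")
gen = QAModel("general", lambda q: "Paris" if "capital" in q else "unknown")
models = [bio, gen]

questions = ["Which gene encodes this?", "What is the capital of France?"]
gold = ["protein", "Paris"]

labels = build_selection_labels(models, questions, gold)


# A real system would train a classifier on (question, labels) pairs;
# this trivial scorer just mimics what such a classifier might learn.
def scorer(question: str, model_name: str) -> float:
    if "gene" in question:
        return 1.0 if model_name == "bio-ish" else 0.0
    return 1.0 if model_name == "general" else 0.0


picked = [route(q, models, scorer).name for q in questions]
print(picked)  # -> ['bio-ish', 'general']
```

The key design point, as the abstract notes, is that only the small router is trained; the underlying language models stay frozen, which is what keeps the approach computationally light compared with retraining a domain-specific PLM.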