Matches in SemOpenAlex for { <https://semopenalex.org/work/W4285204876> ?p ?o ?g. }
- W4285204876 endingPage "1598" @default.
- W4285204876 startingPage "1580" @default.
- W4285204876 abstract "Deep learning (DL) techniques have been used to support several code-related tasks such as code summarization and bug-fixing. In particular, pre-trained transformer models are on the rise, also thanks to the excellent results they achieved in Natural Language Processing (NLP) tasks. The basic idea behind these models is to first pre-train them on a generic dataset using a self-supervised task (e.g., filling masked words in sentences). Then, these models are fine-tuned to support specific tasks of interest (e.g., language translation). A single model can be fine-tuned to support multiple tasks, possibly exploiting the benefits of transfer learning. This means that knowledge acquired to solve a specific task (e.g., language translation) can be useful to boost performance on another task (e.g., sentiment classification). While the benefits of transfer learning have been widely studied in NLP, limited empirical evidence is available when it comes to code-related tasks. In this paper, we assess the performance of the Text-To-Text Transfer Transformer (T5) model in supporting four different code-related tasks: (i) automatic bug-fixing, (ii) injection of code mutants, (iii) generation of assert statements, and (iv) code summarization. We pay particular attention in studying the role played by pre-training and multi-task fine-tuning on the model's performance. We show that (i) the T5 can achieve better performance as compared to state-of-the-art baselines; and (ii) while pre-training helps the model, not all tasks benefit from a multi-task fine-tuning." @default.
- W4285204876 created "2022-07-14" @default.
- W4285204876 creator A5009727039 @default.
- W4285204876 creator A5027300975 @default.
- W4285204876 creator A5031468932 @default.
- W4285204876 creator A5041262116 @default.
- W4285204876 creator A5056526226 @default.
- W4285204876 creator A5069505458 @default.
- W4285204876 creator A5079406478 @default.
- W4285204876 date "2023-04-01" @default.
- W4285204876 modified "2023-10-12" @default.
- W4285204876 title "Using Transfer Learning for Code-Related Tasks" @default.
- W4285204876 cites W1965194038 @default.
- W4285204876 cites W1971650562 @default.
- W4285204876 cites W1972141422 @default.
- W4285204876 cites W1975455521 @default.
- W4285204876 cites W2025791343 @default.
- W4285204876 cites W2027047406 @default.
- W4285204876 cites W2039168567 @default.
- W4285204876 cites W2042124591 @default.
- W4285204876 cites W2065489029 @default.
- W4285204876 cites W2077273779 @default.
- W4285204876 cites W2081749632 @default.
- W4285204876 cites W2084413241 @default.
- W4285204876 cites W2090878800 @default.
- W4285204876 cites W2117228548 @default.
- W4285204876 cites W2133333349 @default.
- W4285204876 cites W2142537222 @default.
- W4285204876 cites W2143861926 @default.
- W4285204876 cites W2143960295 @default.
- W4285204876 cites W2145373440 @default.
- W4285204876 cites W2157331557 @default.
- W4285204876 cites W2166879716 @default.
- W4285204876 cites W2242083635 @default.
- W4285204876 cites W2294980783 @default.
- W4285204876 cites W2511803001 @default.
- W4285204876 cites W2516621648 @default.
- W4285204876 cites W2619636279 @default.
- W4285204876 cites W2736762043 @default.
- W4285204876 cites W2739564891 @default.
- W4285204876 cites W2740220421 @default.
- W4285204876 cites W2883359218 @default.
- W4285204876 cites W2884276923 @default.
- W4285204876 cites W2884681705 @default.
- W4285204876 cites W2888312537 @default.
- W4285204876 cites W2954149564 @default.
- W4285204876 cites W2954823997 @default.
- W4285204876 cites W2963026768 @default.
- W4285204876 cites W2963250244 @default.
- W4285204876 cites W2963979492 @default.
- W4285204876 cites W2964194820 @default.
- W4285204876 cites W2964322208 @default.
- W4285204876 cites W2967096374 @default.
- W4285204876 cites W2972082064 @default.
- W4285204876 cites W2979679630 @default.
- W4285204876 cites W2980731667 @default.
- W4285204876 cites W2993007949 @default.
- W4285204876 cites W3016234956 @default.
- W4285204876 cites W3027067538 @default.
- W4285204876 cites W3084812981 @default.
- W4285204876 cites W3101506519 @default.
- W4285204876 cites W3105903381 @default.
- W4285204876 cites W3108032709 @default.
- W4285204876 cites W3161903544 @default.
- W4285204876 cites W3161997752 @default.
- W4285204876 cites W3205927779 @default.
- W4285204876 cites W3211801722 @default.
- W4285204876 cites W3215668407 @default.
- W4285204876 cites W4231241365 @default.
- W4285204876 cites W4245415816 @default.
- W4285204876 cites W4254753190 @default.
- W4285204876 doi "https://doi.org/10.1109/tse.2022.3183297" @default.
- W4285204876 hasPublicationYear "2023" @default.
- W4285204876 type Work @default.
- W4285204876 citedByCount "1" @default.
- W4285204876 countsByYear W42852048762023 @default.
- W4285204876 crossrefType "journal-article" @default.
- W4285204876 hasAuthorship W4285204876A5009727039 @default.
- W4285204876 hasAuthorship W4285204876A5027300975 @default.
- W4285204876 hasAuthorship W4285204876A5031468932 @default.
- W4285204876 hasAuthorship W4285204876A5041262116 @default.
- W4285204876 hasAuthorship W4285204876A5056526226 @default.
- W4285204876 hasAuthorship W4285204876A5069505458 @default.
- W4285204876 hasAuthorship W4285204876A5079406478 @default.
- W4285204876 hasBestOaLocation W42852048762 @default.
- W4285204876 hasConcept C108583219 @default.
- W4285204876 hasConcept C119857082 @default.
- W4285204876 hasConcept C121332964 @default.
- W4285204876 hasConcept C137293760 @default.
- W4285204876 hasConcept C150899416 @default.
- W4285204876 hasConcept C154945302 @default.
- W4285204876 hasConcept C162324750 @default.
- W4285204876 hasConcept C165801399 @default.
- W4285204876 hasConcept C170858558 @default.
- W4285204876 hasConcept C177264268 @default.
- W4285204876 hasConcept C187736073 @default.
- W4285204876 hasConcept C199360897 @default.
- W4285204876 hasConcept C203005215 @default.
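The listing above is the result of the quad pattern in the header line, with `<https://semopenalex.org/work/W4285204876>` as the fixed subject. A minimal sketch of how such a query could be issued programmatically; the endpoint URL `https://semopenalex.org/sparql` and the helper names are assumptions, not part of the listing:

```python
# Sketch (stdlib only): retrieve every ?p ?o pair for the work above.
# ENDPOINT is an assumed public SPARQL endpoint, not taken from the listing.
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL
WORK = "https://semopenalex.org/work/W4285204876"

def build_query(work_uri: str) -> str:
    # Same pattern as the header line, projected to ?p ?o.
    return f"SELECT ?p ?o WHERE {{ <{work_uri}> ?p ?o . }}"

def fetch_pairs(work_uri: str) -> list[tuple[str, str]]:
    # Standard SPARQL protocol: pass the query as a GET parameter and
    # request the JSON results serialization via the Accept header.
    url = ENDPOINT + "?" + urllib.parse.urlencode({"query": build_query(work_uri)})
    req = urllib.request.Request(
        url, headers={"Accept": "application/sparql-results+json"}
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # SPARQL JSON results: each binding maps a variable to {"value": ...}.
    return [
        (b["p"]["value"], b["o"]["value"])
        for b in data["results"]["bindings"]
    ]
```

Each returned pair corresponds to one line of the listing above (predicate, object).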