Matches in SemOpenAlex for { <https://semopenalex.org/work/W3210691267> ?p ?o ?g. }
Showing items 1 to 85 of 85, with 100 items per page.
- W3210691267 endingPage "9872" @default.
- W3210691267 startingPage "9872" @default.
- W3210691267 abstract "Most of the models proposed in the literature for abstractive summarization are generally suitable for the English language but not for other languages. Multilingual models were introduced to address this language constraint, but although their applicability is broader than that of monolingual models, their performance is typically lower, especially for minority languages such as Catalan. In this paper, we present a monolingual model for abstractive summarization of textual content in the Catalan language. The model is a Transformer encoder-decoder that is pretrained and fine-tuned specifically for Catalan using a corpus of newspaper articles. In the pretraining phase, we introduced several self-supervised tasks to specialize the model in summarization and to increase the abstractivity of the generated summaries. To study the performance of our proposal in languages with more resources than Catalan, we replicated the model and the experimentation for Spanish. The usual evaluation metrics, not only the widely used ROUGE measure but also more semantic ones such as BERTScore, do not allow the abstractivity of the generated summaries to be evaluated correctly. In this work, we also present a new metric, called content reordering, to evaluate one of the most common characteristics of abstractive summaries: the rearrangement of the original content. We carried out exhaustive experimentation to compare the performance of the monolingual models proposed in this work with two of the most widely used multilingual models in text summarization, mBART and mT5. The experimental results support the quality of our monolingual models, especially considering that the multilingual models were pretrained with many more resources than ours. Likewise, we show that the pretraining tasks helped to increase the degree of abstractivity of the generated summaries. To our knowledge, this is the first work that explores a monolingual approach to abstractive summarization in both Catalan and Spanish." @default.
- W3210691267 created "2021-11-08" @default.
- W3210691267 creator A5005124189 @default.
- W3210691267 creator A5022565589 @default.
- W3210691267 creator A5043133231 @default.
- W3210691267 creator A5046092986 @default.
- W3210691267 date "2021-10-22" @default.
- W3210691267 modified "2023-10-15" @default.
- W3210691267 title "NASca and NASes: Two Monolingual Pre-Trained Models for Abstractive Summarization in Catalan and Spanish" @default.
- W3210691267 cites W2115167129 @default.
- W3210691267 cites W2158065945 @default.
- W3210691267 cites W2606974598 @default.
- W3210691267 cites W2944770383 @default.
- W3210691267 cites W2952638691 @default.
- W3210691267 cites W2962849707 @default.
- W3210691267 cites W2962965405 @default.
- W3210691267 cites W2963227052 @default.
- W3210691267 cites W2963929190 @default.
- W3210691267 cites W2970419734 @default.
- W3210691267 cites W3001434439 @default.
- W3210691267 cites W3027042170 @default.
- W3210691267 cites W3034999214 @default.
- W3210691267 cites W3035050380 @default.
- W3210691267 cites W3097370230 @default.
- W3210691267 cites W3098493824 @default.
- W3210691267 cites W3099286868 @default.
- W3210691267 cites W3169483174 @default.
- W3210691267 cites W4239027807 @default.
- W3210691267 doi "https://doi.org/10.3390/app11219872" @default.
- W3210691267 hasPublicationYear "2021" @default.
- W3210691267 type Work @default.
- W3210691267 sameAs 3210691267 @default.
- W3210691267 citedByCount "2" @default.
- W3210691267 countsByYear W32106912672022 @default.
- W3210691267 countsByYear W32106912672023 @default.
- W3210691267 crossrefType "journal-article" @default.
- W3210691267 hasAuthorship W3210691267A5005124189 @default.
- W3210691267 hasAuthorship W3210691267A5022565589 @default.
- W3210691267 hasAuthorship W3210691267A5043133231 @default.
- W3210691267 hasAuthorship W3210691267A5046092986 @default.
- W3210691267 hasBestOaLocation W32106912671 @default.
- W3210691267 hasConcept C111919701 @default.
- W3210691267 hasConcept C118505674 @default.
- W3210691267 hasConcept C137293760 @default.
- W3210691267 hasConcept C138885662 @default.
- W3210691267 hasConcept C154945302 @default.
- W3210691267 hasConcept C164105321 @default.
- W3210691267 hasConcept C170858558 @default.
- W3210691267 hasConcept C204321447 @default.
- W3210691267 hasConcept C41008148 @default.
- W3210691267 hasConcept C41895202 @default.
- W3210691267 hasConceptScore W3210691267C111919701 @default.
- W3210691267 hasConceptScore W3210691267C118505674 @default.
- W3210691267 hasConceptScore W3210691267C137293760 @default.
- W3210691267 hasConceptScore W3210691267C138885662 @default.
- W3210691267 hasConceptScore W3210691267C154945302 @default.
- W3210691267 hasConceptScore W3210691267C164105321 @default.
- W3210691267 hasConceptScore W3210691267C170858558 @default.
- W3210691267 hasConceptScore W3210691267C204321447 @default.
- W3210691267 hasConceptScore W3210691267C41008148 @default.
- W3210691267 hasConceptScore W3210691267C41895202 @default.
- W3210691267 hasFunder F4320315062 @default.
- W3210691267 hasFunder F4320321864 @default.
- W3210691267 hasIssue "21" @default.
- W3210691267 hasLocation W32106912671 @default.
- W3210691267 hasLocation W32106912672 @default.
- W3210691267 hasOpenAccess W3210691267 @default.
- W3210691267 hasPrimaryLocation W32106912671 @default.
- W3210691267 hasRelatedWork W2293457016 @default.
- W3210691267 hasRelatedWork W2747680751 @default.
- W3210691267 hasRelatedWork W2793376154 @default.
- W3210691267 hasRelatedWork W3192589309 @default.
- W3210691267 hasRelatedWork W3204019825 @default.
- W3210691267 hasRelatedWork W3210691267 @default.
- W3210691267 hasRelatedWork W4200068734 @default.
- W3210691267 hasRelatedWork W4221140906 @default.
- W3210691267 hasRelatedWork W4323363096 @default.
- W3210691267 hasRelatedWork W4362570706 @default.
- W3210691267 hasVolume "11" @default.
- W3210691267 isParatext "false" @default.
- W3210691267 isRetracted "false" @default.
- W3210691267 magId "3210691267" @default.
- W3210691267 workType "article" @default.
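The listing above is the result of the triple pattern shown in the header, `{ <https://semopenalex.org/work/W3210691267> ?p ?o ?g. }`, run against SemOpenAlex. A minimal sketch of reproducing such a lookup programmatically, using only the Python standard library, is below. The endpoint URL (`https://semopenalex.org/sparql`) is an assumption based on the site's domain, and the query simplifies the quad pattern to `?p ?o`:

```python
# Sketch: listing all predicate/object pairs of a SemOpenAlex work via SPARQL.
# NOTE: the endpoint URL below is an assumption, not confirmed from the source.
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://semopenalex.org/sparql"  # assumed SPARQL endpoint


def build_query(work_iri: str) -> str:
    """Return a SPARQL query selecting every ?p ?o pair of the given work IRI."""
    return f"SELECT ?p ?o WHERE {{ <{work_iri}> ?p ?o . }}"


def fetch_triples(work_iri: str) -> list:
    """Execute the query against the endpoint (requires network access)
    and return the JSON result bindings."""
    params = urllib.parse.urlencode(
        {"query": build_query(work_iri), "format": "json"}
    )
    with urllib.request.urlopen(f"{ENDPOINT}?{params}") as resp:
        return json.load(resp)["results"]["bindings"]


if __name__ == "__main__":
    # Print the query for the work shown in this listing.
    print(build_query("https://semopenalex.org/work/W3210691267"))
```

Each binding returned by the endpoint would correspond to one line of the listing above, e.g. the `hasVolume "11"` or `citedByCount "2"` triples.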