Matches in SemOpenAlex for { <https://semopenalex.org/work/W4312221408> ?p ?o ?g. }
Showing items 1 to 75 of 75, with 100 items per page.
- W4312221408 endingPage "103227" @default.
- W4312221408 startingPage "103227" @default.
- W4312221408 abstract "Recently, the Transformer model architecture and pre-trained Transformer-based language models have shown impressive performance when used to solve both natural language understanding and text generation tasks. Nevertheless, there is little research on using these models for text generation in Arabic. This research aims at leveraging and comparing the performance of different model architectures, including RNN-based and Transformer-based ones, and different pre-trained language models, including mBERT, AraBERT, AraGPT2, and AraT5, for Arabic abstractive summarization. We first built an Arabic summarization dataset of 84,764 high-quality text-summary pairs. To use mBERT and AraBERT in the context of text summarization, we employed a BERT2BERT-based encoder-decoder model where we initialized both the encoder and decoder with the respective model weights. The proposed models have been tested using ROUGE metrics and manual human evaluation. We also compared their performance on out-of-domain data. Our pre-trained Transformer-based models give a large improvement in performance with ∼79% less data. We found that AraT5 scores ∼3 ROUGE points higher than a BERT2BERT-based model initialized with AraBERT, indicating that an encoder-decoder pre-trained Transformer is more suitable for summarizing Arabic text. Both of these models perform better than AraGPT2 by a clear margin, which we found to produce summaries with high readability but relatively lower quality. On the other hand, we found that both AraT5 and AraGPT2 are better at summarizing out-of-domain text. We released our models and dataset publicly." @default.
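The abstract reports evaluating the summarization models with ROUGE metrics. As a hedged illustration only (the function name, whitespace tokenization, and example strings are assumptions for this sketch, not the authors' implementation, which likely uses a standard ROUGE package), ROUGE-1 F1 — unigram overlap between a candidate summary and a reference — can be computed as:

```python
from collections import Counter

def rouge1_f1(candidate: str, reference: str) -> float:
    """Minimal ROUGE-1 F1 sketch: clipped unigram overlap over whitespace tokens."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    # Clipped overlap: each candidate token counts at most as often as it
    # appears in the reference.
    overlap = sum((cand & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

print(round(rouge1_f1("the cat sat on the mat", "the cat is on the mat"), 3))  # → 0.833
```

Production evaluations would also report ROUGE-2 (bigrams) and ROUGE-L (longest common subsequence), and for Arabic would tokenize with a proper segmenter rather than whitespace splitting.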
- W4312221408 created "2023-01-04" @default.
- W4312221408 creator A5072241658 @default.
- W4312221408 creator A5075438802 @default.
- W4312221408 date "2023-03-01" @default.
- W4312221408 modified "2023-10-11" @default.
- W4312221408 title "Arabic abstractive text summarization using RNN-based and transformer-based architectures" @default.
- W4312221408 cites W2795243228 @default.
- W4312221408 cites W2909602489 @default.
- W4312221408 cites W3005700414 @default.
- W4312221408 cites W3011574394 @default.
- W4312221408 cites W3036120435 @default.
- W4312221408 cites W3110700463 @default.
- W4312221408 cites W3198659451 @default.
- W4312221408 cites W4200068734 @default.
- W4312221408 cites W4205975503 @default.
- W4312221408 cites W4223898079 @default.
- W4312221408 doi "https://doi.org/10.1016/j.ipm.2022.103227" @default.
- W4312221408 hasPublicationYear "2023" @default.
- W4312221408 type Work @default.
- W4312221408 citedByCount "5" @default.
- W4312221408 countsByYear W43122214082023 @default.
- W4312221408 crossrefType "journal-article" @default.
- W4312221408 hasAuthorship W4312221408A5072241658 @default.
- W4312221408 hasAuthorship W4312221408A5075438802 @default.
- W4312221408 hasConcept C111919701 @default.
- W4312221408 hasConcept C118505674 @default.
- W4312221408 hasConcept C121332964 @default.
- W4312221408 hasConcept C137293760 @default.
- W4312221408 hasConcept C138885662 @default.
- W4312221408 hasConcept C154945302 @default.
- W4312221408 hasConcept C165801399 @default.
- W4312221408 hasConcept C170858558 @default.
- W4312221408 hasConcept C204321447 @default.
- W4312221408 hasConcept C2985684807 @default.
- W4312221408 hasConcept C41008148 @default.
- W4312221408 hasConcept C41895202 @default.
- W4312221408 hasConcept C62520636 @default.
- W4312221408 hasConcept C66322947 @default.
- W4312221408 hasConcept C96455323 @default.
- W4312221408 hasConceptScore W4312221408C111919701 @default.
- W4312221408 hasConceptScore W4312221408C118505674 @default.
- W4312221408 hasConceptScore W4312221408C121332964 @default.
- W4312221408 hasConceptScore W4312221408C137293760 @default.
- W4312221408 hasConceptScore W4312221408C138885662 @default.
- W4312221408 hasConceptScore W4312221408C154945302 @default.
- W4312221408 hasConceptScore W4312221408C165801399 @default.
- W4312221408 hasConceptScore W4312221408C170858558 @default.
- W4312221408 hasConceptScore W4312221408C204321447 @default.
- W4312221408 hasConceptScore W4312221408C2985684807 @default.
- W4312221408 hasConceptScore W4312221408C41008148 @default.
- W4312221408 hasConceptScore W4312221408C41895202 @default.
- W4312221408 hasConceptScore W4312221408C62520636 @default.
- W4312221408 hasConceptScore W4312221408C66322947 @default.
- W4312221408 hasConceptScore W4312221408C96455323 @default.
- W4312221408 hasIssue "2" @default.
- W4312221408 hasLocation W43122214081 @default.
- W4312221408 hasOpenAccess W4312221408 @default.
- W4312221408 hasPrimaryLocation W43122214081 @default.
- W4312221408 hasRelatedWork W2747680751 @default.
- W4312221408 hasRelatedWork W2945886944 @default.
- W4312221408 hasRelatedWork W2969740599 @default.
- W4312221408 hasRelatedWork W2985808369 @default.
- W4312221408 hasRelatedWork W3128902667 @default.
- W4312221408 hasRelatedWork W4221140906 @default.
- W4312221408 hasRelatedWork W4316012698 @default.
- W4312221408 hasRelatedWork W4362451017 @default.
- W4312221408 hasRelatedWork W4362570706 @default.
- W4312221408 hasRelatedWork W4362598702 @default.
- W4312221408 hasVolume "60" @default.
- W4312221408 isParatext "false" @default.
- W4312221408 isRetracted "false" @default.
- W4312221408 workType "article" @default.