Matches in SemOpenAlex for { <https://semopenalex.org/work/W4312636974> ?p ?o ?g. }
- W4312636974 endingPage "116958" @default.
- W4312636974 startingPage "116942" @default.
- W4312636974 abstract "In this paper, we explore the possibility of applying natural language processing (NLP) to visual model-to-model (M2M) transformations. To this end, we present our research results on information extraction from text labels in process models expressed in Business Process Modeling Notation (BPMN) and use case models depicted in the Unified Modeling Language (UML), using the most recent developments in NLP. We focus on three relevant tasks: the extraction of verb/noun phrases used to form relations, the parsing of conjunctive/disjunctive statements, and the detection of abbreviations and acronyms. We attempted to solve relation extraction by implementing techniques that combine state-of-the-art NLP language models with formal regular-expression grammar-based structure detection. We perform thorough testing of the most recent state-of-the-art NLP tools (CoreNLP, Stanford Stanza, Flair, spaCy, AllenNLP, BERT, ELECTRA), as well as custom BERT-BiLSTM-CRF and ELMo-BiLSTM-CRF implementations trained with certain data augmentations to improve performance on the most ambiguous cases; these tools serve as a foundation for building extractors of noun and verb phrases from the short text labels typically used in UML and BPMN models. Furthermore, we describe our attempts to improve these extractors by solving the abbreviation/acronym detection problem using machine-learning-based detection, and by processing conjunctive and disjunctive statements, both of which are relevant to advanced text normalization. The results show that the best phrase extraction and conjunctive phrase processing performance was obtained with a Stanza-based implementation, although our trained BERT-BiLSTM-CRF outperformed it on the verb phrase detection task. Our acronym detection approach achieved a precision of 0.78 and an F1-score of 0.73, which may also be considered quite positive. While this work was inspired by our ongoing research on partial model-to-model transformations, we believe it is applicable to other areas requiring similar text processing capabilities as well." @default.
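The abstract mentions processing conjunctive/disjunctive statements in short model labels (e.g. splitting "verify and approve order" into two activities). As a rough, purely illustrative sketch of that idea — not the paper's actual implementation, which combines NLP language models with a formal grammar rather than this regex shortcut — a minimal conjunction expander for the simplest "verb and verb object" label shape might look like:

```python
import re

def expand_conjunctive_label(label: str) -> list[str]:
    """Heuristically expand a conjunctive activity label such as
    'verify and approve order' into ['verify order', 'approve order'].

    Illustrative only: handles the single pattern
    '<verb> and|or <verb> <shared tail>'; real labels need
    POS tagging and grammar-based parsing as described in the paper.
    """
    m = re.match(r"^(\w+)\s+(?:and|or)\s+(\w+)\s+(.+)$",
                 label.strip(), re.IGNORECASE)
    if not m:
        # No recognizable conjunction: return the label unchanged.
        return [label.strip()]
    first_verb, second_verb, shared_tail = m.groups()
    # Distribute the shared tail (the object phrase) over both verbs.
    return [f"{first_verb} {shared_tail}", f"{second_verb} {shared_tail}"]
```

For example, `expand_conjunctive_label("verify and approve order")` yields `["verify order", "approve order"]`, while a non-conjunctive label such as "send invoice" passes through unchanged.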
- W4312636974 created "2023-01-05" @default.
- W4312636974 creator A5003416961 @default.
- W4312636974 creator A5040640124 @default.
- W4312636974 date "2022-01-01" @default.
- W4312636974 modified "2023-10-17" @default.
- W4312636974 title "Exploring Natural Language Processing in Model-To-Model Transformations" @default.
- W4312636974 cites W1930624869 @default.
- W4312636974 cites W1991154713 @default.
- W4312636974 cites W1996430422 @default.
- W4312636974 cites W2043147031 @default.
- W4312636974 cites W2061167720 @default.
- W4312636974 cites W2068737686 @default.
- W4312636974 cites W2081580037 @default.
- W4312636974 cites W2101706889 @default.
- W4312636974 cites W2102556303 @default.
- W4312636974 cites W2103931177 @default.
- W4312636974 cites W2107598941 @default.
- W4312636974 cites W2123442489 @default.
- W4312636974 cites W2139621418 @default.
- W4312636974 cites W2251534602 @default.
- W4312636974 cites W2251599843 @default.
- W4312636974 cites W2287187499 @default.
- W4312636974 cites W2296283641 @default.
- W4312636974 cites W2484437231 @default.
- W4312636974 cites W2512597464 @default.
- W4312636974 cites W2515384205 @default.
- W4312636974 cites W2519283440 @default.
- W4312636974 cites W2525961169 @default.
- W4312636974 cites W2567181374 @default.
- W4312636974 cites W2594363579 @default.
- W4312636974 cites W2600702321 @default.
- W4312636974 cites W2601354271 @default.
- W4312636974 cites W2740765036 @default.
- W4312636974 cites W2741095062 @default.
- W4312636974 cites W2756381707 @default.
- W4312636974 cites W2790851984 @default.
- W4312636974 cites W2799072540 @default.
- W4312636974 cites W2800905795 @default.
- W4312636974 cites W2804387108 @default.
- W4312636974 cites W2805348366 @default.
- W4312636974 cites W2886646161 @default.
- W4312636974 cites W2911964244 @default.
- W4312636974 cites W2953359159 @default.
- W4312636974 cites W2962739339 @default.
- W4312636974 cites W2963560594 @default.
- W4312636974 cites W2963591283 @default.
- W4312636974 cites W2963625095 @default.
- W4312636974 cites W2963691697 @default.
- W4312636974 cites W2964047910 @default.
- W4312636974 cites W2964081655 @default.
- W4312636974 cites W2964082031 @default.
- W4312636974 cites W2965721744 @default.
- W4312636974 cites W2969783751 @default.
- W4312636974 cites W2970521905 @default.
- W4312636974 cites W2971173350 @default.
- W4312636974 cites W2976765180 @default.
- W4312636974 cites W3011594683 @default.
- W4312636974 cites W3021523115 @default.
- W4312636974 cites W3032553678 @default.
- W4312636974 cites W3035058125 @default.
- W4312636974 cites W3035537500 @default.
- W4312636974 cites W3037109418 @default.
- W4312636974 cites W3088164486 @default.
- W4312636974 cites W3088470625 @default.
- W4312636974 cites W3089393845 @default.
- W4312636974 cites W3100754494 @default.
- W4312636974 cites W3102476541 @default.
- W4312636974 cites W3117798239 @default.
- W4312636974 cites W3118264630 @default.
- W4312636974 cites W3131324121 @default.
- W4312636974 cites W3132069728 @default.
- W4312636974 cites W3154732937 @default.
- W4312636974 cites W3174985167 @default.
- W4312636974 cites W4246711535 @default.
- W4312636974 cites W2606666989 @default.
- W4312636974 doi "https://doi.org/10.1109/access.2022.3219455" @default.
- W4312636974 hasPublicationYear "2022" @default.
- W4312636974 type Work @default.
- W4312636974 citedByCount "0" @default.
- W4312636974 crossrefType "journal-article" @default.
- W4312636974 hasAuthorship W4312636974A5003416961 @default.
- W4312636974 hasAuthorship W4312636974A5040640124 @default.
- W4312636974 hasBestOaLocation W43126369741 @default.
- W4312636974 hasConcept C121934690 @default.
- W4312636974 hasConcept C153962237 @default.
- W4312636974 hasConcept C154945302 @default.
- W4312636974 hasConcept C157659113 @default.
- W4312636974 hasConcept C195807954 @default.
- W4312636974 hasConcept C204321447 @default.
- W4312636974 hasConcept C41008148 @default.
- W4312636974 hasConceptScore W4312636974C121934690 @default.
- W4312636974 hasConceptScore W4312636974C153962237 @default.
- W4312636974 hasConceptScore W4312636974C154945302 @default.
- W4312636974 hasConceptScore W4312636974C157659113 @default.
- W4312636974 hasConceptScore W4312636974C195807954 @default.
- W4312636974 hasConceptScore W4312636974C204321447 @default.
- W4312636974 hasConceptScore W4312636974C41008148 @default.