Matches in SemOpenAlex for { <https://semopenalex.org/work/W2895835746> ?p ?o ?g. }
Showing items 1 to 77 of 77 with 100 items per page.
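The listing below is the result of a query of the form shown above against the SemOpenAlex knowledge graph. As a minimal sketch of how the same predicate/object pairs could be retrieved programmatically (assuming the public SPARQL endpoint at https://semopenalex.org/sparql and the SPARQLWrapper library; the original listing additionally selects the named graph ?g, which is omitted here):

```python
# Sketch: fetch all predicate/object pairs for work W2895835746 from SemOpenAlex.
# The endpoint URL is an assumption about the service's public SPARQL interface.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint
WORK = "https://semopenalex.org/work/W2895835746"

sparql = SPARQLWrapper(ENDPOINT)
sparql.setQuery(f"""
    SELECT ?p ?o WHERE {{
        <{WORK}> ?p ?o .
    }}
""")
sparql.setReturnFormat(JSON)

results = sparql.query().convert()
for binding in results["results"]["bindings"]:
    # Print each predicate/object pair, mirroring the listing below.
    print(binding["p"]["value"], binding["o"]["value"])
```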
- W2895835746 abstract "The goal of automatically encoding natural language text into some formal representation has been pursued in the field of Knowledge Engineering to support the construction of Formal Ontologies. Many state-of-the-art methods have been proposed for the automatic extraction of lightweight Ontologies and for populating them. Only a few have tackled the challenge of extracting expressive axioms that formalize the possibly complex semantics of ontological concepts. In this thesis, we address the problem of encoding a natural language sentence expressing the description of a concept into a corresponding Description Logic axiom. In our approach, the encoding happens through a syntactic transformation, so that all the extralogical symbols in the formula are words actually occurring in the input sentence. We followed recent advances in the field of Deep Learning to design suitable Neural Network architectures capable of learning from examples how to perform this transformation. Since no pre-existing dataset was available to adequately train Neural Networks for this task, we designed a data generation pipeline to produce datasets to train and evaluate the architectures proposed in this thesis. These datasets therefore provide a first reference corpus for the task of learning concept description axioms from text via Machine Learning techniques, and are now available to the Knowledge Engineering community to fill the pre-existing lack of data. During our evaluation, we assessed some key characteristics of the proposed approach. First, we evaluated the capability of the trained models to generalize over the syntactic structures used in the expression of concept descriptions, together with their tolerance to unknown words. The importance of these characteristics is due to the fact that Machine Learning systems are trained on a statistical sample of the problem space, and they have to learn to generalize over this sample in order to process new inputs. In particular, in our scenario, even an extremely large training set cannot include all the possible ways a human can express the definition of a concept. At the same time, part of the human vocabulary is likely to fall outside the training set. Thus, testing these generalization capabilities and the tolerance to unknown words is crucial to evaluating the effectiveness of the model. Second, we evaluated the improvement in the performance of the model when it is incrementally trained with additional training examples. This is also a pivotal characteristic of our approach, since Machine Learning-based systems are typically expected to continuously evolve and improve over the long term, through iterative repetitions of training set enlargements and training process runs. Therefore, a valuable model has to show the ability to improve its performance when new training examples are added to the training set. To the best of our knowledge, this work represents the first assessment of an approach to the problem of encoding expressive concept descriptions from text that is entirely Machine Learning-based and is trained in an end-to-end fashion starting from raw text. In detail, this thesis proposes the first two Neural Network architectures in the literature to solve this problem, together with their evaluation with respect to the above pivotal characteristics, and a first dataset generation pipeline together with concrete datasets." @default.
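As a purely illustrative sketch of the task the abstract describes (this example is not drawn from the thesis itself), a concept description sentence such as "a professor is a person who teaches a course" might be encoded into a Description Logic axiom whose extralogical symbols are all words occurring in the input sentence:

```latex
% Hypothetical input/output pair: the concept names (professor, person, course)
% and the role name (teaches) are words taken directly from the sentence.
\mathit{Professor} \sqsubseteq \mathit{Person} \sqcap \exists\, \mathit{teaches}.\mathit{Course}
```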
- W2895835746 created "2018-10-26" @default.
- W2895835746 creator A5012562128 @default.
- W2895835746 date "2018-09-21" @default.
- W2895835746 modified "2023-09-26" @default.
- W2895835746 title "Learning to Learn Concept Descriptions" @default.
- W2895835746 hasPublicationYear "2018" @default.
- W2895835746 type Work @default.
- W2895835746 sameAs 2895835746 @default.
- W2895835746 citedByCount "0" @default.
- W2895835746 crossrefType "dissertation" @default.
- W2895835746 hasAuthorship W2895835746A5012562128 @default.
- W2895835746 hasConcept C125411270 @default.
- W2895835746 hasConcept C154945302 @default.
- W2895835746 hasConcept C162324750 @default.
- W2895835746 hasConcept C167729594 @default.
- W2895835746 hasConcept C17744445 @default.
- W2895835746 hasConcept C187736073 @default.
- W2895835746 hasConcept C199360897 @default.
- W2895835746 hasConcept C199539241 @default.
- W2895835746 hasConcept C202444582 @default.
- W2895835746 hasConcept C204321447 @default.
- W2895835746 hasConcept C2524010 @default.
- W2895835746 hasConcept C2776359362 @default.
- W2895835746 hasConcept C2777530160 @default.
- W2895835746 hasConcept C2780451532 @default.
- W2895835746 hasConcept C33923547 @default.
- W2895835746 hasConcept C41008148 @default.
- W2895835746 hasConcept C43521106 @default.
- W2895835746 hasConcept C94625758 @default.
- W2895835746 hasConcept C9652623 @default.
- W2895835746 hasConceptScore W2895835746C125411270 @default.
- W2895835746 hasConceptScore W2895835746C154945302 @default.
- W2895835746 hasConceptScore W2895835746C162324750 @default.
- W2895835746 hasConceptScore W2895835746C167729594 @default.
- W2895835746 hasConceptScore W2895835746C17744445 @default.
- W2895835746 hasConceptScore W2895835746C187736073 @default.
- W2895835746 hasConceptScore W2895835746C199360897 @default.
- W2895835746 hasConceptScore W2895835746C199539241 @default.
- W2895835746 hasConceptScore W2895835746C202444582 @default.
- W2895835746 hasConceptScore W2895835746C204321447 @default.
- W2895835746 hasConceptScore W2895835746C2524010 @default.
- W2895835746 hasConceptScore W2895835746C2776359362 @default.
- W2895835746 hasConceptScore W2895835746C2777530160 @default.
- W2895835746 hasConceptScore W2895835746C2780451532 @default.
- W2895835746 hasConceptScore W2895835746C33923547 @default.
- W2895835746 hasConceptScore W2895835746C41008148 @default.
- W2895835746 hasConceptScore W2895835746C43521106 @default.
- W2895835746 hasConceptScore W2895835746C94625758 @default.
- W2895835746 hasConceptScore W2895835746C9652623 @default.
- W2895835746 hasLocation W28958357461 @default.
- W2895835746 hasOpenAccess W2895835746 @default.
- W2895835746 hasPrimaryLocation W28958357461 @default.
- W2895835746 hasRelatedWork W1412015892 @default.
- W2895835746 hasRelatedWork W1515692533 @default.
- W2895835746 hasRelatedWork W1520887062 @default.
- W2895835746 hasRelatedWork W1600788154 @default.
- W2895835746 hasRelatedWork W1677904636 @default.
- W2895835746 hasRelatedWork W1772653913 @default.
- W2895835746 hasRelatedWork W1856419327 @default.
- W2895835746 hasRelatedWork W2122308921 @default.
- W2895835746 hasRelatedWork W2158615804 @default.
- W2895835746 hasRelatedWork W2240922584 @default.
- W2895835746 hasRelatedWork W2787334611 @default.
- W2895835746 hasRelatedWork W2803178218 @default.
- W2895835746 hasRelatedWork W2945429194 @default.
- W2895835746 hasRelatedWork W2950169471 @default.
- W2895835746 hasRelatedWork W2993633978 @default.
- W2895835746 hasRelatedWork W3004632877 @default.
- W2895835746 hasRelatedWork W3007125878 @default.
- W2895835746 hasRelatedWork W3030535585 @default.
- W2895835746 hasRelatedWork W3172267148 @default.
- W2895835746 hasRelatedWork W244792188 @default.
- W2895835746 isParatext "false" @default.
- W2895835746 isRetracted "false" @default.
- W2895835746 magId "2895835746" @default.
- W2895835746 workType "dissertation" @default.