Matches in SemOpenAlex for { <https://semopenalex.org/work/W2768751110> ?p ?o ?g. }
Showing items 1 to 69 of 69, with 100 items per page.
- W2768751110 abstract "This interdisciplinary work proposes new hierarchical classification algorithms and evaluates them on biological datasets, specifically on ageing-related datasets. Hierarchical classification is a type of classification task where the classes to be predicted are organized into a hierarchical structure. The focus on ageing is justified by the increasing impact that ageing-related diseases have on the human population and by the increasing amount of freely available ageing-related data. The main contributions of this thesis are as follows. First, we improve the running time of a previously proposed hierarchical classification algorithm based on an extension of the well-known Naive Bayes classification algorithm. We show that our modification greatly improves the runtime of the hierarchical classification algorithm while maintaining its predictive performance. We also propose four new hierarchical classification algorithms. The focus on hierarchical classification algorithms and their evaluation on biological data is justified because the class labels of biological data are commonly organized into class hierarchies. Two of our four new hierarchical classification algorithms - the Hierarchical Dependence Network (HDN) algorithm and the Hierarchical Dependence Network algorithm based on finding non-Hierarchically related Predictive Classes (HDN-nHPC) - are based on Dependence Networks, a relatively new type of probabilistic graphical model that has not yet received much attention from the classification community. The other two hierarchical classification algorithms we propose are hybrid algorithms that use the hierarchical classification models produced by the Predictive Clustering Tree (PCT) algorithm. One of the hybrids combines the models produced by the PCT algorithm and a Local Hierarchical Classification (LHC) algorithm (which essentially induces a local model for each class in the hierarchy). The other hybrid combines the models produced by the PCT and HDN algorithms. We have tested our four proposed algorithms and four other commonly used hierarchical classification algorithms on 42 hierarchical classification datasets; 20 of these datasets were created by us and are freely available to researchers. We have concluded that, for one of the three hierarchical predictive accuracy measures used in our experiments, one of our four new algorithms (the HDN-nHPC algorithm) outperforms all seven other algorithms in terms of average rank across the 42 hierarchical classification datasets. We have also proposed the first meta-learning approach for hierarchical classification problems. In meta-learning, each meta-instance represents a dataset, meta-features represent dataset properties, and meta-classes represent the best classification algorithm for the corresponding dataset (meta-instance). Hence, meta-learning techniques for classification use the predictive performance of some candidate classification algorithms on previously tested datasets, together with dataset descriptors (the meta-features), to infer the performance of those candidate algorithms on new datasets, given the meta-features of those new datasets. The predictions of our meta-learning system can be used as a guide for choosing which hierarchical classification algorithm (out of a set of candidates) to use on a new dataset, without the need for time-consuming trial-and-error experiments with those candidate algorithms. This is particularly important for hierarchical classification problems, as the training time of hierarchical classification algorithms tends to be much greater than that of 'flat' classification algorithms. This increased training time is mainly due to the typically much larger number of class labels that annotate the instances of hierarchical classification problems. We have tested the predictive power of our meta-learning system and interpreted some of the generated meta-models. We have concluded that our meta-learning system had good predictive performance when compared to other baseline meta-learning approaches. We have also concluded that the meta-rules generated by our meta-learning system were useful for identifying dataset characteristics that assist the choice of hierarchical classification algorithm. Finally, we have reviewed the current practice of applying supervised machine learning (classification and regression) algorithms to study the biology of ageing. This review discusses the main findings of such algorithms in the context of the ageing biology literature. We have also interpreted some of the hierarchical classification models generated in our experiments. Both the above literature review and the interpretation of some models were performed in collaboration with an ageing expert, in order to extract relevant information for ageing research." (see the illustrative meta-learning sketch after this listing) @default.
- W2768751110 created "2017-12-04" @default.
- W2768751110 creator A5013525080 @default.
- W2768751110 date "2017-09-01" @default.
- W2768751110 modified "2023-09-28" @default.
- W2768751110 title "New Probabilistic Graphical Models and Meta-Learning Approaches for Hierarchical Classification, with Applications in Bioinformatics and Ageing" @default.
- W2768751110 hasPublicationYear "2017" @default.
- W2768751110 type Work @default.
- W2768751110 sameAs 2768751110 @default.
- W2768751110 citedByCount "0" @default.
- W2768751110 crossrefType "dissertation" @default.
- W2768751110 hasAuthorship W2768751110A5013525080 @default.
- W2768751110 hasConcept C110083411 @default.
- W2768751110 hasConcept C11413529 @default.
- W2768751110 hasConcept C119857082 @default.
- W2768751110 hasConcept C120665830 @default.
- W2768751110 hasConcept C121332964 @default.
- W2768751110 hasConcept C12267149 @default.
- W2768751110 hasConcept C124101348 @default.
- W2768751110 hasConcept C154945302 @default.
- W2768751110 hasConcept C192209626 @default.
- W2768751110 hasConcept C2777212361 @default.
- W2768751110 hasConcept C41008148 @default.
- W2768751110 hasConcept C49937458 @default.
- W2768751110 hasConcept C52001869 @default.
- W2768751110 hasConcept C73555534 @default.
- W2768751110 hasConcept C92835128 @default.
- W2768751110 hasConceptScore W2768751110C110083411 @default.
- W2768751110 hasConceptScore W2768751110C11413529 @default.
- W2768751110 hasConceptScore W2768751110C119857082 @default.
- W2768751110 hasConceptScore W2768751110C120665830 @default.
- W2768751110 hasConceptScore W2768751110C121332964 @default.
- W2768751110 hasConceptScore W2768751110C12267149 @default.
- W2768751110 hasConceptScore W2768751110C124101348 @default.
- W2768751110 hasConceptScore W2768751110C154945302 @default.
- W2768751110 hasConceptScore W2768751110C192209626 @default.
- W2768751110 hasConceptScore W2768751110C2777212361 @default.
- W2768751110 hasConceptScore W2768751110C41008148 @default.
- W2768751110 hasConceptScore W2768751110C49937458 @default.
- W2768751110 hasConceptScore W2768751110C52001869 @default.
- W2768751110 hasConceptScore W2768751110C73555534 @default.
- W2768751110 hasConceptScore W2768751110C92835128 @default.
- W2768751110 hasLocation W27687511101 @default.
- W2768751110 hasOpenAccess W2768751110 @default.
- W2768751110 hasPrimaryLocation W27687511101 @default.
- W2768751110 hasRelatedWork W1020256256 @default.
- W2768751110 hasRelatedWork W1569704244 @default.
- W2768751110 hasRelatedWork W1618913257 @default.
- W2768751110 hasRelatedWork W172634345 @default.
- W2768751110 hasRelatedWork W1995292650 @default.
- W2768751110 hasRelatedWork W2011929826 @default.
- W2768751110 hasRelatedWork W2113233913 @default.
- W2768751110 hasRelatedWork W2119554001 @default.
- W2768751110 hasRelatedWork W2199542590 @default.
- W2768751110 hasRelatedWork W2559713677 @default.
- W2768751110 hasRelatedWork W2801814382 @default.
- W2768751110 hasRelatedWork W2817593237 @default.
- W2768751110 hasRelatedWork W2887473762 @default.
- W2768751110 hasRelatedWork W2896447991 @default.
- W2768751110 hasRelatedWork W2898576334 @default.
- W2768751110 hasRelatedWork W2988087211 @default.
- W2768751110 hasRelatedWork W3113997496 @default.
- W2768751110 hasRelatedWork W3137802172 @default.
- W2768751110 hasRelatedWork W3162849074 @default.
- W2768751110 hasRelatedWork W2594692762 @default.
- W2768751110 isParatext "false" @default.
- W2768751110 isRetracted "false" @default.
- W2768751110 magId "2768751110" @default.
- W2768751110 workType "dissertation" @default.
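The abstract above describes the meta-learning setup only in prose: each meta-instance is a dataset, the meta-features are dataset properties, and the meta-class is the best-performing candidate algorithm for that dataset. The sketch below is a minimal, hypothetical illustration of that general idea, not the thesis's actual system; the meta-features, the candidate algorithm names, all numeric values, and the choice of a scikit-learn decision tree as the meta-model are assumptions made purely for illustration.

```python
# Minimal sketch of meta-learning for algorithm selection, assuming a
# hypothetical meta-dataset. This is NOT the system from the thesis; every
# value and feature name here is invented for illustration only.
from sklearn.tree import DecisionTreeClassifier, export_text

# Hypothetical meta-features describing previously benchmarked datasets:
# [number of instances, number of class labels, depth of the class hierarchy].
meta_features = [
    [1200,  150, 4],
    [5400,  800, 7],
    [ 300,   40, 3],
    [9800, 1200, 9],
]

# Meta-class: which candidate hierarchical classifier performed best on each
# dataset in earlier (offline) experiments. Labels are illustrative only.
best_algorithm = ["PCT", "HDN-nHPC", "PCT", "HDN-nHPC"]

# The meta-model is an ordinary "flat" classifier trained on the meta-dataset.
meta_model = DecisionTreeClassifier(max_depth=2, random_state=0)
meta_model.fit(meta_features, best_algorithm)

# For a new dataset, compute its meta-features and ask the meta-model which
# candidate algorithm to try first, instead of running trial-and-error
# experiments with every candidate.
new_dataset = [[4000, 600, 6]]
print(meta_model.predict(new_dataset))

# Readable meta-rules, analogous in spirit to the interpretable meta-rules
# the abstract says were extracted from the generated meta-models.
print(export_text(
    meta_model,
    feature_names=["n_instances", "n_classes", "hierarchy_depth"],
))
```

A decision tree is used here only because it yields human-readable rules, echoing the abstract's point that interpretable meta-rules can help identify which dataset characteristics drive the choice of hierarchical classification algorithm.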