Matches in SemOpenAlex for { <https://semopenalex.org/work/W3200863653> ?p ?o ?g. }
- W3200863653 abstract "Learning accurate classifiers from preclassified data is a very active research topic in machine learning and artificial intelligence. There are numerous classifier paradigms, among which Bayesian Networks are very effective and well known in domains with uncertainty. Bayesian Networks are widely used representation frameworks for reasoning with probabilistic information. These models use graphs to capture dependence and independence relationships between feature variables, allowing a concise representation of the knowledge as well as efficient graph-based query processing algorithms. This representation is defined by two components: structure learning and parameter learning. The structure of this model is a directed acyclic graph. The nodes in the graph correspond to the feature variables in the domain, and the arcs (edges) show the causal relationships between feature variables. A directed edge relates the variables so that the variable corresponding to the terminal node (child) is conditioned on the variable corresponding to the initial node (parent). Parameter learning estimates probabilities and conditional probabilities based on prior information or past experience. The set of probabilities is represented in the conditional probability table. Once the network structure is constructed, probabilistic inferences are readily calculated and can be performed to predict the outcome of some variables based on observations of others. However, structure learning is a complex problem, since the number of candidate structures grows exponentially as the number of feature variables increases. This thesis is devoted to the development of methods for learning the structures and parameters of Bayesian Networks. Different models based on optimization techniques are introduced to construct an optimal structure of a Bayesian Network. These models also improve the Naive Bayes structure by developing new algorithms that relax its independence assumptions. We present various models to learn the parameters of Bayesian Networks; in particular, we propose optimization models for Naive Bayes and the Tree Augmented Naive Bayes, considering different objective functions. To solve the corresponding optimization problems in Bayesian Networks, we develop new optimization algorithms. Local optimization methods are introduced based on a combination of the gradient and Newton methods. It is proved that the proposed methods are globally convergent and have superlinear convergence rates. For global search we use the global optimization method AGOP, implemented in the open software library GANSO. We apply the proposed local methods in combination with AGOP. Therefore, the main contributions of this thesis include (a) new algorithms for learning an optimal structure of a Bayesian Network; (b) new models for learning the parameters of Bayesian Networks with given structures; and finally (c) new optimization algorithms for optimizing the proposed models in (a) and (b). To validate the proposed…" @default.
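The abstract's description of a Bayesian Network — a directed acyclic graph whose edges condition a child on its parent, with parameters held in a conditional probability table and inference computed from observations — can be sketched in miniature. The following is an illustrative toy example, not code from the thesis: a two-node network A → B over binary variables, with the CPT filled by simple maximum-likelihood counting from made-up data and inference done via Bayes' rule. All variable names and data are hypothetical.

```python
# Hypothetical two-node Bayesian Network A -> B over binary variables.
# Structure: a single directed edge, so B (child) is conditioned on A (parent).
# Parameter learning: maximum-likelihood estimates (counting) fill the
# conditional probability table from toy "preclassified" data.

data = [  # (a, b) observations; values are 0/1
    (1, 1), (1, 1), (1, 0), (0, 1), (0, 0), (0, 0), (1, 1), (0, 0),
]

# Prior P(A), estimated from the data.
n = len(data)
p_a = {a: sum(1 for x, _ in data if x == a) / n for a in (0, 1)}

# CPT P(B | A): one distribution over B per parent configuration of A.
p_b_given_a = {}
for a in (0, 1):
    rows = [b for x, b in data if x == a]
    p_b_given_a[a] = {b: rows.count(b) / len(rows) for b in (0, 1)}

def posterior_a(b):
    """Infer A from an observed B: P(A | B=b) ∝ P(A) * P(B=b | A)."""
    joint = {a: p_a[a] * p_b_given_a[a][b] for a in (0, 1)}
    z = sum(joint.values())  # normalizing constant
    return {a: v / z for a, v in joint.items()}

print(posterior_a(1))  # posterior over A after observing B=1
```

With many feature variables, the number of candidate DAG structures grows super-exponentially, which is why the thesis treats structure learning as an optimization problem rather than exhaustive search.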
- W3200863653 created "2021-09-27" @default.
- W3200863653 creator A5077896617 @default.
- W3200863653 date "2012-01-01" @default.
- W3200863653 modified "2023-09-26" @default.
- W3200863653 title "Learning Bayesian networks based on optimization approaches" @default.
- W3200863653 cites W113578367 @default.
- W3200863653 cites W1490813563 @default.
- W3200863653 cites W1501858372 @default.
- W3200863653 cites W1507029541 @default.
- W3200863653 cites W1514758555 @default.
- W3200863653 cites W1523293200 @default.
- W3200863653 cites W1523680690 @default.
- W3200863653 cites W1524326598 @default.
- W3200863653 cites W1530964327 @default.
- W3200863653 cites W1535430927 @default.
- W3200863653 cites W1550377132 @default.
- W3200863653 cites W155356325 @default.
- W3200863653 cites W1563779243 @default.
- W3200863653 cites W1565377632 @default.
- W3200863653 cites W1588357372 @default.
- W3200863653 cites W1668763905 @default.
- W3200863653 cites W1669437150 @default.
- W3200863653 cites W1671463325 @default.
- W3200863653 cites W1676820704 @default.
- W3200863653 cites W1678889691 @default.
- W3200863653 cites W1755360231 @default.
- W3200863653 cites W1801737117 @default.
- W3200863653 cites W1912123407 @default.
- W3200863653 cites W1934306740 @default.
- W3200863653 cites W1968814228 @default.
- W3200863653 cites W1975551153 @default.
- W3200863653 cites W1979052604 @default.
- W3200863653 cites W1981218190 @default.
- W3200863653 cites W1990411923 @default.
- W3200863653 cites W1991025539 @default.
- W3200863653 cites W2003679080 @default.
- W3200863653 cites W2007598972 @default.
- W3200863653 cites W2008524883 @default.
- W3200863653 cites W2018571157 @default.
- W3200863653 cites W2042878414 @default.
- W3200863653 cites W2055037429 @default.
- W3200863653 cites W2066287192 @default.
- W3200863653 cites W2066718979 @default.
- W3200863653 cites W2069469807 @default.
- W3200863653 cites W2071126205 @default.
- W3200863653 cites W2086815218 @default.
- W3200863653 cites W2089133220 @default.
- W3200863653 cites W2092915639 @default.
- W3200863653 cites W2099900459 @default.
- W3200863653 cites W2101276256 @default.
- W3200863653 cites W2113001205 @default.
- W3200863653 cites W2115251651 @default.
- W3200863653 cites W2115746342 @default.
- W3200863653 cites W2116882041 @default.
- W3200863653 cites W2117783440 @default.
- W3200863653 cites W2118910129 @default.
- W3200863653 cites W2119394710 @default.
- W3200863653 cites W2121278962 @default.
- W3200863653 cites W2122003883 @default.
- W3200863653 cites W2125055259 @default.
- W3200863653 cites W2128535275 @default.
- W3200863653 cites W2137651144 @default.
- W3200863653 cites W2142390772 @default.
- W3200863653 cites W2150294521 @default.
- W3200863653 cites W2155593578 @default.
- W3200863653 cites W2157791002 @default.
- W3200863653 cites W2161369142 @default.
- W3200863653 cites W2161632986 @default.
- W3200863653 cites W2168175751 @default.
- W3200863653 cites W2170112109 @default.
- W3200863653 cites W2170653744 @default.
- W3200863653 cites W2171265988 @default.
- W3200863653 cites W2186581219 @default.
- W3200863653 cites W2189552671 @default.
- W3200863653 cites W2276580322 @default.
- W3200863653 cites W2296319761 @default.
- W3200863653 cites W2471988522 @default.
- W3200863653 cites W2481757588 @default.
- W3200863653 cites W2787523326 @default.
- W3200863653 cites W5056303 @default.
- W3200863653 cites W69257054 @default.
- W3200863653 cites W8978941 @default.
- W3200863653 cites W3149407881 @default.
- W3200863653 hasPublicationYear "2012" @default.
- W3200863653 type Work @default.
- W3200863653 sameAs 3200863653 @default.
- W3200863653 citedByCount "0" @default.
- W3200863653 crossrefType "journal-article" @default.
- W3200863653 hasAuthorship W3200863653A5077896617 @default.
- W3200863653 hasConcept C11413529 @default.
- W3200863653 hasConcept C119857082 @default.
- W3200863653 hasConcept C132525143 @default.
- W3200863653 hasConcept C138885662 @default.
- W3200863653 hasConcept C154945302 @default.
- W3200863653 hasConcept C155846161 @default.
- W3200863653 hasConcept C2776401178 @default.
- W3200863653 hasConcept C33724603 @default.
- W3200863653 hasConcept C33923547 @default.
- W3200863653 hasConcept C41008148 @default.