Matches in SemOpenAlex for { <https://semopenalex.org/work/W568924265> ?p ?o ?g. }
- W568924265 abstract "In recent years there has been an increased interest in applying non-parametric methods to real-world problems. Significant research has been devoted to Gaussian processes (GPs) due to their increased flexibility when compared with parametric models. These methods use Bayesian learning, which generally leads to analytically intractable posteriors. This thesis proposes a two-step solution to construct a probabilistic approximation to the posterior. In the first step we adapt Bayesian online learning to GPs: the final approximation to the posterior is the result of propagating the first and second moments of intermediate posteriors obtained by combining a new example with the previous approximation. The propagation of functional forms is solved by showing the existence of a parametrisation of the posterior moments that uses combinations of the kernel function at the training points, transforming the Bayesian online learning of functions into a parametric formulation. The drawback is the prohibitive quadratic scaling of the number of parameters with the size of the data, making the method inapplicable to large datasets. The second step solves the problem of the exploding parameter size and makes GPs applicable to arbitrarily large datasets. The approximation is based on a measure of distance between two GPs, the KL-divergence between GPs. This second approximation uses a constrained GP in which only a small subset of the whole training dataset is used to represent the GP. This subset is called the Basis Vector, or BV, set, and the resulting GP is a sparse approximation to the true posterior. As this sparsity is based on the KL-minimisation, it is probabilistic and independent of the way the posterior approximation from the first step is obtained. We combine the sparse approximation with an extension to the Bayesian online algorithm that allows multiple iterations for each input, thus approximating a batch solution. The resulting sparse learning algorithm is a generic one: for different problems we only change the likelihood. The algorithm is applied to a variety of problems, and we examine its performance both on classical regression and classification tasks and on data assimilation and a simple density-estimation problem." @default.
- W568924265 created "2016-06-24" @default.
- W568924265 creator A5048206013 @default.
- W568924265 date "2002-03-01" @default.
- W568924265 modified "2023-09-26" @default.
- W568924265 title "Gaussian processes: iterative sparse approximations" @default.
- W568924265 cites W103771591 @default.
- W568924265 cites W1483816357 @default.
- W568924265 cites W1497193185 @default.
- W568924265 cites W1500174157 @default.
- W568924265 cites W1528905581 @default.
- W568924265 cites W1540155273 @default.
- W568924265 cites W1549656520 @default.
- W568924265 cites W1551209770 @default.
- W568924265 cites W1554663460 @default.
- W568924265 cites W1583555936 @default.
- W568924265 cites W1598808445 @default.
- W568924265 cites W1604938182 @default.
- W568924265 cites W1618393386 @default.
- W568924265 cites W1648445109 @default.
- W568924265 cites W1676317136 @default.
- W568924265 cites W1746680969 @default.
- W568924265 cites W1859781365 @default.
- W568924265 cites W1934021597 @default.
- W568924265 cites W1963533512 @default.
- W568924265 cites W1973310094 @default.
- W568924265 cites W1976990135 @default.
- W568924265 cites W1986931325 @default.
- W568924265 cites W1989730800 @default.
- W568924265 cites W2009511046 @default.
- W568924265 cites W2010713800 @default.
- W568924265 cites W2011226036 @default.
- W568924265 cites W2014158063 @default.
- W568924265 cites W2015336262 @default.
- W568924265 cites W2015904350 @default.
- W568924265 cites W2027230546 @default.
- W568924265 cites W2033839039 @default.
- W568924265 cites W2045656233 @default.
- W568924265 cites W2069543779 @default.
- W568924265 cites W2080021732 @default.
- W568924265 cites W2083402998 @default.
- W568924265 cites W2087603457 @default.
- W568924265 cites W2088032561 @default.
- W568924265 cites W2102201073 @default.
- W568924265 cites W2103633133 @default.
- W568924265 cites W2105934661 @default.
- W568924265 cites W2107152312 @default.
- W568924265 cites W2108807072 @default.
- W568924265 cites W2109816097 @default.
- W568924265 cites W2111611307 @default.
- W568924265 cites W2112545207 @default.
- W568924265 cites W2116723448 @default.
- W568924265 cites W2117063635 @default.
- W568924265 cites W2119388857 @default.
- W568924265 cites W2123687908 @default.
- W568924265 cites W2123838014 @default.
- W568924265 cites W2124776405 @default.
- W568924265 cites W2125027820 @default.
- W568924265 cites W2125141294 @default.
- W568924265 cites W2125452049 @default.
- W568924265 cites W2128659236 @default.
- W568924265 cites W2129564505 @default.
- W568924265 cites W2129809168 @default.
- W568924265 cites W2129869373 @default.
- W568924265 cites W2132211083 @default.
- W568924265 cites W2137512539 @default.
- W568924265 cites W2139212933 @default.
- W568924265 cites W2139479120 @default.
- W568924265 cites W2140095548 @default.
- W568924265 cites W2141274633 @default.
- W568924265 cites W2143022286 @default.
- W568924265 cites W2143956139 @default.
- W568924265 cites W2148694408 @default.
- W568924265 cites W2149842772 @default.
- W568924265 cites W2154371297 @default.
- W568924265 cites W2156909104 @default.
- W568924265 cites W2157005274 @default.
- W568924265 cites W2161767008 @default.
- W568924265 cites W2171343216 @default.
- W568924265 cites W2408196097 @default.
- W568924265 cites W2949744307 @default.
- W568924265 cites W2962776596 @default.
- W568924265 cites W3021510377 @default.
- W568924265 cites W3023786531 @default.
- W568924265 cites W304861154 @default.
- W568924265 cites W71499226 @default.
- W568924265 cites W94523489 @default.
- W568924265 cites W2065224931 @default.
- W568924265 hasPublicationYear "2002" @default.
- W568924265 type Work @default.
- W568924265 sameAs 568924265 @default.
- W568924265 citedByCount "66" @default.
- W568924265 countsByYear W5689242652012 @default.
- W568924265 countsByYear W5689242652013 @default.
- W568924265 countsByYear W5689242652014 @default.
- W568924265 countsByYear W5689242652015 @default.
- W568924265 countsByYear W5689242652016 @default.
- W568924265 countsByYear W5689242652017 @default.
- W568924265 countsByYear W5689242652018 @default.
- W568924265 countsByYear W5689242652019 @default.
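The abstract above describes online GP learning with a small Basis Vector (BV) set selected by a KL-based criterion. The following is a minimal, self-contained sketch of that idea, not the thesis's actual moment-propagation algorithm: it admits a new point into the BV set only if its "novelty" (the prior variance left after projecting its kernel column onto the current BV set) exceeds a threshold, and it predicts with exact GP regression restricted to the BV set. The class name `SparseOnlineGP`, the RBF kernel, and all parameter values are illustrative assumptions.

```python
import numpy as np

def rbf(a, b, ell=1.0):
    """Squared-exponential kernel between point sets a (m, d) and b (n, d)."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

class SparseOnlineGP:
    """Toy sparse online GP regressor (illustrative sketch only).

    A new input is kept as a Basis Vector only if its novelty gamma --
    the variance left after projecting its kernel column onto the BV
    set -- exceeds `tol`, or until `max_bv` vectors have been stored.
    """
    def __init__(self, tol=1e-3, noise=0.05, max_bv=50):
        self.tol, self.noise, self.max_bv = tol, noise, max_bv
        self.X = np.empty((0, 1))   # BV inputs
        self.y = np.empty(0)        # BV targets

    def update(self, x, y):
        """Process one example; return True if it was added to the BV set."""
        x = np.atleast_2d(x)
        if len(self.X) == 0:
            self.X, self.y = x.astype(float), np.array([float(y)])
            return True
        if len(self.X) >= self.max_bv:
            return False
        k = rbf(self.X, x)  # kernel column against current BV set, (m, 1)
        K = rbf(self.X, self.X) + 1e-10 * np.eye(len(self.X))
        # Novelty: prior variance unexplained by the BV set.
        gamma = (rbf(x, x) - k.T @ np.linalg.solve(K, k)).item()
        if gamma < self.tol:
            return False    # point is already well represented; skip it
        self.X = np.vstack([self.X, x])
        self.y = np.append(self.y, float(y))
        return True

    def predict(self, xs):
        """Posterior mean at test inputs xs, using only the BV set."""
        xs = np.atleast_2d(xs)
        K = rbf(self.X, self.X) + self.noise**2 * np.eye(len(self.X))
        ks = rbf(self.X, xs)
        return ks.T @ np.linalg.solve(K, self.y)
```

Because rejected points are simply discarded here, this sketch is cruder than the KL-projection the abstract describes (which folds a rejected point's information back into the stored moments); it only illustrates how a BV set keeps the parameter count bounded while the stream is arbitrarily long.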