Matches in SemOpenAlex for { <https://semopenalex.org/work/W2186459748> ?p ?o ?g. }
- W2186459748 abstract "Supervised learning in general, and regularized risk minimization in particular, is about solving an optimization problem that is jointly defined by a performance measure and a set of labeled training examples. The outcome of learning, a model, is then used mainly for predicting the labels of unlabeled examples in the testing environment. In real-world scenarios, a typical learning process often involves solving a sequence of similar problems with different parameters before a final model is identified. For learning to be successful, the final model must be produced in a timely manner, and the model should be robust to (mild) irregularities in the testing environment. The purpose of this thesis is to investigate ways to speed up the learning process and improve the robustness of the learned model. We first develop a batch convex optimization solver specialized to regularized risk minimization, based on standard bundle methods. The solver inherits two main properties of the standard bundle methods. Firstly, it is capable of solving both differentiable and non-differentiable problems, so its implementation can be reused for different tasks with minimal modification. Secondly, the optimization is easily amenable to parallel and distributed computation settings, which makes the solver highly scalable in the number of training examples. However, unlike the standard bundle methods, the solver does not have extra parameters that need careful tuning. Furthermore, we prove that the solver has a faster convergence rate. In addition, the solver is very efficient at computing an approximate regularization path and performing model selection. We also present a convex risk formulation for incorporating invariances and prior knowledge into the learning problem. This formulation generalizes many existing approaches to robust learning in the setting of insufficient or noisy training examples and covariate shift.
Lastly, we extend a non-convex risk formulation for binary classification to structured prediction. Empirical results show that the model obtained with this risk formulation is robust to outliers in the training examples." @default.
- W2186459748 created "2016-06-24" @default.
- W2186459748 creator A5029252731 @default.
- W2186459748 date "2010-01-01" @default.
- W2186459748 modified "2023-09-26" @default.
- W2186459748 title "Bundle Methods for Regularized Risk Minimization with Applications to Robust Learning" @default.
- W2186459748 cites W147998453 @default.
- W2186459748 cites W1484982917 @default.
- W2186459748 cites W1485852130 @default.
- W2186459748 cites W1491531375 @default.
- W2186459748 cites W1502069036 @default.
- W2186459748 cites W1512098439 @default.
- W2186459748 cites W1515020792 @default.
- W2186459748 cites W1522068547 @default.
- W2186459748 cites W1532247067 @default.
- W2186459748 cites W1546961578 @default.
- W2186459748 cites W1555115156 @default.
- W2186459748 cites W1561572740 @default.
- W2186459748 cites W1569090332 @default.
- W2186459748 cites W1581001755 @default.
- W2186459748 cites W1593384057 @default.
- W2186459748 cites W1657213141 @default.
- W2186459748 cites W1663792126 @default.
- W2186459748 cites W1667072054 @default.
- W2186459748 cites W1714704734 @default.
- W2186459748 cites W1799024362 @default.
- W2186459748 cites W18289225 @default.
- W2186459748 cites W1871180460 @default.
- W2186459748 cites W1902387477 @default.
- W2186459748 cites W192333851 @default.
- W2186459748 cites W1964290766 @default.
- W2186459748 cites W1972950354 @default.
- W2186459748 cites W1978259121 @default.
- W2186459748 cites W1978669994 @default.
- W2186459748 cites W1979459040 @default.
- W2186459748 cites W1982032418 @default.
- W2186459748 cites W1982511487 @default.
- W2186459748 cites W1983599491 @default.
- W2186459748 cites W1985554184 @default.
- W2186459748 cites W1993343369 @default.
- W2186459748 cites W1993613781 @default.
- W2186459748 cites W199761101 @default.
- W2186459748 cites W2005688170 @default.
- W2186459748 cites W2008772536 @default.
- W2186459748 cites W2009593947 @default.
- W2186459748 cites W2014566476 @default.
- W2186459748 cites W2015263936 @default.
- W2186459748 cites W2021897193 @default.
- W2186459748 cites W2022150446 @default.
- W2186459748 cites W2022393739 @default.
- W2186459748 cites W2025732832 @default.
- W2186459748 cites W2029538739 @default.
- W2186459748 cites W2030811966 @default.
- W2186459748 cites W2031248101 @default.
- W2186459748 cites W2033468335 @default.
- W2186459748 cites W2033957915 @default.
- W2186459748 cites W2035720976 @default.
- W2186459748 cites W2039050532 @default.
- W2186459748 cites W2039110678 @default.
- W2186459748 cites W2041928877 @default.
- W2186459748 cites W2047092297 @default.
- W2186459748 cites W2051381803 @default.
- W2186459748 cites W2055586095 @default.
- W2186459748 cites W2062525688 @default.
- W2186459748 cites W2063978378 @default.
- W2186459748 cites W2069317438 @default.
- W2186459748 cites W2069808690 @default.
- W2186459748 cites W2070771761 @default.
- W2186459748 cites W2073226829 @default.
- W2186459748 cites W2074522218 @default.
- W2186459748 cites W2087347434 @default.
- W2186459748 cites W2091371007 @default.
- W2186459748 cites W2097584995 @default.
- W2186459748 cites W2101919228 @default.
- W2186459748 cites W2101974476 @default.
- W2186459748 cites W2102909657 @default.
- W2186459748 cites W2102943360 @default.
- W2186459748 cites W2103017472 @default.
- W2186459748 cites W2105636360 @default.
- W2186459748 cites W2105644991 @default.
- W2186459748 cites W2105842272 @default.
- W2186459748 cites W2105895401 @default.
- W2186459748 cites W2107275319 @default.
- W2186459748 cites W2108153239 @default.
- W2186459748 cites W2108423015 @default.
- W2186459748 cites W2110325612 @default.
- W2186459748 cites W2111296615 @default.
- W2186459748 cites W2112530506 @default.
- W2186459748 cites W2113651538 @default.
- W2186459748 cites W2114296159 @default.
- W2186459748 cites W2118585731 @default.
- W2186459748 cites W2119821739 @default.
- W2186459748 cites W2120340025 @default.
- W2186459748 cites W2123729042 @default.
- W2186459748 cites W2123737232 @default.
- W2186459748 cites W2125993116 @default.
- W2186459748 cites W2127091553 @default.
- W2186459748 cites W2129191766 @default.
- W2186459748 cites W2130698119 @default.
- W2186459748 cites W2132347401 @default.