Matches in SemOpenAlex for { <https://semopenalex.org/work/W194329854> ?p ?o ?g. }
Showing items 1 to 85 of 85, with 100 items per page.
- W194329854 abstract "In this thesis, we develop tractable relaxations and efficient algorithms for large-scale optimization. Our developments are motivated by a recent paradigm, Compressed Sensing, which covers a multitude of large-scale, sparsity-oriented convex optimization problems. Compressed sensing focuses on the recovery of sparse or well-concentrated signals from possibly noisy observations in a low-dimensional space. Nowadays, this theory is successfully utilized in many fields, ranging from MRI image processing to machine learning and from biology to statistics. In the first chapter of this thesis, we provide a general introduction to compressed sensing and its applications and cover some of the earlier results. The majority of results in compressed sensing theory rely on the ability to design or use projection matrices with good recoverability properties. In the second chapter of this thesis, we study the conditions for good recoverability properties of a sensing matrix. We propose necessary and sufficient conditions for a sensing matrix to allow for exact ℓ1-recovery of sparse signals with at most s nonzero entries while utilizing a priori information given in the form of sign restrictions on part of the entries. We express error bounds for imperfect ℓ1-recovery in terms of the characteristics underlying these conditions. These characteristics, although difficult to evaluate, lead to two different verifiable sufficient conditions, which can be efficiently computed via linear programming (LP) and/or semidefinite programming (SDP) and thus generate efficiently computable lower bounds on the level of sparsity, s, for which a given sensing matrix is shown to allow for exact ℓ1-recovery. We analyze the connection between our LP- and SDP-based verifiable sufficient conditions, examine their properties, describe their limits of performance, and provide numerical examples comparing them with other verifiable conditions from the literature. Even though our LP- and SDP-based relaxations are presented in the compressed sensing framework, these techniques are generic and applicable in the case of disjoint bilinear programs. In the third chapter, we study the compressed sensing synthesis problem: selecting the minimum number of rows from a given matrix so that the resulting submatrix possesses certifiably good recovery properties. Starting from the verifiable sufficient conditions, we express the synthesis problem as the problem of approximating a given matrix by a matrix of specified low rank in the uniform norm. We develop a randomized algorithm for the efficient construction of rank-k approximations of m × n matrices, achieving accuracy bounds O(1)√(ln(mn)/k) which hold in expectation or with high probability. We supply a derandomized version of our approximation algorithm and provide numerical results on its performance for the synthesis problem. Chapter 4 is dedicated to efficient first-order algorithms for large-scale, well-structured convex optimization problems. Saddle point reformulations have proven to be an effective tool for exploiting problem structure when designing computationally efficient algorithms. Building upon their strength, we first demonstrate that the solutions to many large-scale problems arising from compressed sensing recovery, high-dimensional statistical inference, and machine learning can be obtained by solving a series of Bilinear Saddle Point problems (BSPs).
We accelerate the solution of the associated single-parametric BSPs by utilizing the Mirror Prox algorithm from [101] as a prototype and by replacing the precise first-order oracle (which becomes quite time-consuming in the extremely large-scale case) with its computationally cheap randomized counterpart. In the overall solution of parametric BSPs, cheap online assessment of solution quality is crucial. Our randomized algorithms come with exact guarantees on solution quality and achieve sublinear-time behavior when solving large-scale parametric BSPs. Extensive simulations show that our randomized first-order methods are capable of handling very large-scale applications and improve considerably over state-of-the-art deterministic algorithms, with the benefits amplifying as the sizes of the problems grow. In the fifth chapter, we examine a more general sparse estimation problem: estimating a signal from its undersampled observations corrupted with nuisance and stochastic noise. Instead of the standard sparse signal framework, here we work under the assumption that a priori information is presented via a block representation structure of a known linear transform of the signal, and that the signal admits a good approximation, in the block-sparse sense, in this representation structure. There are a number of important applications where such a nontrivial sparsifying representation arises naturally, such as standard image reconstruction with Total Variation regularization or finding the solution of a linear finite-difference equation with a sparse right-hand side (“evolution of a linear plant corrected from time to time by impulse control”). We show that an extension of the standard compressed sensing results from [79] to this framework is possible. In particular, we introduce a family of conditions, suggest two new recovery methods based on block-ℓ1 minimization, and study the most common cases of the block representation structure under which these estimators have efficiently verifiable guarantees of performance. We link our performance estimates to well-known compressed sensing results by relating our conditions to the Restricted Isometry Property. This also establishes connections between the new techniques and classical methods such as the Lasso and the Dantzig Selector. In the last chapter, we present a summary of the conclusions of our study and provide future research directions." @default.
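  The recovery primitive at the heart of the abstract, exact ℓ1-recovery of a sparse signal with sign restrictions on part of the entries, reduces to a linear program. The snippet below is a minimal illustrative sketch of that reduction only, not the thesis's verifiable-condition machinery; the sensing matrix A, signal x0, sparsity level, sign-restriction handling, and the use of scipy.optimize.linprog are all assumptions of this example.

  ```python
  # Illustrative sketch (not the thesis algorithm): noiseless l1-recovery
  # with partial sign information, written as a linear program.
  import numpy as np
  from scipy.optimize import linprog

  rng = np.random.default_rng(0)
  m, n, s = 40, 100, 5                            # measurements, ambient dim, sparsity

  A = rng.standard_normal((m, n)) / np.sqrt(m)    # synthetic sensing matrix (assumption)
  x0 = np.zeros(n)
  support = rng.choice(n, size=s, replace=False)
  x0[support] = rng.standard_normal(s)
  y = A @ x0                                      # noiseless observations

  # Pretend a priori knowledge: the sign of a couple of support entries is known.
  known_nonneg = [int(i) for i in support[:2] if x0[i] > 0]

  # Split x = xp - xm with xp, xm >= 0; minimize 1'xp + 1'xm  s.t.  A(xp - xm) = y.
  c = np.ones(2 * n)
  A_eq = np.hstack([A, -A])
  bounds = [(0, None)] * (2 * n)
  for i in known_nonneg:                          # restriction x_i >= 0  =>  xm_i = 0
      bounds[n + i] = (0, 0)

  res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds, method="highs")
  x_hat = res.x[:n] - res.x[n:]
  print("recovery error:", np.linalg.norm(x_hat - x0))
  ```

  Splitting x into nonnegative parts xp and xm is the standard LP reformulation of the ℓ1 objective; a known sign simply pins one of the two parts of that coordinate to zero.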
- W194329854 created "2016-06-24" @default.
- W194329854 creator A5018700487 @default.
- W194329854 creator A5037072174 @default.
- W194329854 date "2011-01-01" @default.
- W194329854 modified "2023-09-26" @default.
- W194329854 title "Tractable relaxations and efficient algorithmic techniques for large-scale optimization" @default.
- W194329854 hasPublicationYear "2011" @default.
- W194329854 type Work @default.
- W194329854 sameAs 194329854 @default.
- W194329854 citedByCount "0" @default.
- W194329854 crossrefType "dissertation" @default.
- W194329854 hasAuthorship W194329854A5018700487 @default.
- W194329854 hasAuthorship W194329854A5037072174 @default.
- W194329854 hasConcept C101901036 @default.
- W194329854 hasConcept C106487976 @default.
- W194329854 hasConcept C111472728 @default.
- W194329854 hasConcept C112680207 @default.
- W194329854 hasConcept C11413529 @default.
- W194329854 hasConcept C121332964 @default.
- W194329854 hasConcept C124851039 @default.
- W194329854 hasConcept C126255220 @default.
- W194329854 hasConcept C137836250 @default.
- W194329854 hasConcept C138885662 @default.
- W194329854 hasConcept C157972887 @default.
- W194329854 hasConcept C159985019 @default.
- W194329854 hasConcept C163716315 @default.
- W194329854 hasConcept C192562407 @default.
- W194329854 hasConcept C2524010 @default.
- W194329854 hasConcept C2778755073 @default.
- W194329854 hasConcept C33923547 @default.
- W194329854 hasConcept C41008148 @default.
- W194329854 hasConcept C56372850 @default.
- W194329854 hasConcept C62520636 @default.
- W194329854 hasConcept C75553542 @default.
- W194329854 hasConcept C80444323 @default.
- W194329854 hasConceptScore W194329854C101901036 @default.
- W194329854 hasConceptScore W194329854C106487976 @default.
- W194329854 hasConceptScore W194329854C111472728 @default.
- W194329854 hasConceptScore W194329854C112680207 @default.
- W194329854 hasConceptScore W194329854C11413529 @default.
- W194329854 hasConceptScore W194329854C121332964 @default.
- W194329854 hasConceptScore W194329854C124851039 @default.
- W194329854 hasConceptScore W194329854C126255220 @default.
- W194329854 hasConceptScore W194329854C137836250 @default.
- W194329854 hasConceptScore W194329854C138885662 @default.
- W194329854 hasConceptScore W194329854C157972887 @default.
- W194329854 hasConceptScore W194329854C159985019 @default.
- W194329854 hasConceptScore W194329854C163716315 @default.
- W194329854 hasConceptScore W194329854C192562407 @default.
- W194329854 hasConceptScore W194329854C2524010 @default.
- W194329854 hasConceptScore W194329854C2778755073 @default.
- W194329854 hasConceptScore W194329854C33923547 @default.
- W194329854 hasConceptScore W194329854C41008148 @default.
- W194329854 hasConceptScore W194329854C56372850 @default.
- W194329854 hasConceptScore W194329854C62520636 @default.
- W194329854 hasConceptScore W194329854C75553542 @default.
- W194329854 hasConceptScore W194329854C80444323 @default.
- W194329854 hasLocation W1943298541 @default.
- W194329854 hasOpenAccess W194329854 @default.
- W194329854 hasPrimaryLocation W1943298541 @default.
- W194329854 hasRelatedWork W1971361074 @default.
- W194329854 hasRelatedWork W2072100688 @default.
- W194329854 hasRelatedWork W2157728552 @default.
- W194329854 hasRelatedWork W2263077610 @default.
- W194329854 hasRelatedWork W2306960032 @default.
- W194329854 hasRelatedWork W2800287137 @default.
- W194329854 hasRelatedWork W2805088401 @default.
- W194329854 hasRelatedWork W2810541051 @default.
- W194329854 hasRelatedWork W2887105207 @default.
- W194329854 hasRelatedWork W2914057340 @default.
- W194329854 hasRelatedWork W2964326866 @default.
- W194329854 hasRelatedWork W2971215329 @default.
- W194329854 hasRelatedWork W2977966311 @default.
- W194329854 hasRelatedWork W3033352228 @default.
- W194329854 hasRelatedWork W3037849261 @default.
- W194329854 hasRelatedWork W3083116065 @default.
- W194329854 hasRelatedWork W3084755519 @default.
- W194329854 hasRelatedWork W3096668680 @default.
- W194329854 hasRelatedWork W3167700581 @default.
- W194329854 hasRelatedWork W3198584745 @default.
- W194329854 isParatext "false" @default.
- W194329854 isRetracted "false" @default.
- W194329854 magId "194329854" @default.
- W194329854 workType "dissertation" @default.