Matches in SemOpenAlex for { <https://semopenalex.org/work/W2339444885> ?p ?o ?g. }
- W2339444885 endingPage "131" @default.
- W2339444885 startingPage "131" @default.
- W2339444885 abstract "Object detection and segmentation algorithms need to use prior knowledge of objects' shape and appearance to guide solutions to correct ones. A promising way of obtaining prior knowledge is to learn it directly from expert annotations by using machine learning techniques. Previous approaches commonly use generative learning approaches to achieve this goal. In this dissertation, I propose a series of discriminative learning algorithms based on boosting principles to learn prior knowledge from image databases with expert annotations. The learned knowledge improves the performance of detection and segmentation, leading to fast and accurate solutions.For object detection, I present a learning procedure called a Probabilistic Boosting Network (PBN) suitable for real-time object detection and pose estimation. Based on the law of total probability, PBN integrates evidence from two building blocks, namely a multiclass classifier for pose estimation and a detection cascade for object detection. Both the classifier and detection cascade employ boosting. By inferring the pose parameter, I avoid the exhaustive scan over pose parameters, which hampers real-time detection. I implement PBN using a graph-structured network that alternates the two tasks of object detection and pose estimation in an effort to reject negative cases as quickly as possible. Compared with previous approaches, PBN has higher accuracy in object localization and pose estimation with noticeable reduced computation.For object segmentation, I cast deformable object segmentation as optimizing the conditional probability density function p(C mI), where I is an image and C is a vector of model parameters describing the object shape. I propose a regression approach to learn the density p( CmI) discriminatively based on boosting principles. The learned density p(CmI) possesses a desired unimodal, smooth shape, which can be used by optimization algorithms to efficiently estimate a solution. To handle the high-dimensional learning challenges, I propose a multi-level approach and a gradient-based sampling strategy to learn regression functions efficiently. I show that the regression approach consistently outperforms state-of-the-art methods on a variety of testing datasets.Finally, I present a comparative study on how to apply three discriminative learning approaches - classification, regression, and ranking - to deformable shape segmentation. I discuss how to extend the idea of the regression approach to build discriminative models using classification and ranking. I propose sampling strategies to collect training examples from a high-dimensional model space for the classification and the ranking approach. I also propose a ranking algorithm based on Rankboost to learn a discriminative model for segmentation. Experimental results on left ventricle and left atrium segmentation from ultrasound images and facial feature localization demonstrate that the discriminative models outperform generative models and energy minimization methods by a large margin." @default.
- W2339444885 created "2016-06-24" @default.
- W2339444885 creator A5042463953 @default.
- W2339444885 creator A5053755600 @default.
- W2339444885 date "2009-01-01" @default.
- W2339444885 modified "2023-09-27" @default.
- W2339444885 title "Object detection and segmentation using discriminative learning" @default.
- W2339444885 cites W1480865305 @default.
- W2339444885 cites W1490760466 @default.
- W2339444885 cites W1532602393 @default.
- W2339444885 cites W1536929369 @default.
- W2339444885 cites W1540878153 @default.
- W2339444885 cites W1554944419 @default.
- W2339444885 cites W1563961677 @default.
- W2339444885 cites W1567885833 @default.
- W2339444885 cites W1569432948 @default.
- W2339444885 cites W1570592793 @default.
- W2339444885 cites W1578080815 @default.
- W2339444885 cites W1591385104 @default.
- W2339444885 cites W1625504505 @default.
- W2339444885 cites W1678356000 @default.
- W2339444885 cites W1680579736 @default.
- W2339444885 cites W1835499858 @default.
- W2339444885 cites W1931284523 @default.
- W2339444885 cites W1966280301 @default.
- W2339444885 cites W1973948212 @default.
- W2339444885 cites W1975846642 @default.
- W2339444885 cites W1991113069 @default.
- W2339444885 cites W1991605728 @default.
- W2339444885 cites W1992825118 @default.
- W2339444885 cites W2001619934 @default.
- W2339444885 cites W2016133707 @default.
- W2339444885 cites W202303397 @default.
- W2339444885 cites W2024046085 @default.
- W2339444885 cites W2038952578 @default.
- W2339444885 cites W2041398617 @default.
- W2339444885 cites W2049633694 @default.
- W2339444885 cites W2056760934 @default.
- W2339444885 cites W2067885219 @default.
- W2339444885 cites W2093717447 @default.
- W2339444885 cites W2098862607 @default.
- W2339444885 cites W2099530880 @default.
- W2339444885 cites W2099741732 @default.
- W2339444885 cites W2104095591 @default.
- W2339444885 cites W2104364170 @default.
- W2339444885 cites W2107890099 @default.
- W2339444885 cites W2114171253 @default.
- W2339444885 cites W2114758632 @default.
- W2339444885 cites W2116843914 @default.
- W2339444885 cites W2118386984 @default.
- W2339444885 cites W2120995095 @default.
- W2339444885 cites W2121601095 @default.
- W2339444885 cites W2121782097 @default.
- W2339444885 cites W2122246689 @default.
- W2339444885 cites W2124351082 @default.
- W2339444885 cites W2124386111 @default.
- W2339444885 cites W2125949583 @default.
- W2339444885 cites W2126208400 @default.
- W2339444885 cites W2128907102 @default.
- W2339444885 cites W2130771648 @default.
- W2339444885 cites W2130799764 @default.
- W2339444885 cites W2133366724 @default.
- W2339444885 cites W2135086073 @default.
- W2339444885 cites W2136767008 @default.
- W2339444885 cites W2138611914 @default.
- W2339444885 cites W2140274257 @default.
- W2339444885 cites W2142471335 @default.
- W2339444885 cites W2143425433 @default.
- W2339444885 cites W2145023731 @default.
- W2339444885 cites W2146514558 @default.
- W2339444885 cites W2147880316 @default.
- W2339444885 cites W2148603752 @default.
- W2339444885 cites W2150652745 @default.
- W2339444885 cites W2152826865 @default.
- W2339444885 cites W2153939756 @default.
- W2339444885 cites W2154834035 @default.
- W2339444885 cites W2154952480 @default.
- W2339444885 cites W2156175024 @default.
- W2339444885 cites W2156312950 @default.
- W2339444885 cites W2156909104 @default.
- W2339444885 cites W2157149292 @default.
- W2339444885 cites W2157176527 @default.
- W2339444885 cites W2158837149 @default.
- W2339444885 cites W2163176424 @default.
- W2339444885 cites W2163363072 @default.
- W2339444885 cites W2163614729 @default.
- W2339444885 cites W2164598857 @default.
- W2339444885 cites W2166713160 @default.
- W2339444885 cites W2167825066 @default.
- W2339444885 cites W2168572310 @default.
- W2339444885 cites W2169146573 @default.
- W2339444885 cites W2171188998 @default.
- W2339444885 cites W2171849160 @default.
- W2339444885 cites W2217896605 @default.
- W2339444885 cites W2254058419 @default.
- W2339444885 cites W2294798173 @default.
- W2339444885 cites W2739698496 @default.
- W2339444885 cites W2994340921 @default.
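Note (illustrative sketch, not part of the SemOpenAlex record): the two formulations named in the abstract above can be written in standard notation. PBN combines a pose classifier and a detection cascade through the law of total probability, and the segmentation approach estimates shape parameters by maximizing the learned conditional density p(C|I). The discretization of the pose parameter theta into a finite set of values is an assumption made here for illustration only.

    P(\mathrm{object} \mid I) \;=\; \sum_{\theta} P(\mathrm{object} \mid \theta, I)\, P(\theta \mid I)
    \hat{C} \;=\; \arg\max_{C}\, p(C \mid I)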