Matches in SemOpenAlex for { <https://semopenalex.org/work/W2106257764> ?p ?o ?g. }
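The pattern above is a quad pattern (subject, predicate, object, graph). The listing below can be reproduced programmatically over the standard SPARQL protocol; the following is a minimal Python sketch, in which the endpoint URL is an assumption and the graph variable is dropped for simplicity:

```python
# Minimal sketch: fetch all (predicate, object) pairs for the work above.
# Assumption: SemOpenAlex serves a SPARQL endpoint at this URL.
import requests

ENDPOINT = "https://semopenalex.org/sparql"
QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W2106257764> ?p ?o .
}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```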
- W2106257764 abstract "The state-of-the-art approach in visual object recognition is the use of local information extracted at several interest points or image patches in an image. Local information at specific points can deal with object shape variability and partial occlusions. The underlying idea is that the statistical distribution of the patches differs between images, which can be effectively exploited for recognition. In such a patch-based object recognition system, the key role of a visual codebook is to map the low-level features into a fixed-length vector in histogram space, to which standard classifiers can be directly applied. The discriminative power of a visual codebook determines the quality of the codebook model, whereas the size of the codebook controls the complexity of the model. Thus, the construction of a codebook plays a central role in determining the model's complexity. Codebook construction is usually done by cluster analysis; however, clustering retains regions of high density in a distribution, so the resulting codebook need not have discriminant properties. Clustering is also recognised as a computational bottleneck of such systems. This thesis demonstrates a novel approach, which we call the resource-allocating codebook (RAC), that constructs a discriminant codebook in a one-pass design procedure inspired by the resource-allocation network family of algorithms. The RAC approach slightly outperforms more traditional approaches owing to its tendency to spread the cluster centres over a broader range of the feature space, thereby including rare low-level features in the codebook that density-preserving clustering-based codebooks tend to miss. Our algorithm achieves this performance at drastically reduced computing times because, apart from an initial scan through a small subset to determine length scales, each data item is processed only once. We illustrate some properties of our method and compare it to a closely related approach, mean-shift clustering. A pruning strategy is employed to handle outliers when assigning each feature in an image to the closest codeword to create a histogram representation for each image: features whose distance from the closest codeword exceeds an empirical distance maximum are neglected. A recognition system that learns incrementally from training images, with the output classifier accounting for class-specific discriminant features, is also presented. Furthermore, we address an approach that, instead of clustering, adaptively constructs a codebook by computing Fisher scores between the classes of interest. This thesis also demonstrates a novel sequential hierarchical clustering technique that initially builds a hierarchical tree from a small subset of the data, while the remaining data are processed sequentially and the tree adapted constructively. Evaluations show that this approach achieves comparable performance while reducing the computational needs. Finally, for the classification stage, we demonstrate a new learning architecture for multi-class classification tasks using support vector machines. This technique is faster in testing than directed acyclic graph (DAG) SVMs, while maintaining performance comparable to the standard multi-class classification techniques." @default.
- W2106257764 created "2016-06-24" @default.
- W2106257764 creator A5074891850 @default.
- W2106257764 date "2010-06-01" @default.
- W2106257764 modified "2023-09-26" @default.
- W2106257764 title "Designing a resource-allocating codebook for patch-based visual object recognition" @default.
- W2106257764 cites W1490760466 @default.
- W2106257764 cites W1507365645 @default.
- W2106257764 cites W1510073064 @default.
- W2106257764 cites W1519789031 @default.
- W2106257764 cites W1545533962 @default.
- W2106257764 cites W1563857457 @default.
- W2106257764 cites W1575243176 @default.
- W2106257764 cites W1590105591 @default.
- W2106257764 cites W1590689398 @default.
- W2106257764 cites W1608462934 @default.
- W2106257764 cites W1618905105 @default.
- W2106257764 cites W1625255723 @default.
- W2106257764 cites W1676552347 @default.
- W2106257764 cites W1677409904 @default.
- W2106257764 cites W1699734612 @default.
- W2106257764 cites W1880262756 @default.
- W2106257764 cites W1965198822 @default.
- W2106257764 cites W1976526581 @default.
- W2106257764 cites W1977556410 @default.
- W2106257764 cites W1984049995 @default.
- W2106257764 cites W1985176850 @default.
- W2106257764 cites W1986560547 @default.
- W2106257764 cites W1992419399 @default.
- W2106257764 cites W1995444699 @default.
- W2106257764 cites W2004931706 @default.
- W2106257764 cites W2009824857 @default.
- W2106257764 cites W2012330712 @default.
- W2106257764 cites W2022463110 @default.
- W2106257764 cites W2022686119 @default.
- W2106257764 cites W2026942141 @default.
- W2106257764 cites W2033012377 @default.
- W2106257764 cites W2042316011 @default.
- W2106257764 cites W2056133372 @default.
- W2106257764 cites W2057828472 @default.
- W2106257764 cites W2066680326 @default.
- W2106257764 cites W2067191022 @default.
- W2106257764 cites W2084812512 @default.
- W2106257764 cites W2092798093 @default.
- W2106257764 cites W2102129392 @default.
- W2106257764 cites W2102544846 @default.
- W2106257764 cites W2103560185 @default.
- W2106257764 cites W2104170135 @default.
- W2106257764 cites W2104207383 @default.
- W2106257764 cites W2104978738 @default.
- W2106257764 cites W2105126798 @default.
- W2106257764 cites W2107034620 @default.
- W2106257764 cites W2109109045 @default.
- W2106257764 cites W2109863423 @default.
- W2106257764 cites W2111308925 @default.
- W2106257764 cites W2112020727 @default.
- W2106257764 cites W2113855951 @default.
- W2106257764 cites W2114865067 @default.
- W2106257764 cites W2117077088 @default.
- W2106257764 cites W2117196912 @default.
- W2106257764 cites W2119747362 @default.
- W2106257764 cites W2119821739 @default.
- W2106257764 cites W2120738995 @default.
- W2106257764 cites W2121627225 @default.
- W2106257764 cites W2122005803 @default.
- W2106257764 cites W2123737232 @default.
- W2106257764 cites W2123921160 @default.
- W2106257764 cites W2124386111 @default.
- W2106257764 cites W2124404372 @default.
- W2106257764 cites W2125055259 @default.
- W2106257764 cites W2125101937 @default.
- W2106257764 cites W2126833203 @default.
- W2106257764 cites W2128017662 @default.
- W2106257764 cites W2128825494 @default.
- W2106257764 cites W2131627887 @default.
- W2106257764 cites W2131846894 @default.
- W2106257764 cites W2133014420 @default.
- W2106257764 cites W2133864802 @default.
- W2106257764 cites W2134380836 @default.
- W2106257764 cites W2135512949 @default.
- W2106257764 cites W2135807661 @default.
- W2106257764 cites W2138963285 @default.
- W2106257764 cites W2139212933 @default.
- W2106257764 cites W2139635760 @default.
- W2106257764 cites W2140047820 @default.
- W2106257764 cites W2141082694 @default.
- W2106257764 cites W2141303268 @default.
- W2106257764 cites W2145023731 @default.
- W2106257764 cites W2145072179 @default.
- W2106257764 cites W2145862222 @default.
- W2106257764 cites W2149684865 @default.
- W2106257764 cites W2150926065 @default.
- W2106257764 cites W2151103935 @default.
- W2106257764 cites W2151259137 @default.
- W2106257764 cites W2151370125 @default.
- W2106257764 cites W2151766597 @default.
- W2106257764 cites W2152923755 @default.
- W2106257764 cites W2153509747 @default.
- W2106257764 cites W2153565331 @default.
- W2106257764 cites W2155231285 @default.
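The abstract in the listing above describes the RAC construction in enough detail to sketch it: an initial scan over a small subset fixes a length scale, then each streamed feature either falls within that scale of an existing codeword or allocates a new one, and at histogram time features beyond an empirical distance maximum are pruned as outliers. The following is a minimal illustrative sketch, not the thesis's exact method; the length-scale heuristic, function names, and parameters are all assumptions.

```python
# Illustrative one-pass resource-allocating codebook (RAC), as summarised
# in the abstract above. All heuristics here are assumptions for clarity.
import numpy as np

def estimate_length_scale(subset):
    # Initial scan over a small subset: use the mean nearest-neighbour
    # distance as the length scale (one plausible heuristic, assumed here).
    d = np.linalg.norm(subset[:, None, :] - subset[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return d.min(axis=1).mean()

def build_rac_codebook(features, subset_size=100):
    features = np.asarray(features, dtype=float)
    r = estimate_length_scale(features[:subset_size])
    codebook = [features[0]]
    # Each remaining data item is processed exactly once: a feature farther
    # than r from every existing codeword allocates a new codeword, which
    # spreads centres over a broader range of the feature space.
    for f in features[1:]:
        dists = np.linalg.norm(np.asarray(codebook) - f, axis=1)
        if dists.min() > r:
            codebook.append(f)
    return np.asarray(codebook), r

def histogram(features, codebook, d_max):
    # Pruning: features whose distance from the closest codeword exceeds
    # the empirical maximum d_max are neglected as outliers.
    h = np.zeros(len(codebook))
    for f in np.asarray(features, dtype=float):
        dists = np.linalg.norm(codebook - f, axis=1)
        k = dists.argmin()
        if dists[k] <= d_max:
            h[k] += 1
    return h
```

On a stream of SIFT-like descriptors, this touches each descriptor only once after the initial scan, which is the source of the drastically reduced computing times the abstract claims relative to density-preserving clustering.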