Matches in SemOpenAlex for { <https://semopenalex.org/work/W2123358413> ?p ?o ?g. }
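The pattern above is a quad pattern (predicate, object, named graph) over the work <https://semopenalex.org/work/W2123358413>. A minimal sketch of how the listing below could be reproduced programmatically, assuming SemOpenAlex exposes a public SPARQL endpoint at https://semopenalex.org/sparql (the endpoint URL is an assumption) and using the Python requests library:

```python
# Minimal sketch: fetch the quads matched above from the SemOpenAlex
# SPARQL endpoint. The endpoint URL is an assumption; adjust if the
# service publishes a different address.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed endpoint URL

QUERY = """
SELECT ?p ?o ?g WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W2123358413> ?p ?o .
  }
}
"""

response = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
response.raise_for_status()

# Print each match as "predicate object graph", mirroring the listing below.
for binding in response.json()["results"]["bindings"]:
    print(
        binding["p"]["value"],
        binding["o"]["value"],
        binding.get("g", {}).get("value", ""),
    )
```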
- W2123358413 abstract "We study the ordinal ranking problem in machine learning. The problem can be viewed as a classification problem with additional ordinal information or as a regression problem without actual numerical information. From the classification perspective, we formalize the concept of ordinal information by a cost-sensitive setup, and propose some novel cost-sensitive classification algorithms. The algorithms are derived from a systematic cost-transformation technique, which carries a strong theoretical guarantee. Experimental results show that the novel algorithms perform well both in a general cost-sensitive setup and in the specific ordinal ranking setup. From the regression perspective, we propose the threshold ensemble model for ordinal ranking, which allows the machines to estimate a real-valued score (like regression) before quantizing it to an ordinal rank. We study the generalization ability of threshold ensembles and derive novel large-margin bounds on their expected test performance. In addition, we improve an existing algorithm and propose a novel algorithm for constructing large-margin threshold ensembles. Our proposed algorithms are efficient in training and achieve decent out-of-sample performance when compared with the state-of-the-art algorithm on benchmark data sets. We then study how ordinal ranking can be reduced to weighted binary classification. The reduction framework is simpler than the cost-sensitive classification approach and includes the threshold ensemble model as a special case. The framework allows us to derive strong theoretical results that tightly connect ordinal ranking with binary classification. We demonstrate the algorithmic and theoretical use of the reduction framework by extending SVM and AdaBoost, two of the most popular binary classification algorithms, to the area of ordinal ranking. Coupling SVM with the reduction framework results in a novel and faster algorithm for ordinal ranking with superior performance on real-world data sets, as well as a new bound on the expected test performance for generalized linear ordinal rankers. Coupling AdaBoost with the reduction framework leads to a novel algorithm that boosts the training accuracy of any cost-sensitive ordinal ranking algorithm theoretically, and in turn improves its test performance empirically. From the studies above, the key to improving ordinal ranking is to improve binary classification. In the final part of the thesis, we include two projects that aim at understanding binary classification better in the context of ensemble learning. First, we discuss how AdaBoost is restricted to combining only a finite number of hypotheses and remove the restriction by formulating a framework of infinite ensemble learning based on SVM. The framework can output an infinite ensemble through embedding infinitely many hypotheses into an SVM kernel. Using the framework, we show that binary classification (and hence ordinal ranking) can be improved by going from a finite ensemble to an infinite one. Second, we discuss how AdaBoost carries the property of being resistant to overfitting. Then, we propose the SeedBoost algorithm, which uses the property as a mechanism to prevent other learning algorithms from overfitting. Empirical results demonstrate that SeedBoost can indeed improve an overfitting algorithm on some data sets." @default.
- W2123358413 created "2016-06-24" @default.
- W2123358413 creator A5008335614 @default.
- W2123358413 creator A5069502162 @default.
- W2123358413 date "2008-01-01" @default.
- W2123358413 modified "2023-09-27" @default.
- W2123358413 title "From ordinal ranking to binary classification" @default.
- W2123358413 cites W1502981577 @default.
- W2123358413 cites W1509722456 @default.
- W2123358413 cites W1512098439 @default.
- W2123358413 cites W1520252399 @default.
- W2123358413 cites W1540007258 @default.
- W2123358413 cites W1549656520 @default.
- W2123358413 cites W1554944419 @default.
- W2123358413 cites W1568733146 @default.
- W2123358413 cites W1570060426 @default.
- W2123358413 cites W1577296257 @default.
- W2123358413 cites W1578080815 @default.
- W2123358413 cites W1604938182 @default.
- W2123358413 cites W1606857374 @default.
- W2123358413 cites W1658064490 @default.
- W2123358413 cites W1734803745 @default.
- W2123358413 cites W1758459992 @default.
- W2123358413 cites W1975846642 @default.
- W2123358413 cites W1980896222 @default.
- W2123358413 cites W1988790447 @default.
- W2123358413 cites W1997855593 @default.
- W2123358413 cites W2019363670 @default.
- W2123358413 cites W2072555316 @default.
- W2123358413 cites W2091233543 @default.
- W2123358413 cites W2098576843 @default.
- W2123358413 cites W2099579348 @default.
- W2123358413 cites W2100659887 @default.
- W2123358413 cites W2103012681 @default.
- W2123358413 cites W2106191340 @default.
- W2123358413 cites W2107890099 @default.
- W2123358413 cites W2109943925 @default.
- W2123358413 cites W2110612061 @default.
- W2123358413 cites W2111049014 @default.
- W2123358413 cites W2112076978 @default.
- W2123358413 cites W2118286367 @default.
- W2123358413 cites W2119073761 @default.
- W2123358413 cites W2120879296 @default.
- W2123358413 cites W2121485093 @default.
- W2123358413 cites W2122168905 @default.
- W2123358413 cites W2124105163 @default.
- W2123358413 cites W2124776405 @default.
- W2123358413 cites W2128186735 @default.
- W2123358413 cites W2129727551 @default.
- W2123358413 cites W2132166479 @default.
- W2123358413 cites W2135125546 @default.
- W2123358413 cites W2142261479 @default.
- W2123358413 cites W2142575165 @default.
- W2123358413 cites W2143386126 @default.
- W2123358413 cites W2145073242 @default.
- W2123358413 cites W2148274426 @default.
- W2123358413 cites W2148603752 @default.
- W2123358413 cites W2149684865 @default.
- W2123358413 cites W214995755 @default.
- W2123358413 cites W2153635508 @default.
- W2123358413 cites W2153694028 @default.
- W2123358413 cites W2155901202 @default.
- W2123358413 cites W2156909104 @default.
- W2123358413 cites W2164939051 @default.
- W2123358413 cites W2172000360 @default.
- W2123358413 cites W2172195373 @default.
- W2123358413 cites W24455790 @default.
- W2123358413 cites W2795929191 @default.
- W2123358413 cites W2912934387 @default.
- W2123358413 cites W2982720039 @default.
- W2123358413 cites W2988119488 @default.
- W2123358413 cites W3023786531 @default.
- W2123358413 cites W95351861 @default.
- W2123358413 cites W2586016702 @default.
- W2123358413 doi "https://doi.org/10.7907/7b0f-e145" @default.
- W2123358413 hasPublicationYear "2008" @default.
- W2123358413 type Work @default.
- W2123358413 sameAs 2123358413 @default.
- W2123358413 citedByCount "7" @default.
- W2123358413 countsByYear W21233584132014 @default.
- W2123358413 countsByYear W21233584132015 @default.
- W2123358413 countsByYear W21233584132016 @default.
- W2123358413 countsByYear W21233584132018 @default.
- W2123358413 countsByYear W21233584132019 @default.
- W2123358413 crossrefType "dissertation" @default.
- W2123358413 hasAuthorship W2123358413A5008335614 @default.
- W2123358413 hasAuthorship W2123358413A5069502162 @default.
- W2123358413 hasConcept C110313322 @default.
- W2123358413 hasConcept C11413529 @default.
- W2123358413 hasConcept C114614502 @default.
- W2123358413 hasConcept C119857082 @default.
- W2123358413 hasConcept C12267149 @default.
- W2123358413 hasConcept C124101348 @default.
- W2123358413 hasConcept C12713177 @default.
- W2123358413 hasConcept C13280743 @default.
- W2123358413 hasConcept C134306372 @default.
- W2123358413 hasConcept C154945302 @default.
- W2123358413 hasConcept C164226766 @default.
- W2123358413 hasConcept C177148314 @default.
- W2123358413 hasConcept C185798385 @default.
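The abstract above describes reducing ordinal ranking to weighted binary classification via threshold questions of the form "is the rank greater than k?". A minimal, hypothetical sketch of that general idea follows; it is not the thesis's exact reduction (the full framework uses per-example weights derived from the cost, which this sketch omits), and the scikit-learn logistic regression is only a stand-in for an arbitrary binary learner:

```python
# Illustrative sketch (not the thesis's exact algorithm): reduce ordinal
# ranking with ranks {1..K} to K-1 binary subproblems "is the rank > k?",
# train one binary classifier per threshold, and recover a rank by
# counting positive answers. Uniform example weights for simplicity.
import numpy as np
from sklearn.linear_model import LogisticRegression  # stand-in binary learner

def fit_threshold_reduction(X, y, K):
    """Train K-1 binary classifiers, one per threshold k = 1..K-1."""
    models = []
    for k in range(1, K):
        z = (y > k).astype(int)  # binary label: does the rank exceed k?
        models.append(LogisticRegression(max_iter=1000).fit(X, z))
    return models

def predict_rank(models, X):
    """Predicted rank = 1 + number of thresholds the example exceeds."""
    votes = np.column_stack([m.predict(X) for m in models])
    return 1 + votes.sum(axis=1)

# Toy usage on synthetic data (hypothetical example).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
scores = X @ np.array([1.0, -0.5, 0.3])
y = np.clip(np.digitize(scores, [-1.0, 0.0, 1.0]) + 1, 1, 4)  # ranks 1..4
models = fit_threshold_reduction(X, y, K=4)
print(predict_rank(models, X[:5]), y[:5])
```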