Matches in SemOpenAlex for { <https://semopenalex.org/work/W2563092004> ?p ?o ?g. }
Showing items 1 to 89 of 89, with 100 items per page.
- W2563092004 endingPage "37" @default.
- W2563092004 startingPage "27" @default.
- W2563092004 abstract "Classification and numeric estimation are the two most common types of data mining. The goal of classification is to predict the discrete type of output values, whereas estimation is aimed at finding the continuous type of output values. Predictive data mining is generally achieved by using only one specific statistical or machine learning technique to construct a prediction model. Related studies have shown that the prediction performance of this kind of single flat model can be improved by the use of hierarchical structures. Hierarchical estimation approaches, usually a combination of multiple estimation models, have been proposed for solving specific domain problems. However, the literature offers no generic hierarchical approach for estimation and no hybrid solution that combines classification and estimation techniques hierarchically. Therefore, we introduce a generic hierarchical architecture, namely hierarchical classification and regression (HCR), suitable for various estimation problems. Simply speaking, the first level of HCR involves pre-processing a given training set by classifying it into k classes, leading to k subsets. Three approaches are used to perform this task in this study: hard classification (HC); fuzzy c-means (FCM); and genetic algorithms (GA). Then, the training data, with their associated class labels, are used to train a support vector machine (SVM) classifier. Next, for the second level of HCR, k regression (or estimation) models are trained on their corresponding subsets for final prediction. 
Experiments on 8 different UCI datasets show that most hierarchical prediction models developed with the HCR architecture significantly outperform three well-known single flat prediction models, i.e., linear regression (LR), multilayer perceptron (MLP) neural networks, and support vector regression (SVR), in terms of mean absolute percentage error (MAPE) and root mean squared error (RMSE) rates. In addition, it is found that classifying the training set into 4 subsets with the GA-based data pre-processing approach yields the best results (i.e., k=4), and that the 4-class SVM+MLP model outperforms three baseline hierarchical regression models." @default.
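The two-level HCR flow described in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: quantile binning of the target stands in for the hard-classification (HC) pre-processing step, a nearest-centroid classifier stands in for the SVM at level 1, and per-class polynomial fits stand in for the level-2 regression models; the data are synthetic and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(200, 1))          # synthetic features
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)  # synthetic target

k = 4  # number of classes/subsets (the abstract reports k=4 as best)

# Level 1a: partition the training set into k classes.
# Here: quantile binning of the target (a stand-in for HC/FCM/GA).
edges = np.quantile(y, np.linspace(0, 1, k + 1)[1:-1])
labels = np.digitize(y, edges)                 # class label per sample

# Level 1b: a classifier mapping features -> class label.
# Here: nearest centroid (the paper trains an SVM for this role).
centroids = np.array([X[labels == c].mean(axis=0) for c in range(k)])

def classify(x):
    """Assign a feature vector to the nearest class centroid."""
    return int(np.argmin(np.linalg.norm(centroids - x, axis=1)))

# Level 2: one regression model per class subset.
# Here: degree-2 polynomial fits (the paper uses LR/MLP/SVR regressors).
models = [np.polyfit(X[labels == c, 0], y[labels == c], 2) for c in range(k)]

def predict(x):
    """Final prediction: classify first, then apply that class's regressor."""
    c = classify(np.atleast_1d(x))
    return float(np.polyval(models[c], x))
```

At prediction time, a query point is first routed to one of the k subsets by the level-1 classifier, and only that subset's regressor produces the estimate — which is the core idea that distinguishes HCR from a single flat model.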
- W2563092004 created "2017-01-06" @default.
- W2563092004 creator A5010914146 @default.
- W2563092004 creator A5045234607 @default.
- W2563092004 creator A5055777557 @default.
- W2563092004 creator A5073328202 @default.
- W2563092004 date "2017-04-01" @default.
- W2563092004 modified "2023-10-02" @default.
- W2563092004 title "Soft estimation by hierarchical classification and regression" @default.
- W2563092004 cites W1181348626 @default.
- W2563092004 cites W136895179 @default.
- W2563092004 cites W1483520980 @default.
- W2563092004 cites W1552681844 @default.
- W2563092004 cites W1964494592 @default.
- W2563092004 cites W1970968805 @default.
- W2563092004 cites W1976688431 @default.
- W2563092004 cites W1977556410 @default.
- W2563092004 cites W1987281309 @default.
- W2563092004 cites W1990140423 @default.
- W2563092004 cites W1992419399 @default.
- W2563092004 cites W1994708707 @default.
- W2563092004 cites W2035864983 @default.
- W2563092004 cites W2037734629 @default.
- W2563092004 cites W2039148220 @default.
- W2563092004 cites W2040647667 @default.
- W2563092004 cites W2052055672 @default.
- W2563092004 cites W2063862666 @default.
- W2563092004 cites W2066793765 @default.
- W2563092004 cites W2080476941 @default.
- W2563092004 cites W2095512713 @default.
- W2563092004 cites W2101641917 @default.
- W2563092004 cites W2102734279 @default.
- W2563092004 cites W2147434872 @default.
- W2563092004 cites W2209090757 @default.
- W2563092004 doi "https://doi.org/10.1016/j.neucom.2016.12.037" @default.
- W2563092004 hasPublicationYear "2017" @default.
- W2563092004 type Work @default.
- W2563092004 sameAs 2563092004 @default.
- W2563092004 citedByCount "8" @default.
- W2563092004 countsByYear W25630920042019 @default.
- W2563092004 countsByYear W25630920042020 @default.
- W2563092004 countsByYear W25630920042022 @default.
- W2563092004 crossrefType "journal-article" @default.
- W2563092004 hasAuthorship W2563092004A5010914146 @default.
- W2563092004 hasAuthorship W2563092004A5045234607 @default.
- W2563092004 hasAuthorship W2563092004A5055777557 @default.
- W2563092004 hasAuthorship W2563092004A5073328202 @default.
- W2563092004 hasConcept C105795698 @default.
- W2563092004 hasConcept C119857082 @default.
- W2563092004 hasConcept C152877465 @default.
- W2563092004 hasConcept C153180895 @default.
- W2563092004 hasConcept C154945302 @default.
- W2563092004 hasConcept C162324750 @default.
- W2563092004 hasConcept C187736073 @default.
- W2563092004 hasConcept C33923547 @default.
- W2563092004 hasConcept C41008148 @default.
- W2563092004 hasConcept C83546350 @default.
- W2563092004 hasConcept C96250715 @default.
- W2563092004 hasConceptScore W2563092004C105795698 @default.
- W2563092004 hasConceptScore W2563092004C119857082 @default.
- W2563092004 hasConceptScore W2563092004C152877465 @default.
- W2563092004 hasConceptScore W2563092004C153180895 @default.
- W2563092004 hasConceptScore W2563092004C154945302 @default.
- W2563092004 hasConceptScore W2563092004C162324750 @default.
- W2563092004 hasConceptScore W2563092004C187736073 @default.
- W2563092004 hasConceptScore W2563092004C33923547 @default.
- W2563092004 hasConceptScore W2563092004C41008148 @default.
- W2563092004 hasConceptScore W2563092004C83546350 @default.
- W2563092004 hasConceptScore W2563092004C96250715 @default.
- W2563092004 hasLocation W25630920041 @default.
- W2563092004 hasOpenAccess W2563092004 @default.
- W2563092004 hasPrimaryLocation W25630920041 @default.
- W2563092004 hasRelatedWork W1980588930 @default.
- W2563092004 hasRelatedWork W2060912888 @default.
- W2563092004 hasRelatedWork W2062105804 @default.
- W2563092004 hasRelatedWork W2080727847 @default.
- W2563092004 hasRelatedWork W2094419952 @default.
- W2563092004 hasRelatedWork W2119696881 @default.
- W2563092004 hasRelatedWork W2374407646 @default.
- W2563092004 hasRelatedWork W4255213289 @default.
- W2563092004 hasRelatedWork W4290879003 @default.
- W2563092004 hasRelatedWork W2738033194 @default.
- W2563092004 hasVolume "234" @default.
- W2563092004 isParatext "false" @default.
- W2563092004 isRetracted "false" @default.
- W2563092004 magId "2563092004" @default.
- W2563092004 workType "article" @default.