Matches in SemOpenAlex for { <https://semopenalex.org/work/W4211169133> ?p ?o ?g. }
- W4211169133 endingPage "200066" @default.
- W4211169133 startingPage "200066" @default.
- W4211169133 abstract "Breast cancer (BC) classification has become a point of concern within the field of biomedical informatics in the health care sector in recent years, because BC is the second-largest cause of cancer-related fatalities among women. The medical field has attracted the attention of researchers in applying machine learning techniques to the detection and monitoring of life-threatening diseases such as BC. Proper detection and monitoring contribute immensely to the survival of BC patients, and both depend largely on the analysis of pathological images. Automatic detection of BC from pathological images, supported by a Computer-Aided Diagnosis (CAD) system, allows doctors to make more reliable decisions. Recently, deep learning algorithms such as Convolutional Neural Networks have proven reliable in detecting BC targets in pathological images. Several research efforts have addressed the binary classification of histopathological images; however, few approaches have been proposed for their multi-classification. The classification accuracy produced by these approaches is insufficient, since they consider only texture-based extracted features and rely on techniques that fail to extract some of the main features from the images. These techniques also suffer from overfitting. In this work, handcrafted feature extraction techniques (Hu moments, Haralick textures, and color histograms) and a Deep Neural Network (DNN) are employed for breast cancer multi-classification using histopathological images from the BreakHis dataset. The features extracted with the handcrafted techniques are used to train a DNN classifier with four dense layers and a Softmax output. Further, data augmentation was employed to address overfitting. 
The results obtained reveal that using handcrafted feature extractors with DNN classifiers yields better performance in breast cancer multi-classification than other approaches in the literature. Moreover, data augmentation plays a key role in further improving classification accuracy. The proposed method achieved accuracy scores of 97.87% for 40x, 97.60% for 100x, 96.10% for 200x, and 96.84% for 400x in magnification-dependent histopathological image classification. The results also showed that the proposed combination of handcrafted feature extraction and a DNN classifier outperforms most related work on multi-classification of breast cancer histopathological images." @default.
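As an illustration of the handcrafted descriptors named in the abstract, the following is a minimal sketch (not the authors' code, and simplified to two of the three descriptors) that computes a grey-level histogram and the first Hu moment invariant h1 = eta20 + eta02 on a toy image patch, in pure Python:

```python
# Hypothetical sketch of two handcrafted features from the abstract:
# a grey-level histogram and the first Hu moment invariant.
# A real pipeline would use e.g. OpenCV/mahotas on full histopathology images.

def histogram(img, bins=8, max_val=256):
    """Count pixel intensities into equal-width bins."""
    hist = [0] * bins
    width = max_val // bins
    for row in img:
        for v in row:
            hist[min(v // width, bins - 1)] += 1
    return hist

def hu_moment_1(img):
    """First Hu invariant h1 = eta20 + eta02 (translation/scale invariant)."""
    # Raw moments m00, m10, m01 give the image "mass" and centroid.
    m00 = m10 = m01 = 0.0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            m00 += v
            m10 += x * v
            m01 += y * v
    xb, yb = m10 / m00, m01 / m00
    # Second-order central moments about the centroid.
    mu20 = mu02 = 0.0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            mu20 += (x - xb) ** 2 * v
            mu02 += (y - yb) ** 2 * v
    # Normalized moments: eta_pq = mu_pq / m00**(1 + (p+q)/2); here p+q = 2.
    return mu20 / m00 ** 2 + mu02 / m00 ** 2

# Toy 4x4 grey-level patch standing in for a histopathology image.
patch = [
    [10,  20,  20, 10],
    [20, 200, 200, 20],
    [20, 200, 200, 20],
    [10,  20,  20, 10],
]
feature_vector = histogram(patch) + [hu_moment_1(patch)]
```

The resulting feature vector (histogram bins plus moment invariants, with Haralick texture features concatenated in the paper's setup) is what would be fed to the dense-layer DNN classifier.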
- W4211169133 created "2022-02-13" @default.
- W4211169133 creator A5017405907 @default.
- W4211169133 creator A5041454217 @default.
- W4211169133 creator A5042211413 @default.
- W4211169133 creator A5085562505 @default.
- W4211169133 creator A5090196764 @default.
- W4211169133 date "2022-05-01" @default.
- W4211169133 modified "2023-10-13" @default.
- W4211169133 title "Improved multi-classification of breast cancer histopathological images using handcrafted features and deep neural network (dense layer)" @default.
- W4211169133 cites W1595449537 @default.
- W4211169133 cites W2044465660 @default.
- W4211169133 cites W2089927030 @default.
- W4211169133 cites W2105215500 @default.
- W4211169133 cites W2117510288 @default.
- W4211169133 cites W2151608510 @default.
- W4211169133 cites W2190746225 @default.
- W4211169133 cites W2560028940 @default.
- W4211169133 cites W2565286512 @default.
- W4211169133 cites W2620578070 @default.
- W4211169133 cites W2788072220 @default.
- W4211169133 cites W2789877281 @default.
- W4211169133 cites W2791915981 @default.
- W4211169133 cites W2801370692 @default.
- W4211169133 cites W2907666198 @default.
- W4211169133 cites W2910232415 @default.
- W4211169133 cites W2919115771 @default.
- W4211169133 cites W2921020016 @default.
- W4211169133 cites W2928842276 @default.
- W4211169133 cites W2952319621 @default.
- W4211169133 cites W2952367870 @default.
- W4211169133 cites W2962934138 @default.
- W4211169133 cites W2969326038 @default.
- W4211169133 cites W2991603289 @default.
- W4211169133 cites W2998449599 @default.
- W4211169133 cites W3003199121 @default.
- W4211169133 cites W3011917823 @default.
- W4211169133 cites W3113098418 @default.
- W4211169133 doi "https://doi.org/10.1016/j.iswa.2022.200066" @default.
- W4211169133 hasPublicationYear "2022" @default.
- W4211169133 type Work @default.
- W4211169133 citedByCount "10" @default.
- W4211169133 countsByYear W42111691332022 @default.
- W4211169133 countsByYear W42111691332023 @default.
- W4211169133 crossrefType "journal-article" @default.
- W4211169133 hasAuthorship W4211169133A5017405907 @default.
- W4211169133 hasAuthorship W4211169133A5041454217 @default.
- W4211169133 hasAuthorship W4211169133A5042211413 @default.
- W4211169133 hasAuthorship W4211169133A5085562505 @default.
- W4211169133 hasAuthorship W4211169133A5090196764 @default.
- W4211169133 hasBestOaLocation W42111691331 @default.
- W4211169133 hasConcept C108583219 @default.
- W4211169133 hasConcept C115961682 @default.
- W4211169133 hasConcept C119857082 @default.
- W4211169133 hasConcept C121608353 @default.
- W4211169133 hasConcept C126322002 @default.
- W4211169133 hasConcept C138885662 @default.
- W4211169133 hasConcept C153180895 @default.
- W4211169133 hasConcept C154945302 @default.
- W4211169133 hasConcept C17426736 @default.
- W4211169133 hasConcept C22019652 @default.
- W4211169133 hasConcept C2776401178 @default.
- W4211169133 hasConcept C41008148 @default.
- W4211169133 hasConcept C41895202 @default.
- W4211169133 hasConcept C50644808 @default.
- W4211169133 hasConcept C52622490 @default.
- W4211169133 hasConcept C530470458 @default.
- W4211169133 hasConcept C53533937 @default.
- W4211169133 hasConcept C71924100 @default.
- W4211169133 hasConcept C75294576 @default.
- W4211169133 hasConcept C81363708 @default.
- W4211169133 hasConcept C87335442 @default.
- W4211169133 hasConceptScore W4211169133C108583219 @default.
- W4211169133 hasConceptScore W4211169133C115961682 @default.
- W4211169133 hasConceptScore W4211169133C119857082 @default.
- W4211169133 hasConceptScore W4211169133C121608353 @default.
- W4211169133 hasConceptScore W4211169133C126322002 @default.
- W4211169133 hasConceptScore W4211169133C138885662 @default.
- W4211169133 hasConceptScore W4211169133C153180895 @default.
- W4211169133 hasConceptScore W4211169133C154945302 @default.
- W4211169133 hasConceptScore W4211169133C17426736 @default.
- W4211169133 hasConceptScore W4211169133C22019652 @default.
- W4211169133 hasConceptScore W4211169133C2776401178 @default.
- W4211169133 hasConceptScore W4211169133C41008148 @default.
- W4211169133 hasConceptScore W4211169133C41895202 @default.
- W4211169133 hasConceptScore W4211169133C50644808 @default.
- W4211169133 hasConceptScore W4211169133C52622490 @default.
- W4211169133 hasConceptScore W4211169133C530470458 @default.
- W4211169133 hasConceptScore W4211169133C53533937 @default.
- W4211169133 hasConceptScore W4211169133C71924100 @default.
- W4211169133 hasConceptScore W4211169133C75294576 @default.
- W4211169133 hasConceptScore W4211169133C81363708 @default.
- W4211169133 hasConceptScore W4211169133C87335442 @default.
- W4211169133 hasLocation W42111691331 @default.
- W4211169133 hasLocation W42111691332 @default.
- W4211169133 hasOpenAccess W4211169133 @default.
- W4211169133 hasPrimaryLocation W42111691331 @default.
- W4211169133 hasRelatedWork W1566030314 @default.