Matches in SemOpenAlex for { <https://semopenalex.org/work/W4224254448> ?p ?o ?g. }
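These matches can be reproduced programmatically via the standard SPARQL-over-HTTP protocol. The following is a minimal Python sketch; the public endpoint URL (https://semopenalex.org/sparql) is an assumption of this note rather than part of the listing, and the graph variable ?g from the pattern above is omitted for brevity.

```python
# Hedged sketch: fetch all (?p, ?o) pairs for the work above from the
# SemOpenAlex SPARQL endpoint. The endpoint URL is assumed, not confirmed here.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint
QUERY = """
SELECT ?p ?o WHERE {
  <https://semopenalex.org/work/W4224254448> ?p ?o .
}
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()
for row in resp.json()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"])
```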
- W4224254448 endingPage "3715" @default.
- W4224254448 startingPage "3715" @default.
- W4224254448 abstract "Diagnosis and surgical resection of brain tumors using Magnetic Resonance (MR) images is a challenging task when trying to minimize post-operative neurological deficits, owing to the non-linear variation of tumor size, shape, and texture. Radiologists, clinical experts, and brain surgeons examine brain MRI scans using the available methods, which are tedious, error-prone, and time-consuming, and still exhibit positional errors of up to 2–3 mm, which is very large at the scale of brain cells. In this context, we propose an automated Ultra-Light Brain Tumor Detection (UL-BTD) system based on a novel Ultra-Light Deep Learning Architecture (UL-DLA) for deep features, integrated with highly distinctive textural features extracted by the Gray Level Co-occurrence Matrix (GLCM). Together they form a Hybrid Feature Space (HFS), which is used for tumor detection with a Support Vector Machine (SVM), yielding high prediction accuracy and minimal false negatives with a network small enough to fit within the average GPU resources of a modern PC. The objective of this study is to classify publicly available multi-class MRI brain tumor datasets in minimal time, so that real-time tumor detection can be carried out without compromising accuracy. Our proposed framework includes a sensitivity analysis of image size and of One-versus-All and One-versus-One coding schemes, together with a stringent assessment of the complexity and reliability of the proposed system, with K-fold cross-validation as part of the evaluation protocol. The best generalization achieved using the SVM has an average detection rate of 99.23% (99.18%, 98.86%, and 99.67%) and an F-measure of 0.99 (0.99, 0.98, and 0.99) for glioma, meningioma, and pituitary tumors, respectively. Our results improve on the state-of-the-art (97.30%) by 2%, indicating that the system is a candidate for translation to real-time surgical applications in modern hospitals. The method needs 11.69 ms to detect a tumor in a test image with an accuracy of 99.23%, compared to the 15 ms achieved by the earlier state-of-the-art, without any dedicated hardware, providing a route to a desktop application for brain surgery." @default. (A hedged sketch of this GLCM + SVM pipeline appears after the listing below.)
- W4224254448 created "2022-04-26" @default.
- W4224254448 creator A5033815044 @default.
- W4224254448 creator A5043871968 @default.
- W4224254448 creator A5060644143 @default.
- W4224254448 creator A5064562623 @default.
- W4224254448 creator A5081215670 @default.
- W4224254448 creator A5089583769 @default.
- W4224254448 creator A5091747108 @default.
- W4224254448 date "2022-04-07" @default.
- W4224254448 modified "2023-10-17" @default.
- W4224254448 title "Intelligent Ultra-Light Deep Learning Model for Multi-Class Brain Tumor Detection" @default.
- W4224254448 cites W2047033241 @default.
- W4224254448 cites W2056753605 @default.
- W4224254448 cites W2105457438 @default.
- W4224254448 cites W2117539524 @default.
- W4224254448 cites W2137664016 @default.
- W4224254448 cites W2182098131 @default.
- W4224254448 cites W2240084039 @default.
- W4224254448 cites W2324716350 @default.
- W4224254448 cites W2366536035 @default.
- W4224254448 cites W2403729827 @default.
- W4224254448 cites W2515693519 @default.
- W4224254448 cites W2521587260 @default.
- W4224254448 cites W2543050677 @default.
- W4224254448 cites W2592929672 @default.
- W4224254448 cites W2594373254 @default.
- W4224254448 cites W2782833026 @default.
- W4224254448 cites W2783084701 @default.
- W4224254448 cites W2788703804 @default.
- W4224254448 cites W2790466491 @default.
- W4224254448 cites W2800880748 @default.
- W4224254448 cites W2884478565 @default.
- W4224254448 cites W2897188827 @default.
- W4224254448 cites W2905017682 @default.
- W4224254448 cites W2910541852 @default.
- W4224254448 cites W2919115771 @default.
- W4224254448 cites W2921483513 @default.
- W4224254448 cites W2921743319 @default.
- W4224254448 cites W2924939183 @default.
- W4224254448 cites W2940773812 @default.
- W4224254448 cites W2945839551 @default.
- W4224254448 cites W2947104907 @default.
- W4224254448 cites W2947735999 @default.
- W4224254448 cites W2972838422 @default.
- W4224254448 cites W2988053426 @default.
- W4224254448 cites W2998401461 @default.
- W4224254448 cites W3011430986 @default.
- W4224254448 cites W3013952416 @default.
- W4224254448 cites W3023645955 @default.
- W4224254448 cites W3038588235 @default.
- W4224254448 cites W3071727253 @default.
- W4224254448 cites W3112600384 @default.
- W4224254448 cites W3113649568 @default.
- W4224254448 cites W3127167602 @default.
- W4224254448 cites W3135185854 @default.
- W4224254448 cites W3153436803 @default.
- W4224254448 cites W3168236164 @default.
- W4224254448 cites W3175537936 @default.
- W4224254448 cites W3185052070 @default.
- W4224254448 cites W3189411510 @default.
- W4224254448 cites W4250308098 @default.
- W4224254448 doi "https://doi.org/10.3390/app12083715" @default.
- W4224254448 hasPublicationYear "2022" @default.
- W4224254448 type Work @default.
- W4224254448 citedByCount "30" @default.
- W4224254448 countsByYear W42242544482022 @default.
- W4224254448 countsByYear W42242544482023 @default.
- W4224254448 crossrefType "journal-article" @default.
- W4224254448 hasAuthorship W4224254448A5033815044 @default.
- W4224254448 hasAuthorship W4224254448A5043871968 @default.
- W4224254448 hasAuthorship W4224254448A5060644143 @default.
- W4224254448 hasAuthorship W4224254448A5064562623 @default.
- W4224254448 hasAuthorship W4224254448A5081215670 @default.
- W4224254448 hasAuthorship W4224254448A5089583769 @default.
- W4224254448 hasAuthorship W4224254448A5091747108 @default.
- W4224254448 hasBestOaLocation W42242544481 @default.
- W4224254448 hasConcept C108583219 @default.
- W4224254448 hasConcept C119857082 @default.
- W4224254448 hasConcept C12267149 @default.
- W4224254448 hasConcept C153180895 @default.
- W4224254448 hasConcept C154945302 @default.
- W4224254448 hasConcept C41008148 @default.
- W4224254448 hasConceptScore W4224254448C108583219 @default.
- W4224254448 hasConceptScore W4224254448C119857082 @default.
- W4224254448 hasConceptScore W4224254448C12267149 @default.
- W4224254448 hasConceptScore W4224254448C153180895 @default.
- W4224254448 hasConceptScore W4224254448C154945302 @default.
- W4224254448 hasConceptScore W4224254448C41008148 @default.
- W4224254448 hasFunder F4320322484 @default.
- W4224254448 hasFunder F4320324047 @default.
- W4224254448 hasFunder F4320324433 @default.
- W4224254448 hasIssue "8" @default.
- W4224254448 hasLocation W42242544481 @default.
- W4224254448 hasOpenAccess W4224254448 @default.
- W4224254448 hasPrimaryLocation W42242544481 @default.
- W4224254448 hasRelatedWork W2731899572 @default.
- W4224254448 hasRelatedWork W2939353110 @default.
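The abstract above describes a concrete pipeline: GLCM textural features are fused with UL-DLA deep features into a Hybrid Feature Space (HFS) and classified by an SVM under One-versus-One or One-versus-All coding, evaluated with K-fold cross-validation. Below is a minimal, hypothetical sketch of that pipeline using scikit-image and scikit-learn; the UL-DLA network is not published in this record, so its deep features are stood in for by a simple pooling placeholder, and the toy images, feature dimensions, and hyperparameters are all assumptions.

```python
# Hedged sketch of the hybrid GLCM + deep-feature + SVM pipeline described in
# the abstract. The real UL-DLA architecture, dataset, and hyperparameters are
# not reproduced here; everything below the imports is illustrative.
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.model_selection import cross_val_score
from sklearn.multiclass import OneVsOneClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

GLCM_PROPS = ("contrast", "dissimilarity", "homogeneity", "energy", "correlation")

def glcm_features(image_u8: np.ndarray) -> np.ndarray:
    """Textural features from a Gray Level Co-occurrence Matrix (GLCM)."""
    glcm = graycomatrix(
        image_u8,
        distances=[1],
        angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
        levels=256,
        symmetric=True,
        normed=True,
    )
    return np.hstack([graycoprops(glcm, p).ravel() for p in GLCM_PROPS])

def deep_features(image_u8: np.ndarray) -> np.ndarray:
    """Placeholder for UL-DLA deep features: coarse 8x8 average pooling.
    A trained CNN embedding would slot in here in the real system."""
    pooled = image_u8.reshape(8, 16, 8, 16).mean(axis=(1, 3))
    return pooled.ravel() / 255.0

# Toy stand-in data: random 'MRI slices' with 3 classes
# (glioma / meningioma / pituitary), purely illustrative.
rng = np.random.default_rng(42)
images = rng.integers(0, 256, size=(60, 128, 128), dtype=np.uint8)
labels = rng.integers(0, 3, size=60)

# Hybrid Feature Space: deep features concatenated with GLCM features.
X = np.stack([np.hstack([deep_features(im), glcm_features(im)]) for im in images])

# One-versus-One SVM (swap in OneVsRestClassifier for One-versus-All),
# scored with K-fold cross-validation as in the paper's evaluation protocol.
clf = OneVsOneClassifier(make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)))
scores = cross_val_score(clf, X, labels, cv=5)
print(f"5-fold accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```

On random toy data the score is meaningless; the point is the shape of the pipeline: per-image feature fusion into one vector, a single multi-class SVM coding scheme, and cross-validated scoring.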