Matches in SemOpenAlex for { <https://semopenalex.org/work/W4302425268> ?p ?o ?g. }
- W4302425268 endingPage "200130" @default.
- W4302425268 startingPage "200130" @default.
- W4302425268 abstract "• We propose a CNN-based deep learning system to diagnose Covid-19 from chest X-ray images. The aim of this paper is to help radiologists and medical experts identify Covid-19 and pneumonia from X-rays, a critical task that can be overlooked since both diseases have almost identical pixel features on X-ray. In addition, RT-PCR testing kits are in short supply, and identifying infection status with them takes time. To overcome this, we designed the system using deep learning, since a computer-aided system makes decisions in less time and thus helps slow the spread of the virus. • To conduct our research, we first collected X-ray images from different publicly available datasets. To make our system robust, we performed significant pre-processing on the data, including resizing, normalization, and augmentation. From the gathered data, we built two datasets: Dataset-1 contains normal, pneumonia, and Covid-19 cases, and Dataset-2 contains Covid-19 and pneumonia cases. Finally, we ran our model on both datasets. • We measured the outcomes using several statistical evaluation criteria (accuracy, specificity, precision, sensitivity, and F1-score) and found that our proposed system performed well for three-class classification, achieving 98.5% accuracy. In addition, the stated system is also very effective at differentiating Covid-19 patients from pneumonia patients, with an accuracy of 99.6%. Finally, we applied the Gradient Class Activation Map (Grad-CAM) technique to visualize the regions the deep learning system took into account for classification. We hope that our proposed system can help radiologists make clear and consistent decisions on Covid-19 cases. In recent years, coronavirus (Covid-19) has evolved into one of the world's leading life-threatening severe viral illnesses. 
An automated diagnosis system might be a better option to stop Covid-19 from spreading, owing to its quick diagnostic turnaround. Many studies have already investigated various deep learning techniques, which have a significant impact on the quick and precise early detection of Covid-19. Most of the existing techniques, though, have not been trained and tested on a significant amount of data. In this paper, we propose a deep-learning-enabled Convolutional Neural Network (CNN) to automatically diagnose Covid-19 from chest X-rays. To train and test our model, 10,293 X-rays, including 2875 X-rays of Covid-19, were collected as a dataset. The applied dataset consists of three groups of chest X-rays: Covid-19, pneumonia, and normal patients. The proposed approach achieved 98.5% accuracy, 98.9% specificity, 99.2% sensitivity, 99.2% precision, and a 98.3% F1-score. Distinguishing Covid-19 patients from pneumonia patients using chest X-rays is difficult, particularly for the human eye, since both diseases have nearly identical characteristics. To address this issue, we categorized Covid-19 and pneumonia using X-rays, achieving a 99.60% accuracy rate. Our findings show that the proposed model might aid clinicians and researchers in rapidly detecting Covid-19 patients, hence facilitating the treatment of Covid-19 patients." @default.
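The abstract above reports accuracy, specificity, precision, sensitivity, and F1-score. For the binary Covid-19 vs. pneumonia case, all five can be derived from the four confusion-matrix counts. A minimal sketch follows; the function name `binary_metrics` and the toy label arrays are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def binary_metrics(y_true, y_pred):
    """Five standard metrics from binary labels (positive class = 1)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    # Confusion-matrix counts
    tp = int(np.sum((y_true == 1) & (y_pred == 1)))  # true positives
    tn = int(np.sum((y_true == 0) & (y_pred == 0)))  # true negatives
    fp = int(np.sum((y_true == 0) & (y_pred == 1)))  # false positives
    fn = int(np.sum((y_true == 1) & (y_pred == 0)))  # false negatives
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    specificity = tn / (tn + fp)
    precision = tp / (tp + fp)
    sensitivity = tp / (tp + fn)  # also called recall
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return {"accuracy": accuracy, "specificity": specificity,
            "precision": precision, "sensitivity": sensitivity, "f1": f1}

# Toy example (hypothetical labels, not the paper's data):
m = binary_metrics([1, 1, 1, 0, 0, 0, 1, 0], [1, 1, 0, 0, 0, 1, 1, 0])
```

For the three-class setting (normal / pneumonia / Covid-19), each metric would typically be computed per class in a one-vs-rest fashion and then averaged; the abstract does not specify which averaging the authors used.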
- W4302425268 created "2022-10-06" @default.
- W4302425268 creator A5027893613 @default.
- W4302425268 creator A5034698802 @default.
- W4302425268 creator A5056925301 @default.
- W4302425268 creator A5065107822 @default.
- W4302425268 date "2022-11-01" @default.
- W4302425268 modified "2023-09-30" @default.
- W4302425268 title "Deep viewing for the identification of Covid-19 infection status from chest X-Ray image using CNN based architecture" @default.
- W4302425268 cites W2088059023 @default.
- W4302425268 cites W2802159733 @default.
- W4302425268 cites W2939788146 @default.
- W4302425268 cites W3001195213 @default.
- W4302425268 cites W3005031775 @default.
- W4302425268 cites W3006139879 @default.
- W4302425268 cites W3006645647 @default.
- W4302425268 cites W3009332494 @default.
- W4302425268 cites W3016488464 @default.
- W4302425268 cites W3017855299 @default.
- W4302425268 cites W3019531985 @default.
- W4302425268 cites W3028427008 @default.
- W4302425268 cites W3030621456 @default.
- W4302425268 cites W3033616466 @default.
- W4302425268 cites W3033847194 @default.
- W4302425268 cites W3088680016 @default.
- W4302425268 cites W3089168916 @default.
- W4302425268 cites W3096918659 @default.
- W4302425268 cites W3099905444 @default.
- W4302425268 cites W3101606529 @default.
- W4302425268 cites W3103635657 @default.
- W4302425268 cites W3104004606 @default.
- W4302425268 cites W3107979957 @default.
- W4302425268 cites W3135057764 @default.
- W4302425268 cites W3140022118 @default.
- W4302425268 cites W3160415184 @default.
- W4302425268 cites W3162351260 @default.
- W4302425268 cites W3170020805 @default.
- W4302425268 cites W3182305291 @default.
- W4302425268 cites W3200268767 @default.
- W4302425268 cites W3202799525 @default.
- W4302425268 cites W3212240055 @default.
- W4302425268 cites W4210347885 @default.
- W4302425268 cites W4285248638 @default.
- W4302425268 doi "https://doi.org/10.1016/j.iswa.2022.200130" @default.
- W4302425268 hasPublicationYear "2022" @default.
- W4302425268 type Work @default.
- W4302425268 citedByCount "6" @default.
- W4302425268 countsByYear W43024252682022 @default.
- W4302425268 countsByYear W43024252682023 @default.
- W4302425268 crossrefType "journal-article" @default.
- W4302425268 hasAuthorship W4302425268A5027893613 @default.
- W4302425268 hasAuthorship W4302425268A5034698802 @default.
- W4302425268 hasAuthorship W4302425268A5056925301 @default.
- W4302425268 hasAuthorship W4302425268A5065107822 @default.
- W4302425268 hasBestOaLocation W43024252681 @default.
- W4302425268 hasConcept C116834253 @default.
- W4302425268 hasConcept C123657996 @default.
- W4302425268 hasConcept C142362112 @default.
- W4302425268 hasConcept C142724271 @default.
- W4302425268 hasConcept C153349607 @default.
- W4302425268 hasConcept C154945302 @default.
- W4302425268 hasConcept C18903297 @default.
- W4302425268 hasConcept C2779134260 @default.
- W4302425268 hasConcept C3007834351 @default.
- W4302425268 hasConcept C3008058167 @default.
- W4302425268 hasConcept C31972630 @default.
- W4302425268 hasConcept C41008148 @default.
- W4302425268 hasConcept C524204448 @default.
- W4302425268 hasConcept C71924100 @default.
- W4302425268 hasConcept C86803240 @default.
- W4302425268 hasConceptScore W4302425268C116834253 @default.
- W4302425268 hasConceptScore W4302425268C123657996 @default.
- W4302425268 hasConceptScore W4302425268C142362112 @default.
- W4302425268 hasConceptScore W4302425268C142724271 @default.
- W4302425268 hasConceptScore W4302425268C153349607 @default.
- W4302425268 hasConceptScore W4302425268C154945302 @default.
- W4302425268 hasConceptScore W4302425268C18903297 @default.
- W4302425268 hasConceptScore W4302425268C2779134260 @default.
- W4302425268 hasConceptScore W4302425268C3007834351 @default.
- W4302425268 hasConceptScore W4302425268C3008058167 @default.
- W4302425268 hasConceptScore W4302425268C31972630 @default.
- W4302425268 hasConceptScore W4302425268C41008148 @default.
- W4302425268 hasConceptScore W4302425268C524204448 @default.
- W4302425268 hasConceptScore W4302425268C71924100 @default.
- W4302425268 hasConceptScore W4302425268C86803240 @default.
- W4302425268 hasLocation W43024252681 @default.
- W4302425268 hasLocation W43024252682 @default.
- W4302425268 hasOpenAccess W4302425268 @default.
- W4302425268 hasPrimaryLocation W43024252681 @default.
- W4302425268 hasRelatedWork W3113664224 @default.
- W4302425268 hasRelatedWork W3176864053 @default.
- W4302425268 hasRelatedWork W3198183218 @default.
- W4302425268 hasRelatedWork W4205317059 @default.
- W4302425268 hasRelatedWork W4205810683 @default.
- W4302425268 hasRelatedWork W4206548596 @default.
- W4302425268 hasRelatedWork W4206651655 @default.
- W4302425268 hasRelatedWork W4206669628 @default.
- W4302425268 hasRelatedWork W4224279380 @default.