Matches in SemOpenAlex for { <https://semopenalex.org/work/W4387246590> ?p ?o ?g. }
Showing items 1 to 87 of 87, with 100 items per page.
- W4387246590 endingPage "104229" @default.
- W4387246590 startingPage "104229" @default.
- W4387246590 abstract "Convolutional neural networks (CNNs) have been widely used in medical imaging applications, including the classification of brain diseases such as Alzheimer's disease (AD) from neuroimaging data. Owing to their architectural inductive bias, CNNs allow researchers to extract potential brain regions related to AD from various imaging modalities. The major limitation of current CNN-based models is that they do not capture long-range relationships or long-distance correlations within image features. Vision transformers (ViT) have demonstrated outstanding performance in encoding long-range relationships, with strong modeling capacity and global feature extraction enabled by the self-attention mechanism. However, ViT does not model spatial information or local features within the image and is hard to train. Researchers have demonstrated that combining a CNN with a transformer yields outstanding results. In this study, two new methods are proposed for Alzheimer's disease diagnosis. The first combines the Swin transformer with an EfficientNet enhanced with multi-head attention and a Depthwise Over-Parameterized Convolutional layer (DO-Conv). The second modifies the CoAtNet network with ECA-Net and fused inverted residual blocks. We evaluated the effectiveness of the proposed methods on the Open Access Series of Imaging Studies (OASIS) and the Alzheimer's Disease Neuroimaging Initiative (ADNI) datasets, and further assessed them using Gradient-weighted Class Activation Mapping (Grad-CAM). The first method achieved 93.23% classification accuracy on the OASIS dataset; the second achieved 97.33%. We also applied different multimodal image fusion methods (MRI and PET, MRI and CT) with our proposed method. The experimental results demonstrate that fusion of PET and MRI outperforms fusion of MRI and CT, achieving 99.42% accuracy. Our methods outperform several traditional CNN models and recent transformer-based methods for AD classification." @default.
- W4387246590 created "2023-10-03" @default.
- W4387246590 creator A5017277545 @default.
- W4387246590 creator A5040229768 @default.
- W4387246590 creator A5040720506 @default.
- W4387246590 creator A5059356428 @default.
- W4387246590 date "2023-11-01" @default.
- W4387246590 modified "2023-10-17" @default.
- W4387246590 title "Efficient Multimodel method based on transformers and CoAtNet for Alzheimer's diagnosis" @default.
- W4387246590 cites W3036708415 @default.
- W4387246590 cites W3131390313 @default.
- W4387246590 cites W3158631207 @default.
- W4387246590 cites W3170129735 @default.
- W4387246590 cites W3196105426 @default.
- W4387246590 cites W4200221062 @default.
- W4387246590 cites W4211260092 @default.
- W4387246590 cites W4220739453 @default.
- W4387246590 cites W4220867605 @default.
- W4387246590 cites W4221022447 @default.
- W4387246590 cites W4225898646 @default.
- W4387246590 cites W4281767565 @default.
- W4387246590 cites W4283029781 @default.
- W4387246590 cites W4283163889 @default.
- W4387246590 cites W4291301149 @default.
- W4387246590 cites W4311769512 @default.
- W4387246590 cites W4323923911 @default.
- W4387246590 cites W4361018547 @default.
- W4387246590 cites W4377234261 @default.
- W4387246590 cites W4381435789 @default.
- W4387246590 doi "https://doi.org/10.1016/j.dsp.2023.104229" @default.
- W4387246590 hasPublicationYear "2023" @default.
- W4387246590 type Work @default.
- W4387246590 citedByCount "0" @default.
- W4387246590 crossrefType "journal-article" @default.
- W4387246590 hasAuthorship W4387246590A5017277545 @default.
- W4387246590 hasAuthorship W4387246590A5040229768 @default.
- W4387246590 hasAuthorship W4387246590A5040720506 @default.
- W4387246590 hasAuthorship W4387246590A5059356428 @default.
- W4387246590 hasConcept C108583219 @default.
- W4387246590 hasConcept C11413529 @default.
- W4387246590 hasConcept C119857082 @default.
- W4387246590 hasConcept C153180895 @default.
- W4387246590 hasConcept C154945302 @default.
- W4387246590 hasConcept C165464430 @default.
- W4387246590 hasConcept C169760540 @default.
- W4387246590 hasConcept C169900460 @default.
- W4387246590 hasConcept C2778373026 @default.
- W4387246590 hasConcept C2984915365 @default.
- W4387246590 hasConcept C41008148 @default.
- W4387246590 hasConcept C52622490 @default.
- W4387246590 hasConcept C58693492 @default.
- W4387246590 hasConcept C81363708 @default.
- W4387246590 hasConcept C86803240 @default.
- W4387246590 hasConceptScore W4387246590C108583219 @default.
- W4387246590 hasConceptScore W4387246590C11413529 @default.
- W4387246590 hasConceptScore W4387246590C119857082 @default.
- W4387246590 hasConceptScore W4387246590C153180895 @default.
- W4387246590 hasConceptScore W4387246590C154945302 @default.
- W4387246590 hasConceptScore W4387246590C165464430 @default.
- W4387246590 hasConceptScore W4387246590C169760540 @default.
- W4387246590 hasConceptScore W4387246590C169900460 @default.
- W4387246590 hasConceptScore W4387246590C2778373026 @default.
- W4387246590 hasConceptScore W4387246590C2984915365 @default.
- W4387246590 hasConceptScore W4387246590C41008148 @default.
- W4387246590 hasConceptScore W4387246590C52622490 @default.
- W4387246590 hasConceptScore W4387246590C58693492 @default.
- W4387246590 hasConceptScore W4387246590C81363708 @default.
- W4387246590 hasConceptScore W4387246590C86803240 @default.
- W4387246590 hasLocation W43872465901 @default.
- W4387246590 hasOpenAccess W4387246590 @default.
- W4387246590 hasPrimaryLocation W43872465901 @default.
- W4387246590 hasRelatedWork W1943633242 @default.
- W4387246590 hasRelatedWork W2013450123 @default.
- W4387246590 hasRelatedWork W2016777699 @default.
- W4387246590 hasRelatedWork W2031131415 @default.
- W4387246590 hasRelatedWork W2155513557 @default.
- W4387246590 hasRelatedWork W2401460348 @default.
- W4387246590 hasRelatedWork W2891300059 @default.
- W4387246590 hasRelatedWork W2972825155 @default.
- W4387246590 hasRelatedWork W2980858150 @default.
- W4387246590 hasRelatedWork W3112573618 @default.
- W4387246590 hasVolume "143" @default.
- W4387246590 isParatext "false" @default.
- W4387246590 isRetracted "false" @default.
- W4387246590 workType "article" @default.