Matches in SemOpenAlex for { <https://semopenalex.org/work/W4387531603> ?p ?o ?g. }
Showing items 1 to 82 of 82, with 100 items per page.
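The listing below corresponds to the triple pattern shown above. A minimal sketch of running the equivalent query yourself, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql and the Python SPARQLWrapper package (both assumptions, not part of this listing; the graph variable ?g is dropped for simplicity):

```python
# Sketch: fetch the properties of work W4387531603 from SemOpenAlex.
# Endpoint URL and result limit are assumptions; adjust as needed.
from SPARQLWrapper import SPARQLWrapper, JSON

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint

query = """
SELECT ?p ?o
WHERE {
  <https://semopenalex.org/work/W4387531603> ?p ?o .
}
LIMIT 100
"""

client = SPARQLWrapper(ENDPOINT)
client.setQuery(query)
client.setReturnFormat(JSON)

results = client.query().convert()
for row in results["results"]["bindings"]:
    # Each row corresponds to one line of the listing below.
    print(row["p"]["value"], row["o"]["value"])
```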
- W4387531603 endingPage "205" @default.
- W4387531603 startingPage "196" @default.
- W4387531603 abstract "Functional Magnetic Resonance Imaging (fMRI) is a noninvasive neuroimaging technique widely used for research purposes. Application of fMRI for medical purposes is still very limited, despite its considerable potential for offering valuable prognostic and differential diagnostic information. One of the problems limiting the use of fMRI in medical settings is that fMRI data is represented as a four-dimensional array of information, so diagnosis relies entirely on the data processing methods employed, since visual analysis of the raw data is impossible. Thus further development of the use of fMRI in clinical practice directly depends on the effectiveness and reliability of the data processing methods used. Resting-state scanning is the main acquisition mode in clinical neuroimaging. Resting-state fMRI (RS-fMRI) data can be collected under three conditions: eyes closed (EC), eyes open (EO), and eyes fixated on a target (EO-F), each presenting distinct neuronal activity patterns. It is widely acknowledged that significant differences exist between these three states, making the classification of eyes-open/closed states a robust basis for verifying models that can be used for diagnostic purposes. We have studied the performance of graph neural networks (GNNs) in identifying dissimilarities between the eyes-closed and eyes-fixated conditions. Additionally, we employ interpretation algorithms to gain insights into the crucial edges influencing the GNN model’s classification. Our proposed GNN model achieves an accuracy of up to 81% in distinguishing between these conditions, with notable brain regions, including visual networks, the default mode network, and the frontoparietal cognitive control network, playing a vital role in accurate classification, consistent with findings from existing literature. Our research highlights the potential of GNNs as a promising approach for exploring functional connectivity differences in RS-fMRI data." @default.
- W4387531603 created "2023-10-12" @default.
- W4387531603 creator A5012949975 @default.
- W4387531603 creator A5023992495 @default.
- W4387531603 creator A5046298262 @default.
- W4387531603 creator A5048101105 @default.
- W4387531603 creator A5063940980 @default.
- W4387531603 date "2023-01-01" @default.
- W4387531603 modified "2023-10-12" @default.
- W4387531603 title "Graph Neural Networks for Analysis of rs-fMRI Differences in Open vs Closed Conditions" @default.
- W4387531603 cites W1973776237 @default.
- W4387531603 cites W2005238835 @default.
- W4387531603 cites W2038307665 @default.
- W4387531603 cites W2045506786 @default.
- W4387531603 cites W2068375947 @default.
- W4387531603 cites W2079538322 @default.
- W4387531603 cites W2082714165 @default.
- W4387531603 cites W2130010412 @default.
- W4387531603 cites W2295107390 @default.
- W4387531603 cites W2499800833 @default.
- W4387531603 cites W2805985513 @default.
- W4387531603 cites W2911677091 @default.
- W4387531603 cites W2951582858 @default.
- W4387531603 cites W3039011740 @default.
- W4387531603 cites W3048074838 @default.
- W4387531603 cites W3199008037 @default.
- W4387531603 cites W3211585221 @default.
- W4387531603 cites W4211091012 @default.
- W4387531603 cites W4220685893 @default.
- W4387531603 cites W4285707174 @default.
- W4387531603 doi "https://doi.org/10.1007/978-3-031-44865-2_22" @default.
- W4387531603 hasPublicationYear "2023" @default.
- W4387531603 type Work @default.
- W4387531603 citedByCount "0" @default.
- W4387531603 crossrefType "book-chapter" @default.
- W4387531603 hasAuthorship W4387531603A5012949975 @default.
- W4387531603 hasAuthorship W4387531603A5023992495 @default.
- W4387531603 hasAuthorship W4387531603A5046298262 @default.
- W4387531603 hasAuthorship W4387531603A5048101105 @default.
- W4387531603 hasAuthorship W4387531603A5063940980 @default.
- W4387531603 hasConcept C119857082 @default.
- W4387531603 hasConcept C132525143 @default.
- W4387531603 hasConcept C153180895 @default.
- W4387531603 hasConcept C154945302 @default.
- W4387531603 hasConcept C15744967 @default.
- W4387531603 hasConcept C169760540 @default.
- W4387531603 hasConcept C169900460 @default.
- W4387531603 hasConcept C2779226451 @default.
- W4387531603 hasConcept C41008148 @default.
- W4387531603 hasConcept C58693492 @default.
- W4387531603 hasConcept C66324658 @default.
- W4387531603 hasConcept C80444323 @default.
- W4387531603 hasConceptScore W4387531603C119857082 @default.
- W4387531603 hasConceptScore W4387531603C132525143 @default.
- W4387531603 hasConceptScore W4387531603C153180895 @default.
- W4387531603 hasConceptScore W4387531603C154945302 @default.
- W4387531603 hasConceptScore W4387531603C15744967 @default.
- W4387531603 hasConceptScore W4387531603C169760540 @default.
- W4387531603 hasConceptScore W4387531603C169900460 @default.
- W4387531603 hasConceptScore W4387531603C2779226451 @default.
- W4387531603 hasConceptScore W4387531603C41008148 @default.
- W4387531603 hasConceptScore W4387531603C58693492 @default.
- W4387531603 hasConceptScore W4387531603C66324658 @default.
- W4387531603 hasConceptScore W4387531603C80444323 @default.
- W4387531603 hasLocation W43875316031 @default.
- W4387531603 hasOpenAccess W4387531603 @default.
- W4387531603 hasPrimaryLocation W43875316031 @default.
- W4387531603 hasRelatedWork W1964455563 @default.
- W4387531603 hasRelatedWork W2168298321 @default.
- W4387531603 hasRelatedWork W2294986132 @default.
- W4387531603 hasRelatedWork W2319066238 @default.
- W4387531603 hasRelatedWork W2365936003 @default.
- W4387531603 hasRelatedWork W2371524820 @default.
- W4387531603 hasRelatedWork W2387620927 @default.
- W4387531603 hasRelatedWork W2601707947 @default.
- W4387531603 hasRelatedWork W2790620361 @default.
- W4387531603 hasRelatedWork W4386286677 @default.
- W4387531603 isParatext "false" @default.
- W4387531603 isRetracted "false" @default.
- W4387531603 workType "book-chapter" @default.
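The abstract in this record describes classifying eyes-closed vs. eyes-fixated RS-fMRI conditions with a graph neural network over functional-connectivity graphs. The sketch below is not the authors' model; it is a hypothetical illustration of that kind of pipeline, assuming PyTorch Geometric, a per-subject correlation matrix thresholded into a graph, and binary labels (EC = 0, EO-F = 1). All names and parameters are illustrative.

```python
# Hypothetical sketch of a GNN classifier for RS-fMRI connectivity graphs
# (not the model from the paper). Assumes PyTorch Geometric is installed.
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.loader import DataLoader
from torch_geometric.nn import GCNConv, global_mean_pool


def connectivity_to_graph(corr, label, threshold=0.3):
    """Turn an (N_regions x N_regions) correlation matrix into a graph.

    Nodes are brain regions; node features are each region's full
    connectivity profile; edges keep correlations above `threshold`.
    """
    src, dst = (corr.abs() > threshold).nonzero(as_tuple=True)
    edge_index = torch.stack([src, dst], dim=0)
    return Data(x=corr, edge_index=edge_index,
                y=torch.tensor([label], dtype=torch.long))


class ConnectivityGNN(torch.nn.Module):
    def __init__(self, n_regions, hidden=64, n_classes=2):
        super().__init__()
        self.conv1 = GCNConv(n_regions, hidden)
        self.conv2 = GCNConv(hidden, hidden)
        self.head = torch.nn.Linear(hidden, n_classes)

    def forward(self, data):
        x = F.relu(self.conv1(data.x, data.edge_index))
        x = F.relu(self.conv2(x, data.edge_index))
        x = global_mean_pool(x, data.batch)   # one vector per subject graph
        return self.head(x)


if __name__ == "__main__":
    # Toy random matrices stand in for real parcellated RS-fMRI connectivity.
    n_regions = 100
    graphs = [connectivity_to_graph(
                  torch.randn(n_regions, n_regions).clamp(-1, 1), label=i % 2)
              for i in range(20)]
    loader = DataLoader(graphs, batch_size=4, shuffle=True)

    model = ConnectivityGNN(n_regions)
    optim = torch.optim.Adam(model.parameters(), lr=1e-3)
    for epoch in range(5):
        for batch in loader:
            optim.zero_grad()
            loss = F.cross_entropy(model(batch), batch.y)
            loss.backward()
            optim.step()
```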