Matches in SemOpenAlex for { <https://semopenalex.org/work/W4286231465> ?p ?o ?g. }
Showing items 1 to 65 of 65, with 100 items per page.
- W4286231465 abstract "Graph Neural Networks (GNNs) tend to suffer from high computation costs due to the exponentially increasing scale of graph data and the number of model parameters, which restricts their utility in practical applications. To this end, some recent works focus on sparsifying GNNs with the lottery ticket hypothesis (LTH) to reduce inference costs while maintaining performance levels. However, the LTH-based methods suffer from two major drawbacks: 1) they require exhaustive and iterative training of dense models, resulting in an extremely large training computation cost, and 2) they only trim graph structures and model parameters but ignore the node feature dimension, where significant redundancy exists. To overcome the above limitations, we propose a comprehensive graph gradual pruning framework termed CGP. This is achieved by designing a during-training graph pruning paradigm to dynamically prune GNNs within one training process. Unlike LTH-based methods, the proposed CGP approach requires no re-training, which significantly reduces the computation costs. Furthermore, we design a co-sparsifying strategy to comprehensively trim all three core elements of GNNs: graph structures, node features, and model parameters. Meanwhile, aiming at refining the pruning operation, we introduce a regrowth process into our CGP framework, in order to re-establish the pruned but important connections. The proposed CGP is evaluated by using a node classification task across 6 GNN architectures, including shallow models (GCN and GAT), shallow-but-deep-propagation models (SGC and APPNP), and deep models (GCNII and ResGCN), on a total of 14 real-world graph datasets, including large-scale graph datasets from the challenging Open Graph Benchmark. Experiments reveal that our proposed strategy greatly improves both training and inference efficiency while matching or even exceeding the accuracy of existing methods." @default.
- W4286231465 created "2022-07-21" @default.
- W4286231465 creator A5005792080 @default.
- W4286231465 creator A5046992958 @default.
- W4286231465 creator A5067600725 @default.
- W4286231465 creator A5069789783 @default.
- W4286231465 creator A5071287470 @default.
- W4286231465 creator A5073642517 @default.
- W4286231465 creator A5074672983 @default.
- W4286231465 creator A5086939495 @default.
- W4286231465 date "2022-07-18" @default.
- W4286231465 modified "2023-09-23" @default.
- W4286231465 title "Comprehensive Graph Gradual Pruning for Sparse Training in Graph Neural Networks" @default.
- W4286231465 doi "https://doi.org/10.48550/arxiv.2207.08629" @default.
- W4286231465 hasPublicationYear "2022" @default.
- W4286231465 type Work @default.
- W4286231465 citedByCount "0" @default.
- W4286231465 crossrefType "posted-content" @default.
- W4286231465 hasAuthorship W4286231465A5005792080 @default.
- W4286231465 hasAuthorship W4286231465A5046992958 @default.
- W4286231465 hasAuthorship W4286231465A5067600725 @default.
- W4286231465 hasAuthorship W4286231465A5069789783 @default.
- W4286231465 hasAuthorship W4286231465A5071287470 @default.
- W4286231465 hasAuthorship W4286231465A5073642517 @default.
- W4286231465 hasAuthorship W4286231465A5074672983 @default.
- W4286231465 hasAuthorship W4286231465A5086939495 @default.
- W4286231465 hasBestOaLocation W42862314651 @default.
- W4286231465 hasConcept C108010975 @default.
- W4286231465 hasConcept C11413529 @default.
- W4286231465 hasConcept C119857082 @default.
- W4286231465 hasConcept C132525143 @default.
- W4286231465 hasConcept C154945302 @default.
- W4286231465 hasConcept C2776214188 @default.
- W4286231465 hasConcept C41008148 @default.
- W4286231465 hasConcept C45374587 @default.
- W4286231465 hasConcept C6557445 @default.
- W4286231465 hasConcept C80444323 @default.
- W4286231465 hasConcept C86803240 @default.
- W4286231465 hasConceptScore W4286231465C108010975 @default.
- W4286231465 hasConceptScore W4286231465C11413529 @default.
- W4286231465 hasConceptScore W4286231465C119857082 @default.
- W4286231465 hasConceptScore W4286231465C132525143 @default.
- W4286231465 hasConceptScore W4286231465C154945302 @default.
- W4286231465 hasConceptScore W4286231465C2776214188 @default.
- W4286231465 hasConceptScore W4286231465C41008148 @default.
- W4286231465 hasConceptScore W4286231465C45374587 @default.
- W4286231465 hasConceptScore W4286231465C6557445 @default.
- W4286231465 hasConceptScore W4286231465C80444323 @default.
- W4286231465 hasConceptScore W4286231465C86803240 @default.
- W4286231465 hasLocation W42862314651 @default.
- W4286231465 hasOpenAccess W4286231465 @default.
- W4286231465 hasPrimaryLocation W42862314651 @default.
- W4286231465 hasRelatedWork W11849241 @default.
- W4286231465 hasRelatedWork W12563130 @default.
- W4286231465 hasRelatedWork W12904111 @default.
- W4286231465 hasRelatedWork W14632104 @default.
- W4286231465 hasRelatedWork W1674447 @default.
- W4286231465 hasRelatedWork W2177595 @default.
- W4286231465 hasRelatedWork W2956227 @default.
- W4286231465 hasRelatedWork W3979659 @default.
- W4286231465 hasRelatedWork W4529005 @default.
- W4286231465 hasRelatedWork W9043603 @default.
- W4286231465 isParatext "false" @default.
- W4286231465 isRetracted "false" @default.
- W4286231465 workType "article" @default.
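The listing above can be reproduced programmatically. Below is a minimal sketch that builds the same `{ <work> ?p ?o }` pattern from the header and sends it to SemOpenAlex's public SPARQL endpoint; the endpoint URL (`https://semopenalex.org/sparql`) and the JSON result handling are assumptions based on the service's standard SPARQL-over-HTTP interface, not part of the listing itself.

```python
# Sketch: query SemOpenAlex for all (predicate, object) pairs of one work.
# Assumes the public SPARQL endpoint at https://semopenalex.org/sparql
# returns results in the standard application/sparql-results+json format.
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint
WORK_IRI = "https://semopenalex.org/work/W4286231465"

def build_query(work_iri: str) -> str:
    """Build the pattern from the header: { <work> ?p ?o . }."""
    return f"SELECT ?p ?o WHERE {{ <{work_iri}> ?p ?o . }}"

def fetch_triples(work_iri: str):
    """Send the query and return (predicate, object) pairs (needs network)."""
    params = urllib.parse.urlencode({"query": build_query(work_iri)})
    req = urllib.request.Request(
        f"{ENDPOINT}?{params}",
        headers={"Accept": "application/sparql-results+json"},
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    # Standard SPARQL JSON layout: results -> bindings -> {var: {value: ...}}
    return [
        (b["p"]["value"], b["o"]["value"])
        for b in data["results"]["bindings"]
    ]

query = build_query(WORK_IRI)
```

Calling `fetch_triples(WORK_IRI)` should yield the same 65 predicate/object pairs shown above (e.g. `title`, `doi`, the `hasAuthorship` links), subject to any changes in the live data since this snapshot was taken.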