Matches in SemOpenAlex for { <https://semopenalex.org/work/W4380346028> ?p ?o ?g. }
Showing items 1 to 75 of 75, with 100 items per page.
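The header shows the quad pattern behind this listing. As a minimal sketch (assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql and that the `?g` position denotes the named graph), the same matches could be retrieved with:

```sparql
# Retrieve every outgoing triple of the work, together with the named graph
# it appears in (the ?g position of the pattern above).
SELECT ?p ?o ?g
WHERE {
  GRAPH ?g {
    <https://semopenalex.org/work/W4380346028> ?p ?o .
  }
}
LIMIT 100
```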
- W4380346028 abstract "Sparse training (ST) aims to improve deep learning by replacing fully connected artificial neural networks (ANNs) with sparse ones, akin to the structure of brain networks. It might therefore be beneficial to borrow brain-inspired learning paradigms from complex network intelligence theory. Epitopological learning (EL) is a field of network science that studies how to implement learning on networks by changing the shape of their connectivity structure (epitopological plasticity). One way to implement EL is via link prediction: predicting the existence likelihood of non-observed links in a network. Cannistraci-Hebb (CH) learning theory inspired the CH3-L3 network automata rule for link prediction, which is effective for general-purpose link prediction. Here, starting from CH3-L3, we propose Epitopological Sparse Ultra-deep Learning (ESUL) to apply EL to sparse training. In empirical experiments, we find that ESUL learns ANNs with a sparse hyperbolic topology in which a community layer organization emerges that is ultra-deep (meaning that each layer also has an internal depth due to a power-law node hierarchy). Furthermore, we discover that ESUL automatically sparsifies the neurons during training (leaving as few as 30% of the neurons in the hidden layers); this process of dynamic node removal is called percolation. We then design CH training (CHT), a training methodology that puts ESUL at its heart, with the aim of enhancing prediction performance. CHT consists of 4 parts: (i) correlated sparse topological initialization (CSTI), to initialize the network with a hierarchical topology; (ii) sparse weighting initialization (SWI), to tailor weight initialization to a sparse topology; (iii) ESUL, to shape the ANN topology during training; and (iv) early stop with weight refinement, to tune only the weights once the topology reaches stability. We conduct experiments on 6 datasets and 3 network structures (MLPs, VGG16, Transformer), comparing CHT to a state-of-the-art (SOTA) sparse training method and to fully connected networks. By significantly reducing the number of nodes while retaining performance, CHT represents the first example of parsimony sparse training." @default.
- W4380346028 created "2023-06-13" @default.
- W4380346028 creator A5007730744 @default.
- W4380346028 creator A5009598811 @default.
- W4380346028 creator A5032429023 @default.
- W4380346028 creator A5052734823 @default.
- W4380346028 creator A5091598741 @default.
- W4380346028 date "2023-06-09" @default.
- W4380346028 modified "2023-10-16" @default.
- W4380346028 title "Epitopological Sparse Ultra-Deep Learning: A Brain-Network Topological Theory Carves Communities in Sparse and Percolated Hyperbolic ANNs" @default.
- W4380346028 doi "https://doi.org/10.20944/preprints202207.0139.v2" @default.
- W4380346028 hasPublicationYear "2023" @default.
- W4380346028 type Work @default.
- W4380346028 citedByCount "0" @default.
- W4380346028 crossrefType "posted-content" @default.
- W4380346028 hasAuthorship W4380346028A5007730744 @default.
- W4380346028 hasAuthorship W4380346028A5009598811 @default.
- W4380346028 hasAuthorship W4380346028A5032429023 @default.
- W4380346028 hasAuthorship W4380346028A5052734823 @default.
- W4380346028 hasAuthorship W4380346028A5091598741 @default.
- W4380346028 hasBestOaLocation W43803460281 @default.
- W4380346028 hasConcept C108583219 @default.
- W4380346028 hasConcept C111919701 @default.
- W4380346028 hasConcept C114466953 @default.
- W4380346028 hasConcept C114614502 @default.
- W4380346028 hasConcept C119857082 @default.
- W4380346028 hasConcept C121332964 @default.
- W4380346028 hasConcept C154945302 @default.
- W4380346028 hasConcept C183115368 @default.
- W4380346028 hasConcept C184720557 @default.
- W4380346028 hasConcept C199360897 @default.
- W4380346028 hasConcept C199845137 @default.
- W4380346028 hasConcept C24890656 @default.
- W4380346028 hasConcept C33923547 @default.
- W4380346028 hasConcept C41008148 @default.
- W4380346028 hasConcept C50644808 @default.
- W4380346028 hasConcept C62520636 @default.
- W4380346028 hasConcept C62611344 @default.
- W4380346028 hasConcept C80444323 @default.
- W4380346028 hasConcept C97385483 @default.
- W4380346028 hasConceptScore W4380346028C108583219 @default.
- W4380346028 hasConceptScore W4380346028C111919701 @default.
- W4380346028 hasConceptScore W4380346028C114466953 @default.
- W4380346028 hasConceptScore W4380346028C114614502 @default.
- W4380346028 hasConceptScore W4380346028C119857082 @default.
- W4380346028 hasConceptScore W4380346028C121332964 @default.
- W4380346028 hasConceptScore W4380346028C154945302 @default.
- W4380346028 hasConceptScore W4380346028C183115368 @default.
- W4380346028 hasConceptScore W4380346028C184720557 @default.
- W4380346028 hasConceptScore W4380346028C199360897 @default.
- W4380346028 hasConceptScore W4380346028C199845137 @default.
- W4380346028 hasConceptScore W4380346028C24890656 @default.
- W4380346028 hasConceptScore W4380346028C33923547 @default.
- W4380346028 hasConceptScore W4380346028C41008148 @default.
- W4380346028 hasConceptScore W4380346028C50644808 @default.
- W4380346028 hasConceptScore W4380346028C62520636 @default.
- W4380346028 hasConceptScore W4380346028C62611344 @default.
- W4380346028 hasConceptScore W4380346028C80444323 @default.
- W4380346028 hasConceptScore W4380346028C97385483 @default.
- W4380346028 hasLocation W43803460281 @default.
- W4380346028 hasOpenAccess W4380346028 @default.
- W4380346028 hasPrimaryLocation W43803460281 @default.
- W4380346028 hasRelatedWork W1501213224 @default.
- W4380346028 hasRelatedWork W2126887587 @default.
- W4380346028 hasRelatedWork W2795261237 @default.
- W4380346028 hasRelatedWork W3082895349 @default.
- W4380346028 hasRelatedWork W3123344745 @default.
- W4380346028 hasRelatedWork W4210841218 @default.
- W4380346028 hasRelatedWork W4223943233 @default.
- W4380346028 hasRelatedWork W4302303815 @default.
- W4380346028 hasRelatedWork W4312200629 @default.
- W4380346028 hasRelatedWork W4380075502 @default.
- W4380346028 isParatext "false" @default.
- W4380346028 isRetracted "false" @default.
- W4380346028 workType "article" @default.
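The creator and hasAuthorship triples above point to author and authorship entities rather than literal names. A follow-up query sketch that resolves them (assuming the SemOpenAlex schema links works to authors via dcterms:creator and labels authors with foaf:name; substitute the actual predicates if the endpoint exposes different ones):

```sparql
PREFIX dcterms: <http://purl.org/dc/terms/>
PREFIX foaf:    <http://xmlns.com/foaf/0.1/>

# Resolve the work's creator links to human-readable author names.
# The two predicates are assumptions about the SemOpenAlex schema.
SELECT ?author ?name
WHERE {
  <https://semopenalex.org/work/W4380346028> dcterms:creator ?author .
  ?author foaf:name ?name .
}
```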