Matches in SemOpenAlex for { <https://semopenalex.org/work/W2955388575> ?p ?o ?g. }
- W2955388575 abstract "We have designed di erent heuristics for both searching on Massive graphs andregularizing Deep Neural Networks in this work.Both the problem of nding a minimum vertex cover (MinVC) and the maximumedge weight clique (MEWC) in a graph are prominent NP-hard problems of greatimportance in both theory and application. During recent decades, there has beenmuch interest in nding optimal or near-optimal solutions to these two problems.Many existing heuristic algorithms for MinVC are based on local search strategies.An algorithm called FastVC takes a rst step towards solving the MinVCproblem for large real-world graphs. However, FastVC may be trapped at localminima during the local search stage due to the lack of suitable diversi cationmechanisms. Besides, since the traditional best-picking heuristic was believed tobe of high complexity, FastVC replaces it with an approximate best-picking strategy.However, best-picking has been proved to be robust for a wide range ofproblems, so abandoning it may be a great sacri ce. Therefore, we rstly designa diversi cation heuristic to help FastVC escape from local minima, and the proposedsolver is named WalkVC. Secondly, we develop a local search MinVC solver,named NoiseVC, which utilizes best-picking (low complexity) with noise to removevertices during the local search stage in massive graphs. On the other hand, mostof existing heuristics for the MEWC problem focus on academic benchmarks withrelatively size. However, very little attention was paid to solving the MEWCproblem in large sparse graphs. In this thesis, we exploit the so-called deterministictournament selection (DTS) heuristic for selecting edges to improve the localsearch based MEWC algorithms.Deep Neural Networks (DNN), have an extremely large number of parameterscomparing with traditional machine earning methods, su er from the the problemof over tting. Dropout [Hinton et al., 2012, Srivastava et al., 2014] has been proposedto address this problem. Dropout is an useful technique for regularizing andpreventing the co-adaptation of neurons in DNN. It randomly drops units with aprobability p during the training stage of DNN to avoid over tting. The working mechanism of dropout can be interpreted as approximately and exponentially combiningmany di erent neural network architectures e ciently, leading to a powerfulensemble. We propose a novel diversi cation strategy for dropout named TabuDropout, which aims at generating more di erent neural network architectures infewer numbers of iterations. Besides, a recent work named Curriculum Dropoutachieves the state-of-the-art performance among the dropout variants by using ascheduled p instead of a xed one. It gradually increases the dropping probabilityfrom 0 to 1 ���� p according to a time scheduling from curriculum learning. Theprimary intuition is that dropout seems unnecessary at the beginning of trainingand Curriculum Dropout starts training the whole neural networks without dropping,which is called starting easy. In this thesis, we design a new scheduleddropout strategy using starting small instead of starting easy, which graduallydecreases the dropping probability from 1 to p. We call this strategy AnnealedCurriculum Dropout.Experiments conducted on related public standard datasets show that our proposedheuristics for both searching on massive graphs and regularizing deep learninghave achieved better performance than the comparison methods." @default.
- W2955388575 created "2019-07-12" @default.
- W2955388575 creator A5006652992 @default.
- W2955388575 date "2018-12-01" @default.
- W2955388575 modified "2023-09-27" @default.
- W2955388575 title "Searching on Massive Graphs and Regularizing Deep Learning" @default.
- W2955388575 cites W1505245505 @default.
- W2955388575 cites W1508964797 @default.
- W2955388575 cites W1509289098 @default.
- W2955388575 cites W2015861736 @default.
- W2955388575 cites W2026320281 @default.
- W2955388575 cites W2037016941 @default.
- W2955388575 cites W2038550366 @default.
- W2955388575 cites W2046112356 @default.
- W2955388575 cites W2062989416 @default.
- W2955388575 cites W2068617151 @default.
- W2955388575 cites W2072210264 @default.
- W2955388575 cites W2076063813 @default.
- W2955388575 cites W2078797727 @default.
- W2955388575 cites W2079439262 @default.
- W2955388575 cites W2094314299 @default.
- W2955388575 cites W2094836579 @default.
- W2955388575 cites W2095705004 @default.
- W2955388575 cites W2098667053 @default.
- W2955388575 cites W2104670598 @default.
- W2955388575 cites W2108598243 @default.
- W2955388575 cites W2110504071 @default.
- W2955388575 cites W2112796928 @default.
- W2955388575 cites W2116619380 @default.
- W2955388575 cites W2117130368 @default.
- W2955388575 cites W2119089575 @default.
- W2955388575 cites W2136552606 @default.
- W2955388575 cites W2136836265 @default.
- W2955388575 cites W2150341604 @default.
- W2955388575 cites W2156252543 @default.
- W2955388575 cites W2160660594 @default.
- W2955388575 cites W2165372647 @default.
- W2955388575 cites W2171928131 @default.
- W2955388575 cites W2211710464 @default.
- W2955388575 cites W2251648435 @default.
- W2955388575 cites W2294347342 @default.
- W2955388575 cites W2321167683 @default.
- W2955388575 cites W2401610261 @default.
- W2955388575 cites W2409027918 @default.
- W2955388575 cites W2476141096 @default.
- W2955388575 cites W2513955757 @default.
- W2955388575 cites W2532064612 @default.
- W2955388575 cites W2604438066 @default.
- W2955388575 cites W2610880400 @default.
- W2955388575 cites W2624413595 @default.
- W2955388575 cites W2726308634 @default.
- W2955388575 cites W2750384547 @default.
- W2955388575 cites W2781596748 @default.
- W2955388575 cites W2800112096 @default.
- W2955388575 cites W2919115771 @default.
- W2955388575 cites W2962800229 @default.
- W2955388575 cites W3004540582 @default.
- W2955388575 cites W3099305697 @default.
- W2955388575 cites W3118608800 @default.
- W2955388575 cites W391985582 @default.
- W2955388575 cites W4919037 @default.
- W2955388575 cites W75795297 @default.
- W2955388575 doi "https://doi.org/10.25904/1912/835" @default.
- W2955388575 hasPublicationYear "2018" @default.
- W2955388575 type Work @default.
- W2955388575 sameAs 2955388575 @default.
- W2955388575 citedByCount "0" @default.
- W2955388575 crossrefType "dissertation" @default.
- W2955388575 hasAuthorship W2955388575A5006652992 @default.
- W2955388575 hasConcept C102192266 @default.
- W2955388575 hasConcept C11413529 @default.
- W2955388575 hasConcept C126255220 @default.
- W2955388575 hasConcept C127705205 @default.
- W2955388575 hasConcept C132525143 @default.
- W2955388575 hasConcept C134306372 @default.
- W2955388575 hasConcept C135320971 @default.
- W2955388575 hasConcept C150997102 @default.
- W2955388575 hasConcept C154945302 @default.
- W2955388575 hasConcept C160446614 @default.
- W2955388575 hasConcept C173801870 @default.
- W2955388575 hasConcept C186633575 @default.
- W2955388575 hasConcept C2778770139 @default.
- W2955388575 hasConcept C33923547 @default.
- W2955388575 hasConcept C40687702 @default.
- W2955388575 hasConcept C41008148 @default.
- W2955388575 hasConcept C80444323 @default.
- W2955388575 hasConcept C80899671 @default.
- W2955388575 hasConceptScore W2955388575C102192266 @default.
- W2955388575 hasConceptScore W2955388575C11413529 @default.
- W2955388575 hasConceptScore W2955388575C126255220 @default.
- W2955388575 hasConceptScore W2955388575C127705205 @default.
- W2955388575 hasConceptScore W2955388575C132525143 @default.
- W2955388575 hasConceptScore W2955388575C134306372 @default.
- W2955388575 hasConceptScore W2955388575C135320971 @default.
- W2955388575 hasConceptScore W2955388575C150997102 @default.
- W2955388575 hasConceptScore W2955388575C154945302 @default.
- W2955388575 hasConceptScore W2955388575C160446614 @default.
- W2955388575 hasConceptScore W2955388575C173801870 @default.
- W2955388575 hasConceptScore W2955388575C186633575 @default.
- W2955388575 hasConceptScore W2955388575C2778770139 @default.
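The abstract above describes NoiseVC only at a high level: best-picking with noise for removing vertices during local search on massive graphs. As a rough illustration of that idea (not the thesis's actual implementation; the `noise` parameter, the `loss` scoring, and all names here are assumptions), a minimal sketch in Python:

```python
import random

def pick_vertex_to_remove(cover, loss, noise=0.1, rng=random):
    """Best-picking with noise, in the spirit of NoiseVC as described in
    the abstract: usually remove the cover vertex whose removal uncovers
    the fewest edges (minimum 'loss'), but with probability `noise` pick
    a random cover vertex instead, to help escape local minima."""
    if rng.random() < noise:
        return rng.choice(list(cover))
    return min(cover, key=loss)

# Hypothetical toy instance: loss[v] = number of edges covered only by v.
cover = {1, 4, 7}
loss_table = {1: 3, 4: 0, 7: 2}
v = pick_vertex_to_remove(cover, loss=lambda u: loss_table[u], noise=0.2)
print(v)  # usually 4 (minimum loss), occasionally a random cover vertex
```

The noise term plays the diversification role the abstract attributes to NoiseVC, while the `min` step keeps the low-complexity best-picking behaviour.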
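Deterministic tournament selection (DTS), mentioned in the abstract for edge picking in MEWC local search, can likewise be sketched generically: sample a small tournament uniformly at random and deterministically keep the best-scoring candidate. The tournament size `k` and the edge-scoring function below are hypothetical placeholders, not taken from the thesis:

```python
import random

def dts_pick(candidates, score, k=4, rng=random):
    """Deterministic tournament selection: draw k candidates uniformly
    at random and return the one with the best score. Larger k behaves
    more greedily; smaller k injects more diversification."""
    pool = [rng.choice(candidates) for _ in range(k)]
    return max(pool, key=score)

# Hypothetical usage: edges as (u, v, weight), scored by their weight.
edges = [(0, 1, 5.0), (1, 2, 2.0), (2, 3, 9.0), (0, 3, 1.0)]
best = dts_pick(edges, score=lambda e: e[2], k=3)
print(best)
```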
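Finally, the dropout variants discussed in the abstract all revolve around the dropping probability p and how it is scheduled over training. A minimal sketch, assuming an exponential time schedule (the exact schedules used in the thesis are not given in the abstract): standard inverted dropout with a fixed p, a Curriculum-Dropout-style schedule that raises the dropping probability from 0, and an annealed "starting small" schedule that lowers it from 1 toward p.

```python
import numpy as np

def dropout(x, drop_p, rng, train=True):
    """Standard (inverted) dropout: zero each unit with probability
    drop_p and rescale survivors so the expected activation is kept."""
    if not train or drop_p == 0.0:
        return x
    keep_p = 1.0 - drop_p
    mask = rng.random(x.shape) < keep_p
    return x * mask / keep_p

def curriculum_drop_p(t, p_final, gamma=1e-3):
    """Increasing schedule in the spirit of Curriculum Dropout:
    drop probability rises from 0 toward p_final as step t grows."""
    return p_final * (1.0 - np.exp(-gamma * t))

def annealed_drop_p(t, p_final, gamma=1e-3):
    """'Starting small' schedule sketched from the abstract:
    drop probability starts at 1 and decays toward p_final."""
    return p_final + (1.0 - p_final) * np.exp(-gamma * t)

rng = np.random.default_rng(0)
x = rng.standard_normal(8)
for t in (0, 1000, 10000):
    print(t, curriculum_drop_p(t, 0.5), annealed_drop_p(t, 0.5))
print(dropout(x, curriculum_drop_p(5000, 0.5), rng))
```

Both schedules converge to the same final dropping probability; they differ only in whether training starts with no dropping ("starting easy") or with nearly everything dropped ("starting small"), which is exactly the contrast the abstract draws between Curriculum Dropout and Annealed Curriculum Dropout.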