Matches in SemOpenAlex for { <https://semopenalex.org/work/W4308159588> ?p ?o ?g. }
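These matches can be retrieved programmatically. Below is a minimal sketch, assuming the public SemOpenAlex SPARQL endpoint at https://semopenalex.org/sparql (the endpoint URL is an assumption, not stated in this listing) and expressing the quad pattern `?p ?o ?g` with a standard GRAPH clause over the SPARQL HTTP protocol:

```python
# Minimal sketch: fetch all (?p, ?o, ?g) matches for the work above.
# The endpoint URL is assumed; the request follows the standard SPARQL protocol.
import requests

ENDPOINT = "https://semopenalex.org/sparql"  # assumed public endpoint

QUERY = """
SELECT ?p ?o ?g
WHERE { GRAPH ?g { <https://semopenalex.org/work/W4308159588> ?p ?o . } }
"""

resp = requests.get(
    ENDPOINT,
    params={"query": QUERY},
    headers={"Accept": "application/sparql-results+json"},
    timeout=30,
)
resp.raise_for_status()

# SPARQL JSON results: one binding dict per row, each variable -> {"value": ...}.
for row in resp.json()["results"]["bindings"]:
    print(row["p"]["value"], row["o"]["value"], row["g"]["value"])
```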
- W4308159588 endingPage "4622" @default.
- W4308159588 startingPage "4519" @default.
- W4308159588 abstract "The learning process and hyper-parameter optimization of artificial neural networks (ANNs) and deep learning (DL) architectures is considered one of the most challenging machine learning problems. Several past studies have used gradient-based back propagation methods to train DL architectures. However, gradient-based methods have major drawbacks such as stucking at local minimums in multi-objective cost functions, expensive execution time due to calculating gradient information with thousands of iterations and needing the cost functions to be continuous. Since training the ANNs and DLs is an NP-hard optimization problem, their structure and parameters optimization using the meta-heuristic (MH) algorithms has been considerably raised. MH algorithms can accurately formulate the optimal estimation of DL components (such as hyper-parameter, weights, number of layers, number of neurons, learning rate, etc.). This paper provides a comprehensive review of the optimization of ANNs and DLs using MH algorithms. In this paper, we have reviewed the latest developments in the use of MH algorithms in the DL and ANN methods, presented their disadvantages and advantages, and pointed out some research directions to fill the gaps between MHs and DL methods. Moreover, it has been explained that the evolutionary hybrid architecture still has limited applicability in the literature. Also, this paper classifies the latest MH algorithms in the literature to demonstrate their effectiveness in DL and ANN training for various applications. Most researchers tend to extend novel hybrid algorithms by combining MHs to optimize the hyper-parameters of DLs and ANNs. The development of hybrid MHs helps improving algorithms performance and capable of solving complex optimization problems. In general, the optimal performance of the MHs should be able to achieve a suitable trade-off between exploration and exploitation features. Hence, this paper tries to summarize various MH algorithms in terms of the convergence trend, exploration, exploitation, and the ability to avoid local minima. The integration of MH with DLs is expected to accelerate the training process in the coming few years. However, relevant publications in this way are still rare." @default.
- W4308159588 created "2022-11-08" @default.
- W4308159588 creator A5013652471 @default.
- W4308159588 creator A5041320440 @default.
- W4308159588 date "2022-10-31" @default.
- W4308159588 modified "2023-10-14" @default.
- W4308159588 title "Application of Meta-Heuristic Algorithms for Training Neural Networks and Deep Learning Architectures: A Comprehensive Review" @default.
- W4308159588 cites W1127095158 @default.
- W4308159588 cites W1203059390 @default.
- W4308159588 cites W139960808 @default.
- W4308159588 cites W1490180010 @default.
- W4308159588 cites W1523741643 @default.
- W4308159588 cites W1528419590 @default.
- W4308159588 cites W1538895518 @default.
- W4308159588 cites W1544152422 @default.
- W4308159588 cites W1579299488 @default.
- W4308159588 cites W1595159159 @default.
- W4308159588 cites W16088600 @default.
- W4308159588 cites W1614673366 @default.
- W4308159588 cites W1617447347 @default.
- W4308159588 cites W1623397821 @default.
- W4308159588 cites W1772612928 @default.
- W4308159588 cites W1801780804 @default.
- W4308159588 cites W1840945935 @default.
- W4308159588 cites W1888884532 @default.
- W4308159588 cites W1936687626 @default.
- W4308159588 cites W1959523037 @default.
- W4308159588 cites W1965292083 @default.
- W4308159588 cites W1966124573 @default.
- W4308159588 cites W1966198712 @default.
- W4308159588 cites W1967402817 @default.
- W4308159588 cites W1968524744 @default.
- W4308159588 cites W1969487735 @default.
- W4308159588 cites W1971259134 @default.
- W4308159588 cites W1973141564 @default.
- W4308159588 cites W1974803434 @default.
- W4308159588 cites W1976744965 @default.
- W4308159588 cites W1977113083 @default.
- W4308159588 cites W1977707084 @default.
- W4308159588 cites W1977737282 @default.
- W4308159588 cites W1980058421 @default.
- W4308159588 cites W1980067289 @default.
- W4308159588 cites W1984049995 @default.
- W4308159588 cites W1984691417 @default.
- W4308159588 cites W1985460844 @default.
- W4308159588 cites W1985578208 @default.
- W4308159588 cites W1985658808 @default.
- W4308159588 cites W1985773453 @default.
- W4308159588 cites W1987115965 @default.
- W4308159588 cites W1987218880 @default.
- W4308159588 cites W1988539111 @default.
- W4308159588 cites W1993885071 @default.
- W4308159588 cites W1994453101 @default.
- W4308159588 cites W1999392207 @default.
- W4308159588 cites W2001646920 @default.
- W4308159588 cites W2001912407 @default.
- W4308159588 cites W2002302337 @default.
- W4308159588 cites W2005972692 @default.
- W4308159588 cites W2006145851 @default.
- W4308159588 cites W2006544565 @default.
- W4308159588 cites W2007898191 @default.
- W4308159588 cites W2010334716 @default.
- W4308159588 cites W2010541215 @default.
- W4308159588 cites W2012562359 @default.
- W4308159588 cites W2014385984 @default.
- W4308159588 cites W2015265162 @default.
- W4308159588 cites W2015304908 @default.
- W4308159588 cites W2017196268 @default.
- W4308159588 cites W2020329107 @default.
- W4308159588 cites W2021309800 @default.
- W4308159588 cites W2021473921 @default.
- W4308159588 cites W2021826571 @default.
- W4308159588 cites W2022290342 @default.
- W4308159588 cites W2023591337 @default.
- W4308159588 cites W2024060531 @default.
- W4308159588 cites W2024916621 @default.
- W4308159588 cites W2031014442 @default.
- W4308159588 cites W2032944328 @default.
- W4308159588 cites W2033011996 @default.
- W4308159588 cites W2033731173 @default.
- W4308159588 cites W2036490237 @default.
- W4308159588 cites W2039249480 @default.
- W4308159588 cites W2039797595 @default.
- W4308159588 cites W2041724917 @default.
- W4308159588 cites W2042656126 @default.
- W4308159588 cites W2043438873 @default.
- W4308159588 cites W2046352503 @default.
- W4308159588 cites W2046904226 @default.
- W4308159588 cites W2048570605 @default.
- W4308159588 cites W2050195882 @default.
- W4308159588 cites W2050426625 @default.
- W4308159588 cites W2051152418 @default.
- W4308159588 cites W2054132868 @default.
- W4308159588 cites W2055756687 @default.
- W4308159588 cites W2056308903 @default.
- W4308159588 cites W2056811412 @default.
- W4308159588 cites W2058906820 @default.
- W4308159588 cites W2058986669 @default.
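The abstract above argues that MH algorithms can train networks without gradient information, provided they balance exploration and exploitation. As a minimal illustrative sketch of that idea (not the reviewed paper's method), the following evolution strategy trains a tiny 2-2-1 network on XOR in pure Python; the network size, population size, mutation scale, and generation count are all assumed values chosen for illustration:

```python
# Illustrative sketch: gradient-free training of a tiny 2-2-1 network on XOR
# with a simple (mu+lambda)-style evolution strategy. All parameter values
# here are assumptions for demonstration, not taken from the paper.
import math
import random

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
N_WEIGHTS = 9  # 4 hidden weights + 2 hidden biases + 2 output weights + 1 output bias

def sigmoid(z):
    # Clamp to avoid math.exp overflow when mutations produce extreme weights.
    if z < -60:
        return 0.0
    if z > 60:
        return 1.0
    return 1.0 / (1.0 + math.exp(-z))

def forward(w, x):
    # Two sigmoid hidden units, one sigmoid output unit.
    h = [sigmoid(w[2 * i] * x[0] + w[2 * i + 1] * x[1] + w[4 + i]) for i in range(2)]
    return sigmoid(w[6] * h[0] + w[7] * h[1] + w[8])

def loss(w):
    # Mean squared error over the four XOR patterns (no gradients needed).
    return sum((forward(w, x) - y) ** 2 for x, y in XOR) / len(XOR)

def evolve(pop_size=30, generations=300, sigma=0.5, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-1, 1) for _ in range(N_WEIGHTS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=loss)
        parents = pop[: pop_size // 2]  # exploitation: keep the fittest half
        children = [
            [g + rng.gauss(0, sigma) for g in rng.choice(parents)]  # exploration
            for _ in range(pop_size - len(parents))
        ]
        pop = parents + children
    return min(pop, key=loss)

best = evolve()
print("final MSE:", round(loss(best), 4))
for x, y in XOR:
    print(x, "->", round(forward(best, x), 3), "target", y)
```

The truncation to `parents` supplies exploitation (selection pressure toward low loss), while Gaussian mutation of every gene supplies exploration; the mutation scale `sigma` is one knob controlling the trade-off the abstract highlights.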