Matches in SemOpenAlex for { <https://semopenalex.org/work/W55941092> ?p ?o ?g. }
Showing items 1 to 91 of 91 with 100 items per page.
- W55941092 endingPage "465" @default.
- W55941092 startingPage "463" @default.
- W55941092 abstract "In this paper, we apply artificial neural networks to control the targeting system of a robotic tank in a tank-combat computer game (RoboCode). We suggest an algorithm that not only trains the connection weights of the neural network, but simultaneously searches for an optimum network architecture. Our hybrid evolutionary algorithm (PSONet) uses modified particle swarm optimisation to train the connection weights and four architecture mutation operators to evolve the appropriate architecture of the network, together with a new fitness function to guide the evolution. Introduction and Background Artificial Neural Networks (ANNs) have been used in a variety of areas during the last thirty years (Meyer 1998; Russell & Norvig 2003; Scapura 1995), and more recently in computer games to improve the quality of the artificial intelligence engine in these games (Schaeffer 2000). This paper discusses the application of ANNs to control the targeting system of a robotic tank in a tank-combat game, using the Robocode environment (Robocode 2005; Robowiki 2005) as a platform. ANNs have the ability to learn over time and therefore to adapt to new situations and strategies. In general, the structure of an ANN determines its performance. Some traditional algorithms use a fixed structure and only train the weights of the connections to optimise the network. Others discover a relatively optimal architecture first and then train the weights on this architecture (Koza & Rice 1991; Odri, Petrovacki, & Krstonosic 1993; Yao & Liu 1997). Since these algorithms are very prone to overfitting or convergence to local optima, we suggest applying a hybrid evolutionary algorithm (PSONet), which simultaneously finds the best structure for the ANN and optimal weights for its connections by using a new fitness function. 
Methodology and Architecture We restrict ourselves here to fully connected multilayer feedforward networks, i.e., neural networks in which information is passed from the input nodes through the hidden nodes to the output nodes. (Copyright © 2006, American Association for Artificial Intelligence (www.aaai.org). All rights reserved.) Theoretical results show that, given enough hidden nodes, such a network can approximate any reasonable function to any required degree of accuracy. This is usually achieved by training the network with an error backpropagation algorithm (Scapura 1995). Since backpropagation just optimizes the weights of the connections on a predefined neural network architecture, it is difficult to avoid the underfitting or overfitting problem. An evolutionary approach (Salomon 1998), like particle swarm optimisation, can overcome these problems. Particle swarm optimisation (PSO) is a stochastic global optimization technique inspired by the social behavior of bird flocking (Kennedy & Eberhart 1995; Shen et al. 2004; Zhang, Shao, & Li 2000). The particles share information with each other, in particular information about the quality of the solutions they have found at specific points in the search space. The best solution discovered by a specific particle is referred to as its personal best solution. Particles move towards personal best solutions with certain velocities in order to discover improved solutions. We propose a modified particle swarm optimisation algorithm with an annealing factor, combined with architecture mutation operators. The proposed approach optimises the connection weights and the architectures of the neural networks simultaneously and thereby avoids the problem of slow convergence speed and the tendency to overfitting. 
The particle swarm optimisation algorithm is used for training the weights of the neural networks, whereas the architecture mutation operators (hidden node deletion, connection deletion, connection addition, and hidden node addition) are applied to find the optimal network structure. The individual steps of our algorithm, called PSONet, are summarized in Figure 1. The efficiency and quality of the algorithm depend significantly on the fitness function used to rank the neural networks. It is based on two factors: the prediction accuracy and the complexity of the network. The accuracy of a neural network is defined by the root-mean-square error (RMSE): RMSE = √( ∑_{i,j} (O_ij − T_ij)² / (S·N) ), where O_ij and T_ij are the actual value and target value, respectively, for the j-th output in the i-th training example. S is the size of the training set and N the number of output nodes." @default.
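The modified PSO update described in the abstract can be illustrated with a minimal sketch. This is not the paper's PSONet implementation: the inertia factor `w` (which the paper's annealing factor would decrease over iterations), the acceleration constants `c1`/`c2`, and all function names are illustrative assumptions, and architecture mutation is omitted.

```python
import random

def pso_step(positions, velocities, pbest, pbest_fit, gbest, fitness,
             w=0.7, c1=1.5, c2=1.5):
    """One PSO iteration over a swarm of weight vectors (minimisation).

    w is the inertia factor (an annealing schedule would shrink it over
    time); c1/c2 weight the pull towards personal and global bests.
    Constants and names are illustrative, not taken from the paper.
    """
    for i in range(len(positions)):
        for d in range(len(positions[i])):
            r1, r2 = random.random(), random.random()
            # velocity = inertia + cognitive pull + social pull
            velocities[i][d] = (w * velocities[i][d]
                                + c1 * r1 * (pbest[i][d] - positions[i][d])
                                + c2 * r2 * (gbest[d] - positions[i][d]))
            positions[i][d] += velocities[i][d]
        f = fitness(positions[i])
        if f < pbest_fit[i]:  # lower fitness is better
            pbest_fit[i] = f
            pbest[i] = positions[i][:]
    # return index of the best particle so the caller can refresh gbest
    return min(range(len(positions)), key=lambda i: pbest_fit[i])
```

In PSONet the fitness would be the network-quality measure described below (accuracy plus a complexity term), and mutation operators would alter the architecture between such weight-training steps.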
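The RMSE term of the fitness function, as defined in the abstract, can be computed directly; this sketch assumes outputs and targets are given as S rows of N values each (the function name is illustrative).

```python
import math

def rmse(outputs, targets):
    """Root-mean-square error over S training examples and N output nodes:
    sqrt( sum_{i,j} (O_ij - T_ij)^2 / (S * N) ), matching the abstract."""
    S = len(outputs)       # size of the training set
    N = len(outputs[0])    # number of output nodes
    total = sum((outputs[i][j] - targets[i][j]) ** 2
                for i in range(S) for j in range(N))
    return math.sqrt(total / (S * N))
```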
- W55941092 created "2016-06-24" @default.
- W55941092 creator A5013162759 @default.
- W55941092 creator A5058596166 @default.
- W55941092 date "2006-01-01" @default.
- W55941092 modified "2023-09-26" @default.
- W55941092 title "An Artificial Neural Network for a Tank Targeting System." @default.
- W55941092 cites W120239626 @default.
- W55941092 cites W1588537248 @default.
- W55941092 cites W1978004231 @default.
- W55941092 cites W2006145851 @default.
- W55941092 cites W2116640126 @default.
- W55941092 cites W2122410182 @default.
- W55941092 cites W2127011380 @default.
- W55941092 cites W2134514463 @default.
- W55941092 cites W2136271046 @default.
- W55941092 cites W2167662333 @default.
- W55941092 hasPublicationYear "2006" @default.
- W55941092 type Work @default.
- W55941092 sameAs 55941092 @default.
- W55941092 citedByCount "0" @default.
- W55941092 crossrefType "proceedings-article" @default.
- W55941092 hasAuthorship W55941092A5013162759 @default.
- W55941092 hasAuthorship W55941092A5058596166 @default.
- W55941092 hasConcept C105902424 @default.
- W55941092 hasConcept C119857082 @default.
- W55941092 hasConcept C123657996 @default.
- W55941092 hasConcept C142362112 @default.
- W55941092 hasConcept C153349607 @default.
- W55941092 hasConcept C154945302 @default.
- W55941092 hasConcept C159149176 @default.
- W55941092 hasConcept C162324750 @default.
- W55941092 hasConcept C190839683 @default.
- W55941092 hasConcept C193415008 @default.
- W55941092 hasConcept C205649164 @default.
- W55941092 hasConcept C22019652 @default.
- W55941092 hasConcept C2777303404 @default.
- W55941092 hasConcept C31258907 @default.
- W55941092 hasConcept C41008148 @default.
- W55941092 hasConcept C50522688 @default.
- W55941092 hasConcept C50644808 @default.
- W55941092 hasConcept C58640448 @default.
- W55941092 hasConcept C85617194 @default.
- W55941092 hasConceptScore W55941092C105902424 @default.
- W55941092 hasConceptScore W55941092C119857082 @default.
- W55941092 hasConceptScore W55941092C123657996 @default.
- W55941092 hasConceptScore W55941092C142362112 @default.
- W55941092 hasConceptScore W55941092C153349607 @default.
- W55941092 hasConceptScore W55941092C154945302 @default.
- W55941092 hasConceptScore W55941092C159149176 @default.
- W55941092 hasConceptScore W55941092C162324750 @default.
- W55941092 hasConceptScore W55941092C190839683 @default.
- W55941092 hasConceptScore W55941092C193415008 @default.
- W55941092 hasConceptScore W55941092C205649164 @default.
- W55941092 hasConceptScore W55941092C22019652 @default.
- W55941092 hasConceptScore W55941092C2777303404 @default.
- W55941092 hasConceptScore W55941092C31258907 @default.
- W55941092 hasConceptScore W55941092C41008148 @default.
- W55941092 hasConceptScore W55941092C50522688 @default.
- W55941092 hasConceptScore W55941092C50644808 @default.
- W55941092 hasConceptScore W55941092C58640448 @default.
- W55941092 hasConceptScore W55941092C85617194 @default.
- W55941092 hasLocation W559410921 @default.
- W55941092 hasOpenAccess W55941092 @default.
- W55941092 hasPrimaryLocation W559410921 @default.
- W55941092 hasRelatedWork W1504403716 @default.
- W55941092 hasRelatedWork W1925618211 @default.
- W55941092 hasRelatedWork W1981009593 @default.
- W55941092 hasRelatedWork W2049105312 @default.
- W55941092 hasRelatedWork W209715872 @default.
- W55941092 hasRelatedWork W2133584084 @default.
- W55941092 hasRelatedWork W2144091660 @default.
- W55941092 hasRelatedWork W2183076374 @default.
- W55941092 hasRelatedWork W2280350628 @default.
- W55941092 hasRelatedWork W2548869838 @default.
- W55941092 hasRelatedWork W2549539357 @default.
- W55941092 hasRelatedWork W2754920661 @default.
- W55941092 hasRelatedWork W2949852249 @default.
- W55941092 hasRelatedWork W2991904866 @default.
- W55941092 hasRelatedWork W3090108433 @default.
- W55941092 hasRelatedWork W3184152003 @default.
- W55941092 hasRelatedWork W3206974103 @default.
- W55941092 hasRelatedWork W42067512 @default.
- W55941092 hasRelatedWork W586773592 @default.
- W55941092 hasRelatedWork W2926722495 @default.
- W55941092 isParatext "false" @default.
- W55941092 isRetracted "false" @default.
- W55941092 magId "55941092" @default.
- W55941092 workType "article" @default.