Matches in SemOpenAlex for { <https://semopenalex.org/work/W4288804522> ?p ?o ?g. }
Showing items 1 to 63 of 63, with 100 items per page.
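The listing below can also be retrieved programmatically. A minimal sketch using only the Python standard library, assuming SemOpenAlex's public SPARQL endpoint at `https://semopenalex.org/sparql` (the endpoint URL and JSON result format are assumptions, not stated in this page):

```python
import json
import urllib.parse
import urllib.request

# Assumed public SPARQL endpoint for SemOpenAlex (not stated on this page).
ENDPOINT = "https://semopenalex.org/sparql"
WORK = "https://semopenalex.org/work/W4288804522"


def build_query(work_uri: str) -> str:
    """Return a SPARQL query listing every predicate/object pair for a work,
    mirroring the { <work> ?p ?o } pattern shown in the page header."""
    return f"SELECT ?p ?o WHERE {{ <{work_uri}> ?p ?o . }}"


def fetch_triples(work_uri: str) -> list:
    """Run the query against the endpoint and return the JSON result bindings."""
    url = ENDPOINT + "?" + urllib.parse.urlencode(
        {"query": build_query(work_uri), "format": "json"}
    )
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["results"]["bindings"]


if __name__ == "__main__":
    # Print the query itself; call fetch_triples(WORK) to hit the live endpoint.
    print(build_query(WORK))
```

Each returned binding corresponds to one line of the listing below (predicate and object for the work `W4288804522`).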
- W4288804522 abstract "For over a decade now, robotics and the use of artificial agents have become commonplace. Testing the performance of new path-finding or search-space optimization algorithms has also become a challenge, as they require a simulation or an environment in which to test them. The creation of artificial environments with artificial agents is one method employed to test such algorithms. Games have also become an environment in which to test them. The performance of the algorithms can be compared by using artificial agents that behave according to the algorithm in the environment they are placed in. One performance parameter is how quickly the agent is able to differentiate between rewarding actions and hostile actions. This can be tested by placing the agent in an environment with different types of hurdles, where the goal of the agent is to travel the farthest by choosing actions that avoid all the obstacles. The environment chosen is the game Flappy Bird. The goal of the game is to make the bird fly through a series of pipes of random heights. The bird must pass between these pipes and must not hit the top, the bottom, or the pipes themselves. The actions the bird can take are either to flap its wings or to drop with gravity. The algorithms enforced on the artificial agents are NeuroEvolution of Augmenting Topologies (NEAT) and reinforcement learning. The NEAT algorithm starts with an initial population of N artificial agents. They follow genetic algorithms by considering an objective function, crossover, mutation, and augmenting topologies. Reinforcement learning, on the other hand, remembers the state, the action taken at that state, and the reward received for that action, using a single agent and a Deep Q-learning Network. The performance of the NEAT algorithm improves as the initial population of the artificial agents is increased." @default.
- W4288804522 created "2022-07-30" @default.
- W4288804522 creator A5002730979 @default.
- W4288804522 creator A5059744082 @default.
- W4288804522 date "2022-07-28" @default.
- W4288804522 modified "2023-09-26" @default.
- W4288804522 title "Playing a 2D Game Indefinitely using NEAT and Reinforcement Learning" @default.
- W4288804522 doi "https://doi.org/10.48550/arxiv.2207.14140" @default.
- W4288804522 hasPublicationYear "2022" @default.
- W4288804522 type Work @default.
- W4288804522 citedByCount "0" @default.
- W4288804522 crossrefType "posted-content" @default.
- W4288804522 hasAuthorship W4288804522A5002730979 @default.
- W4288804522 hasAuthorship W4288804522A5059744082 @default.
- W4288804522 hasBestOaLocation W42888045221 @default.
- W4288804522 hasConcept C118070581 @default.
- W4288804522 hasConcept C119857082 @default.
- W4288804522 hasConcept C13687954 @default.
- W4288804522 hasConcept C144024400 @default.
- W4288804522 hasConcept C149923435 @default.
- W4288804522 hasConcept C154945302 @default.
- W4288804522 hasConcept C177264268 @default.
- W4288804522 hasConcept C199360897 @default.
- W4288804522 hasConcept C2777735758 @default.
- W4288804522 hasConcept C2908647359 @default.
- W4288804522 hasConcept C34413123 @default.
- W4288804522 hasConcept C41008148 @default.
- W4288804522 hasConcept C50644808 @default.
- W4288804522 hasConcept C8880873 @default.
- W4288804522 hasConcept C90509273 @default.
- W4288804522 hasConcept C97541855 @default.
- W4288804522 hasConceptScore W4288804522C118070581 @default.
- W4288804522 hasConceptScore W4288804522C119857082 @default.
- W4288804522 hasConceptScore W4288804522C13687954 @default.
- W4288804522 hasConceptScore W4288804522C144024400 @default.
- W4288804522 hasConceptScore W4288804522C149923435 @default.
- W4288804522 hasConceptScore W4288804522C154945302 @default.
- W4288804522 hasConceptScore W4288804522C177264268 @default.
- W4288804522 hasConceptScore W4288804522C199360897 @default.
- W4288804522 hasConceptScore W4288804522C2777735758 @default.
- W4288804522 hasConceptScore W4288804522C2908647359 @default.
- W4288804522 hasConceptScore W4288804522C34413123 @default.
- W4288804522 hasConceptScore W4288804522C41008148 @default.
- W4288804522 hasConceptScore W4288804522C50644808 @default.
- W4288804522 hasConceptScore W4288804522C8880873 @default.
- W4288804522 hasConceptScore W4288804522C90509273 @default.
- W4288804522 hasConceptScore W4288804522C97541855 @default.
- W4288804522 hasLocation W42888045221 @default.
- W4288804522 hasOpenAccess W4288804522 @default.
- W4288804522 hasPrimaryLocation W42888045221 @default.
- W4288804522 hasRelatedWork W10763751 @default.
- W4288804522 hasRelatedWork W1279312 @default.
- W4288804522 hasRelatedWork W13211703 @default.
- W4288804522 hasRelatedWork W1323832 @default.
- W4288804522 hasRelatedWork W1353223 @default.
- W4288804522 hasRelatedWork W2356913 @default.
- W4288804522 hasRelatedWork W5547603 @default.
- W4288804522 hasRelatedWork W5663487 @default.
- W4288804522 hasRelatedWork W6801678 @default.
- W4288804522 hasRelatedWork W9860846 @default.
- W4288804522 isParatext "false" @default.
- W4288804522 isRetracted "false" @default.
- W4288804522 workType "article" @default.
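The abstract above says the reinforcement-learning agent remembers the state, the action taken, and the reward received. A minimal tabular Q-learning sketch of that update rule, simplified from the paper's Deep Q-learning Network (the state encoding, reward value, and hyperparameters here are illustrative assumptions, not taken from the work itself):

```python
import random
from collections import defaultdict

# Illustrative hyperparameters; the paper uses a Deep Q-Network, where
# this lookup table would be replaced by a neural network.
ALPHA, GAMMA, EPSILON = 0.1, 0.99, 0.1
ACTIONS = ("flap", "drop")  # the two actions available to the bird

# Q maps (state, action) -> estimated return; unseen pairs default to 0.0.
Q = defaultdict(float)


def choose_action(state):
    """Epsilon-greedy policy: usually exploit the best-known action,
    occasionally explore a random one."""
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])


def update(state, action, reward, next_state):
    """Q-learning backup: Q(s,a) += alpha * (r + gamma * max_a' Q(s',a') - Q(s,a))."""
    best_next = max(Q[(next_state, a)] for a in ACTIONS)
    Q[(state, action)] += ALPHA * (reward + GAMMA * best_next - Q[(state, action)])


# One hypothetical transition: the bird flaps, clears a pipe, and earns +1.
# The state tuples (horizontal gap, vertical offset) are made up for illustration.
update(state=(3, -1), action="flap", reward=1.0, next_state=(2, 0))
```

With all Q-values starting at zero, the single update above moves Q((3, -1), "flap") from 0.0 to 0.1, i.e. alpha times the observed reward.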