MIT Press Journal Article on Neural Networks

Abstract

An important question in neuroevolution is how to gain an advantage from evolving neural network topologies along with weights. We present a method, NeuroEvolution of Augmenting Topologies (NEAT), which outperforms the best fixed-topology method on a challenging benchmark reinforcement learning task. We claim that the increased efficiency is due to (1) employing a principled method of crossover of different topologies, (2) protecting structural innovation using speciation, and (3) incrementally growing from minimal structure. We test this claim through a series of ablation studies that demonstrate that each component is necessary to the system as a whole and to each other. What results is significantly faster learning. NEAT is also an important contribution to GAs because it shows how it is possible for evolution to both optimize and complexify solutions simultaneously, offering the possibility of evolving increasingly complex solutions over generations, and strengthening the analogy with biological evolution.
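The "principled method of crossover of different topologies" mentioned above refers to NEAT's use of historical markings (innovation numbers) to align genes from parents with different structures. The following is a minimal illustrative sketch, not the paper's implementation: genomes are simplified to dictionaries mapping innovation numbers to connection weights, and the `crossover` function name and its signature are hypothetical.

```python
import random

def crossover(parent1, parent2, fitness1, fitness2):
    """Sketch of NEAT-style crossover. Genes sharing an innovation number
    are "matching" and inherited randomly from either parent; disjoint and
    excess genes (present in only one parent) are taken from the fitter
    parent. Genomes here are simplified to {innovation_number: weight}."""
    if fitness1 >= fitness2:
        fitter, other = parent1, parent2
    else:
        fitter, other = parent2, parent1
    child = {}
    for innov, weight in fitter.items():
        if innov in other and random.random() < 0.5:
            # Matching gene: inherit from the less fit parent half the time.
            child[innov] = other[innov]
        else:
            # Matching gene from the fitter parent, or a disjoint/excess
            # gene that only the fitter parent carries.
            child[innov] = weight
    return child

# Two genomes sharing innovations 1 and 2; parent_a also has innovation 3.
parent_a = {1: 0.5, 2: -0.3, 3: 0.8}
parent_b = {1: 0.1, 2: 0.9}
child = crossover(parent_a, parent_b, fitness1=1.0, fitness2=0.5)
print(sorted(child))  # child's gene set matches the fitter parent: [1, 2, 3]
```

Because innovation numbers record when each gene first arose, two arbitrarily different topologies can be lined up gene-by-gene without expensive structural matching, which is what makes crossover between distinct topologies tractable.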