Neuroevolution, or neuro-evolution, is a form of artificial intelligence that uses evolutionary algorithms to generate artificial neural networks (ANNs), their parameters, and rules. It is most commonly applied in artificial life, general game playing, and evolutionary robotics. Its main benefit is that it can be applied more widely than supervised learning algorithms, which require a set of correct input-output pairs; neuroevolution requires only a measure of a network's performance at a task. For example, the outcome of a game (i.e., whether one player won or lost) can be measured easily without providing labeled examples of desired strategies. Neuroevolution is commonly used as part of the reinforcement learning paradigm, and can be contrasted with conventional deep learning techniques, which use backpropagation (gradient descent on a neural network) with a fixed topology.
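A minimal sketch of this idea is given below, assuming a small fixed-topology feed-forward network and a toy fitness function standing in for a game outcome; the network sizes, fitness definition, and hyperparameters are illustrative choices, not taken from any particular published method.

```python
# Minimal sketch (not any specific published method): evolving the weights of a
# fixed-topology feed-forward network with a simple genetic algorithm.
# The only training signal is a scalar fitness score -- no labeled input-output pairs.
import numpy as np

rng = np.random.default_rng(0)
N_IN, N_HIDDEN, N_OUT = 4, 8, 2                    # fixed topology (assumed sizes)
GENOME_SIZE = N_IN * N_HIDDEN + N_HIDDEN * N_OUT   # direct encoding: one gene per weight

def forward(genome, x):
    """Decode the flat genome into weight matrices and run the network."""
    w1 = genome[: N_IN * N_HIDDEN].reshape(N_IN, N_HIDDEN)
    w2 = genome[N_IN * N_HIDDEN:].reshape(N_HIDDEN, N_OUT)
    return np.tanh(np.tanh(x @ w1) @ w2)

def fitness(genome):
    """Stand-in for a task score, e.g. the outcome of a game played by the network.
    Here: reward outputs close to +1 on random inputs (purely illustrative)."""
    x = rng.normal(size=(16, N_IN))
    return float(np.mean(forward(genome, x)))

population = [rng.normal(scale=0.5, size=GENOME_SIZE) for _ in range(50)]
for generation in range(100):
    scored = sorted(population, key=fitness, reverse=True)
    parents = scored[:10]                                        # truncation selection
    population = [p + rng.normal(scale=0.1, size=GENOME_SIZE)    # mutate copies of parents
                  for p in parents for _ in range(5)]
print("best fitness:", fitness(sorted(population, key=fitness, reverse=True)[0]))
```

The only feedback the search receives is the scalar returned by `fitness`; no gradient and no labeled example is ever computed.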
A separate distinction can be made between methods that evolve the structure of ANNs in parallel with their parameters (those applying standard evolutionary algorithms) and those that develop structure and parameters separately (through memetic algorithms).
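A hedged sketch of the second, memetic approach follows: an outer evolutionary loop searches over structure, represented here as a binary connection mask (an illustrative choice), while a separate local search tunes the parameters of each candidate structure.

```python
# Hedged sketch of the memetic idea: structure is evolved by the outer loop,
# parameters are refined separately by a local search. The mask representation,
# hill climbing, and toy fitness are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
N = 32                                     # number of potential connections

def toy_fitness(weights):                  # stand-in for a real task score
    return -float(np.sum((weights - 0.5) ** 2))

def tune_parameters(weights, mask, steps=20):
    """Local search on the parameters of a fixed structure (the memetic step)."""
    best, best_f = weights, toy_fitness(weights * mask)
    for _ in range(steps):
        cand = best + rng.normal(scale=0.05, size=N)
        if (f := toy_fitness(cand * mask)) > best_f:
            best, best_f = cand, f
    return best, best_f

population = [(rng.integers(0, 2, N), rng.normal(size=N)) for _ in range(20)]
for _ in range(30):                        # outer loop: evolve the structure
    scored = sorted(((tune_parameters(w, m), m) for m, w in population),
                    key=lambda t: t[0][1], reverse=True)
    elite = scored[:5]
    population = [(np.where(rng.random(N) < 0.05, 1 - m, m), w)   # flip a few connections
                  for (w, _), m in elite for _ in range(4)]
print("best fitness:", scored[0][0][1])
```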
It can be shown that there is a correspondence between neuroevolution and gradient descent.
In direct encoding schemes the genotype maps directly to the phenotype: every neuron and connection in the neural network is specified explicitly in the genotype. In contrast, in indirect encoding schemes the genotype specifies rules or a process by which the network is generated.
Indirect encodings are often used to achieve several aims: modularity and other regularities; compression of the phenotype into a smaller genotype, and hence a smaller search space; and mapping the search space (genotype) to the problem domain.
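The contrast can be illustrated with a small sketch; the flat weight vector (direct) and the coordinate-based generator, loosely reminiscent of a CPPN (indirect), are assumptions chosen for brevity rather than any particular method's encoding.

```python
# Hedged sketch contrasting direct and indirect encoding of the same weight matrix.
import numpy as np

N_IN, N_OUT = 10, 10

# Direct encoding: the genotype explicitly lists every connection weight.
direct_genome = np.random.normal(size=N_IN * N_OUT)         # 100 genes for 100 weights
direct_weights = direct_genome.reshape(N_IN, N_OUT)

# Indirect encoding: the genotype is a small set of parameters for a rule that
# *generates* the weights, here a function of the (i, j) coordinates of each connection.
indirect_genome = np.random.normal(size=4)                   # 4 genes generate all 100 weights
a, b, c, d = indirect_genome
i, j = np.meshgrid(np.arange(N_IN), np.arange(N_OUT), indexing="ij")
indirect_weights = a * np.sin(b * i) + c * np.cos(d * j)     # regular, compressible pattern

print(direct_weights.shape, indirect_weights.shape)          # both (10, 10)
```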
Stanley and Miikkulainen propose a taxonomy for embryogenic systems that is intended to reflect their underlying properties. The taxonomy identifies five continuous dimensions along which any embryogenic system can be placed: cell (neuron) fate, targeting, heterochrony, canalization, and complexification.
Examples of neuroevolution methods, listing the encoding used, the evolutionary algorithm, and the aspects of the network that are evolved:

| Method | Encoding | Evolutionary algorithm | Aspects evolved |
|---|---|---|---|
| Neuro-genetic evolution by E. Ronald, 1994 | Direct | Genetic algorithm | Network weights |
| Cellular Encoding (CE) by F. Gruau, 1994 | Indirect, embryogenic (grammar tree) | Genetic programming | Structure and parameters (simultaneous, complexification) |
| GNARL by Angeline et al., 1994 | Direct | Evolutionary programming | Structure and parameters (simultaneous, complexification) |
| EPNet by Yao and Liu, 1997 | Direct | Evolutionary programming (combined with backpropagation and simulated annealing) | Structure and parameters (mixed, complexification and simplification) |
| NeuroEvolution of Augmenting Topologies (NEAT) by Stanley and Miikkulainen, 2002 | Direct | Genetic algorithm; tracks genes with historical markings to allow crossover between different topologies and protects innovation via speciation | Structure and parameters |
| Hypercube-based NeuroEvolution of Augmenting Topologies (HyperNEAT) by Stanley, D'Ambrosio and Gauci, 2008 | Indirect, non-embryogenic (spatial patterns generated by a compositional pattern-producing network (CPPN) within a hypercube are interpreted as connectivity patterns in a lower-dimensional space; see the sketch below the table) | Genetic algorithm; the NEAT algorithm (above) is used to evolve the CPPN | Parameters; structure fixed (functionally fully connected) |
| Evolvable Substrate HyperNEAT (ES-HyperNEAT) by Risi and Stanley, 2012 | Indirect, non-embryogenic (same CPPN-within-a-hypercube encoding as HyperNEAT above) | Genetic algorithm; the NEAT algorithm (above) is used to evolve the CPPN | Parameters and network structure |
| Evolutionary Acquisition of Neural Topologies (EANT/EANT2) by Kassahun and Sommer, 2005 / Siebel and Sommer, 2007 | Direct and indirect, potentially embryogenic (Common Genetic Encoding) | Evolutionary programming / evolution strategies | Structure and parameters (separately, complexification) |
| Interactively Constrained Neuro-Evolution (ICONE) by Rempis, 2012 | Direct; includes constraint masks to restrict the search to specific topology/parameter manifolds | Evolutionary algorithm; uses constraint masks to drastically reduce the search space by exploiting domain knowledge | Structure and parameters (separately, complexification, interactive) |
| Deus Ex Neural Network (DXNN) by Gene Sher, 2012, ISBN 9781461444626 | Direct/indirect; includes constraints and local tuning, and allows evolution to integrate new sensors and actuators | Memetic algorithm; evolves network structure and parameters on different time scales | Structure and parameters (separately, complexification, interactive) |
| Spectrum-diverse Unified Neuroevolution Architecture (SUNA) by Danilo Vasconcellos Vargas and Junichi Murata | Direct; introduces the Unified Neural Representation, a representation integrating most of the neural network features from the literature | Genetic algorithm with a diversity-preserving mechanism called spectrum diversity that scales well with chromosome size, is problem independent, and focuses on obtaining diversity of high-level behaviours/approaches; to achieve this, the concept of a chromosome spectrum is introduced and used together with a novelty map population | Structure and parameters (mixed, complexification and simplification) |
| Modular Agent-Based Evolver (MABE) by Clifford Bohm, Arend Hintze, and others | Direct or indirect encoding of Markov brains, neural networks, genetic programming, and other arbitrarily customizable controllers | Provides evolutionary algorithms and genetic programming algorithms, and allows customized algorithms along with the specification of arbitrary constraints | Evolvable aspects include the neural model, and allow for the evolution of morphology and sexual selection, among others |
| Covariance Matrix Adaptation with Hypervolume Sorted Adaptive Grid Algorithm (CMA-HAGA) by Shahin Rostami et al., ISBN 9781467389884 | Direct; includes an atavism feature which enables traits to disappear and reappear at different generations | Multi-objective evolution strategy with preference articulation (computational steering) | Structure, weights, and biases |
| Evolutionary pressure-driven GACNN by Di Biasi et al. (Springer Nature Switzerland, ISBN 9783031376603) | Direct | Genetic algorithm / single-objective evolution strategy, specialized for convolutional neural networks | Structure |
| Fast-DENSER by Assunção et al. | Indirect | Grammatical evolution (Dynamic Structured Grammar Evolution) | Structure and the optimiser used for training |
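Several of the indirect methods above (HyperNEAT, ES-HyperNEAT) obtain connection weights by querying a CPPN with the coordinates of pairs of neurons laid out on a geometric substrate. The following is a loose, non-authoritative sketch of that idea: in the actual methods the CPPN is itself an evolved network (via NEAT), whereas here it is a fixed hand-written function, and the substrate layout is an illustrative assumption.

```python
# Loose illustration of a CPPN-style indirect encoding: connection weights are read
# off a function of the coordinates of the source and target neurons on a substrate.
import numpy as np

def cppn(x1, y1, x2, y2, genome):
    """Stand-in CPPN: maps the coordinates of a (source, target) pair to a weight."""
    a, b, c = genome
    return a * np.sin(b * (x1 - x2)) + c * np.exp(-(y1 - y2) ** 2)

# Substrate: input neurons on the line y=0, output neurons on the line y=1.
inputs  = [(x, 0.0) for x in np.linspace(-1, 1, 5)]
outputs = [(x, 1.0) for x in np.linspace(-1, 1, 3)]
genome = np.array([1.0, 3.0, 0.5])           # in HyperNEAT-like methods this would be evolved

weights = np.array([[cppn(x1, y1, x2, y2, genome) for (x2, y2) in outputs]
                    for (x1, y1) in inputs])
print(weights.shape)                          # (5, 3): one weight per substrate connection
```

Because the weights are generated from neuron geometry rather than listed gene-by-gene, a small genome can describe a large, regular connectivity pattern.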