Given a fixed environment, actors in that environment with a variety of possible attributes or behaviors, and some goal or desired outcome: observe a population of actors, evaluate how well each performs, and spawn new actors whose characteristics are based on those of the best performers.
GAs are a means of tuning the settings of a system; they can be used with neural networks or with any other parameterized system.
The sequence is:
- Initial population: A population of individuals is generated, each with its own set of attribute parameters.
- Observation: The population interacts with the environment and data is collected about each individual's ending state.
- Fitness function: The observations are graded against the desired outcome.
- Selection: The individuals that performed best are selected.
- Crossover: Pairs or groups of selected individuals exchange a random subset of their attributes.
- Mutation: Random changes are introduced to maintain diversity and avoid getting stuck in local optima.
This process is then repeated for many generations, as sketched below.
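Putting these steps together, here is a minimal Python sketch of one way the loop can be wired up. The toy problem (evolving a vector of numbers toward a target), the constants, and the helper names (`fitness`, `select`, `crossover`, `mutate`) are all assumptions chosen for illustration; they are not taken from the example linked below.

```python
import random

# Toy problem (an assumption for illustration only):
# evolve a list of numbers so its values approach a target vector.
GENOME_LENGTH = 10
TARGET = [0.5] * GENOME_LENGTH
POP_SIZE = 50
GENERATIONS = 100
MUTATION_RATE = 0.05

def random_individual():
    # Initial population: attributes start out random.
    return [random.random() for _ in range(GENOME_LENGTH)]

def fitness(individual):
    # Fitness function: grade how close the individual is to the target
    # (higher is better, so the total error is negated).
    return -sum(abs(g - t) for g, t in zip(individual, TARGET))

def select(population, k=2):
    # Selection: tournament selection -- pick k at random, keep the best.
    return max(random.sample(population, k), key=fitness)

def crossover(parent_a, parent_b):
    # Crossover: the child takes a random subset of attributes from each parent.
    return [a if random.random() < 0.5 else b
            for a, b in zip(parent_a, parent_b)]

def mutate(individual):
    # Mutation: occasionally perturb an attribute to maintain diversity.
    return [g + random.gauss(0, 0.1) if random.random() < MUTATION_RATE else g
            for g in individual]

population = [random_individual() for _ in range(POP_SIZE)]
for generation in range(GENERATIONS):
    # Observation and fitness grading happen inside select() via fitness().
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP_SIZE)]

best = max(population, key=fitness)
print("best fitness:", fitness(best))
```

Tournament selection is used here only because it is compact; roulette-wheel or rank-based selection would slot into `select()` just as easily.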
See also:
- https://repl.it/@LoganBrown5/Evolution
  A lovely, easy-to-understand example of genetic training by Logan Brown. James Newton of MassMind replies: a slightly updated version with variable names, and a couple of tweaks. "Dots live to walk some number of steps along their own path in a world with items that kill them, an entrance, and a goal. They evolve."