The theory of evolution by natural selection, as described by Darwin and Wallace, is simple and elegant. The organism best fitted for its environment is most likely to survive to pass on its characteristics to the next generation.
The key to understanding how this happens is to look at the role of genes, the sections of DNA within each cell that carry the instructions for how an organism will grow and develop. When a male and a female breed, the offspring's genes are made up of a mix of both parents' genes, a process called 'crossover'. In this way the genetic pot is stirred with each new generation, producing new sets of genes across a population.
There is another crucial factor at work, too.
Every so often a gene will be copied incorrectly, with some of the information changed in subtle and random ways. Most of the time these mutations will have no effect; much of the genome is made up of so-called 'junk' DNA. Sometimes they will be harmful and the creature will not survive. Very rarely, a creature will roll a metaphorical double six and end up with an advantage, such as a slightly longer beak, and the forces of natural selection then come into play.
Computer scientists in the 1960s looked at this mechanism and began to find ways of using it to solve complex problems that were beyond the reach of traditional ‘brute force’ programming techniques.
Possible solutions are encoded as strings of binary bits, and an initial population is created at random. The fitness of each individual in the population is measured, and the strongest are given preference when forming the next generation using a mechanism known as a ‘biased roulette wheel’ – in effect giving them a better chance of being chosen by allotting them a bigger slot. The bit strings of the winning individuals are then combined, and the whole process is repeated for as many generations as necessary.
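This selection-and-crossover loop can be sketched in a few lines of Python. The fitness function here (simply counting the 1-bits in a string, a classic toy problem), along with the population size, string length, and generation count, are illustrative assumptions for the demonstration, not part of the description above:

```python
import random

def roulette_select(population, fitnesses):
    """Biased roulette wheel: fitter individuals occupy a bigger slot,
    so they are more likely to be picked as parents."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]

def crossover(parent_a, parent_b):
    """Single-point crossover: splice the bit strings of two parents."""
    point = random.randint(1, len(parent_a) - 1)
    return parent_a[:point] + parent_b[point:]

def fitness(bits):
    """Illustrative fitness: the number of 1-bits in the individual."""
    return sum(bits)

# Random initial population: 20 individuals of 16 bits each (arbitrary sizes).
population = [[random.randint(0, 1) for _ in range(16)] for _ in range(20)]
for generation in range(50):
    fits = [fitness(ind) for ind in population]
    population = [crossover(roulette_select(population, fits),
                            roulette_select(population, fits))
                  for _ in range(len(population))]
```

Note that this sketch has no mutation step yet; as the text goes on to explain, that omission is exactly what can leave the search stuck.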
The random element is still just as important as in nature, though.
Imagine that you are trying to find the best route to the highest point in a hilly landscape. A reasonable strategy might be always to take the path leading uphill when given a choice. But you could easily end up stranded on a smaller hill, with only downward paths into a valley, and miss a route that leads to the true summit.
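The trap described above can be demonstrated with a one-dimensional toy landscape (the heights below are made up purely for illustration): a greedy walker that only ever steps uphill halts on whichever hill it reaches first.

```python
def hill_climb(landscape, start):
    """Greedy uphill walk: always step to a strictly higher neighbour,
    and stop as soon as no uphill step remains."""
    pos = start
    while True:
        uphill = [n for n in (pos - 1, pos + 1)
                  if 0 <= n < len(landscape) and landscape[n] > landscape[pos]]
        if not uphill:
            return pos  # stranded: only level or downward paths remain
        pos = max(uphill, key=lambda n: landscape[n])

# A smaller hill (height 3) separated by a valley from the peak (height 9).
landscape = [0, 1, 3, 1, 0, 2, 5, 9, 5, 2]
```

Starting near the smaller hill, `hill_climb(landscape, 1)` stops at index 2 (height 3) and never reaches the peak at index 7; only a walker that happens to start on the peak's own slopes finds it.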
The answer is to add a chance of mutation to the genetic algorithm. This can be as simple as flipping a random bit in an individual to alter its genetic makeup. There is a risk of spoiling a good gene, but it is equally possible that something better will emerge. It is all a question of finding the right balance between random chance and survival of the fittest.
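The bit-flip itself is a one-liner. In this sketch each bit is flipped independently with a small probability; the default rate of 0.01 is an illustrative choice, not a prescribed value:

```python
import random

def mutate(bits, rate=0.01):
    """Flip each bit with probability `rate` (an illustrative default).
    XOR with 1 inverts the bit: 0 becomes 1, 1 becomes 0."""
    return [b ^ 1 if random.random() < rate else b for b in bits]
```

Set the rate too high and the search degenerates into random guessing; too low and the population can stagnate on a local optimum. That trade-off is the balance between random chance and survival of the fittest that the text describes.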
A genetic algorithm with the chance of taking a random leap of faith and heading downhill may well end up finding the way to the mountaintop.