The Potential of Genetic Algorithms
Genetic algorithms acquire their cognitive capabilities from the Darwinian idea of survival of the fittest. Each individual has a fitness determined by some predefined criteria. The individuals are evaluated for fitness, then allowed to reproduce and mutate, and then the process is repeated. The initial population is randomly created under certain conditions. The basic genetic algorithm follows this evaluate-reproduce-mutate loop.
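The loop described above can be sketched in Python. This is a minimal, illustrative skeleton, not a definitive implementation; the function and parameter names (make_individual, crossover, mutate, and so on) are assumptions of mine, not taken from the text.

```python
import random

def genetic_algorithm(pop_size, fitness, make_individual, crossover, mutate,
                      generations=100):
    """Minimal genetic algorithm loop: evaluate, reproduce, mutate, repeat.

    All callables here are user-supplied and hypothetical:
    - fitness(ind) returns a positive score,
    - make_individual() returns a random individual,
    - crossover(mom, dad) returns an iterable of offspring,
    - mutate(ind) returns a (possibly) changed copy.
    """
    # The initial population is randomly created.
    population = [make_individual() for _ in range(pop_size)]
    for _ in range(generations):
        # Evaluate each individual's fitness and normalize the fitness vector.
        scores = [fitness(ind) for ind in population]
        total = sum(scores)
        weights = [s / total for s in scores]
        # Select parents in proportion to fitness and let them reproduce.
        next_gen = []
        while len(next_gen) < pop_size:
            mom, dad = random.choices(population, weights=weights, k=2)
            next_gen.extend(crossover(mom, dad))
        # Mutate the offspring, keeping the population size constant.
        population = [mutate(ind) for ind in next_gen[:pop_size]]
    return max(population, key=fitness)
```

With a bit-array individual and a "count the True genes" fitness, the loop reliably evolves toward the all-True chromosome.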
How would you create these individuals?
In a bit array, a random individual would be a random sequence of True and False values, but in a program it would be a series of instructions. The answer to this question, as well as the answers to how to produce offspring and mutated individuals, lies in the structure of the individual. Continuing our biological naming trend, the defining genetic material of the individual is commonly referred to as the chromosome.
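For the bit-array case, the three operations on a chromosome (random creation, producing offspring, and mutation) can be sketched as follows. The single-point crossover scheme is one common choice, not the only one, and the names are mine.

```python
import random

def random_chromosome(length):
    """A random bit-array individual: a sequence of True/False genes."""
    return [random.random() < 0.5 for _ in range(length)]

def single_point_crossover(mom, dad):
    """Produce two offspring by swapping the parents' tails at a random cut."""
    point = random.randrange(1, len(mom))
    return mom[:point] + dad[point:], dad[:point] + mom[point:]

def point_mutate(chromosome, rate=0.001):
    """Flip each gene independently with a small probability."""
    return [(not gene) if random.random() < rate else gene
            for gene in chromosome]
```

Every gene in an offspring comes from one of its parents at the same position, which is what makes the chromosome structure do the work of reproduction.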
How do we choose who gets to reproduce? Sexual Reproduction
Individuals are first evaluated for their fitness. The fitness vector of the current generation is then normalized, and individuals are chosen to reproduce in proportion to their normalized values. This process allows individuals who are more fit to reproduce more often. By selecting fitter individuals at a higher rate than lesser ones, the population tends to become healthier with time. This trend toward increased fitness is what gives genetic algorithms their cognitive capabilities. It is also easy to see how Darwin's ideas affect the design of our algorithm: the fitter individuals survive better than their lesser counterparts, allowing the population to evolve toward a more appropriate solution. Individuals generally reproduce with no population growth; each pair of parents produces two offspring, so the number of individuals is held constant with time.
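This selection scheme is often called roulette-wheel selection, and it is short enough to sketch directly (the function name is mine):

```python
import random

def roulette_select(population, fitnesses, k=2):
    """Fitness-proportionate ("roulette wheel") selection.

    Each individual's chance of being picked equals its share of the
    total fitness, so fitter individuals reproduce more often.
    """
    total = sum(fitnesses)
    weights = [f / total for f in fitnesses]  # normalized fitness vector
    return random.choices(population, weights=weights, k=k)
```

An individual holding 90% of the total fitness will be selected roughly 90% of the time, which is exactly the selection pressure the paragraph above describes.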
Why do we need mutation? Asexual Reproduction
Mutation allows the algorithm to escape from early problems. If a weak individual accidentally reproduces at a higher rate than expected, the population is weakened. Mutation allows us to escape these problems by randomly changing an individual at a small rate (usually less than 1 in 1000). High mutation rates are not usually advantageous; however, a higher mutation rate (about 0.01) might be advantageous in a dynamic environment. Some genetic algorithms decrease the mutation rate with time. This allows the algorithm to move quickly to the vicinity of the absolute maximum and then, as the mutation rate decreases, move increasingly close to it. This scheme can improve the convergence time of the algorithm.
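One simple way to decrease the mutation rate with time is an exponential decay schedule. The half-life form below is an assumption of mine; the text does not specify a particular schedule.

```python
import random

def mutate(chromosome, rate):
    """Flip each gene independently with probability `rate`."""
    return [(not g) if random.random() < rate else g for g in chromosome]

def decayed_rate(initial_rate, generation, half_life=50):
    """Mutation rate that halves every `half_life` generations.

    Early generations mutate aggressively to explore; later generations
    mutate less so the population can settle near the maximum.
    """
    return initial_rate * 0.5 ** (generation / half_life)
```

In the main loop one would call mutate(ind, decayed_rate(0.001, gen)) instead of using a fixed rate.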
What are some drawbacks of genetic algorithms?
Genetic algorithms, despite their strengths, also have their weaknesses. They are not guaranteed to converge to the proper solution to a problem; their success is governed by a logarithmic formula based on the probability of randomly producing a perfect individual. This is not usually a reason for much concern: genetic algorithms still generally converge more often than Neural Networks, or at least with less difficulty.