Missing the Goal: Realistic Mutation Rates Stop Evolutionary Algorithms
Ann Gauger November 17, 2015 7:29 AM
Winston Ewert of Biologic Institute has just published a new article in the peer-reviewed journal BIO-Complexity ("Overabundant mutations help potentiate evolution: The effect of biologically realistic mutation rates on computer models of evolution").
He and his colleagues have been engaged in a series of critiques of evolutionary algorithms for the last several years. In case you don't know what an evolutionary algorithm is, it's a computer model that seeks to represent evolution in some way, so that mutation and natural selection can be tested for their ability to produce meaningful change.
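To give a concrete (and deliberately simplified) picture, here is a minimal sketch of the kind of loop such models are built around: a population of digital "genomes" is copied with occasional random mutations, and copies that score better on a fitness function are preferentially retained. This toy is for illustration only -- it is not the code of Ev, Avida, or Ewert's paper, and the genome length, population size, mutation rate, and fitness function are assumed values.

import random

# Toy illustration only: evolve a bit string toward an arbitrary all-ones
# target. This is not Ev, Avida, or the code from Ewert's paper; the
# genome length, population size, and mutation rate are assumed values.
GENOME_LEN = 50
POP_SIZE = 100
MUT_RATE = 0.01   # per-site mutation probability, typical of toy genetic algorithms

def fitness(genome):
    # Number of 1s -- a deliberately simple stand-in for "solving a problem."
    return sum(genome)

def mutate(genome):
    # Flip each bit independently with probability MUT_RATE.
    return [bit ^ 1 if random.random() < MUT_RATE else bit for bit in genome]

def evolve(generations=200):
    population = [[0] * GENOME_LEN for _ in range(POP_SIZE)]
    for _ in range(generations):
        # Selection: keep the fitter half, refill with mutated copies of survivors.
        population.sort(key=fitness, reverse=True)
        survivors = population[:POP_SIZE // 2]
        offspring = [mutate(random.choice(survivors)) for _ in range(POP_SIZE - len(survivors))]
        population = survivors + offspring
    return max(fitness(g) for g in population)

print("Best fitness after 200 generations:", evolve())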
The advantage of these computer simulations is that they can be run many, many times, standing in for the vast stretches of time biological evolution would require. The disadvantage is that they do not replicate true biological evolutionary processes; they use algorithms that are merely "analogous." Models such as Ev and Avida are typically claimed to solve complex problems.
Yet Ewert and his colleagues have shown that in every case the necessary information for the models to find their targets was smuggled in, whether intentionally or not, by the respective programmers. You can read some of Ewert and coauthors' critiques here, here, and here.
Here is the abstract of the new paper:
Various existing computer models of evolution attempt to demonstrate the efficacy of Darwinian evolution by solving simple problems. These typically use per-nucleotide (or nearest analogue) mutation rates orders of magnitude higher than biological rates. This paper compares models using typical rates for genetic algorithms with the same models using a realistic mutation rate. It finds that the models with the realistic mutation rates lose the ability to solve the simple problems. This is shown to be the result of the difficulty of evolving mutations that only provide a benefit in combination with other mutations.
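To get a feel for the scale of the gap the abstract describes, it helps to compare the expected number of mutations per genome per generation under a rate typical of genetic algorithms and under a roughly biological per-nucleotide rate. The numbers in the short sketch below are back-of-the-envelope assumptions for illustration, not figures taken from the paper.

# Back-of-the-envelope comparison; all numbers are illustrative assumptions.
genome_length = 1_000      # sites in a small digital genome (assumed)
ga_rate = 1e-2             # per-site rate common in toy genetic algorithms (assumed)
bio_rate = 1e-8            # rough order of magnitude for biological per-nucleotide rates (assumed)

for label, rate in [("typical GA rate", ga_rate), ("realistic biological rate", bio_rate)]:
    print(f"{label}: about {genome_length * rate:g} mutations per genome per generation")
# typical GA rate: about 10 mutations per genome per generation
# realistic biological rate: about 1e-05 mutations per genome per generation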
Ewert shows that even taking the models as they are, when they are run with realistic mutation rates they fail to accomplish their goals. In fact, they advance little beyond their starting positions. He identifies the reason for this failure -- the models can go only as far as a single mutational step will take them. They cannot evolve anything that requires two or more mutations unless mutation rates are set unrealistically high.
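A rough way to see why this matters: if two specific sites must both change before any benefit appears, selection cannot reward the first change on its own, so (setting aside other mechanisms) both changes must show up in the same offspring. With independent per-site mutation, that probability scales as the square of the rate. The sketch below uses assumed rates purely to illustrate the scale; it is not a calculation from Ewert's paper.

# Probability that two specific sites both mutate in the same offspring,
# assuming independent per-site mutation. Illustrative numbers only.
def double_mutation_probability(per_site_rate):
    return per_site_rate ** 2

for label, rate in [("typical GA rate", 1e-2), ("realistic biological rate", 1e-8)]:
    print(f"{label}: P(both sites mutate together) is about {double_mutation_probability(rate):.0e}")
# typical GA rate: P(both sites mutate together) is about 1e-04
# realistic biological rate: P(both sites mutate together) is about 1e-16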
This is a showstopper, since making just about anything new requires more than one mutation. Take a look.