Thursday 4 February 2016

Another failed Darwinian prediction V

Protein evolution:

Protein coding genes make up only a small fraction of the genome in higher organisms, but their protein products are crucial to the operation of the cell. They are the workers behind just about every task in the cell, including digesting food, synthesizing chemicals, providing structural support, converting energy, reproducing the cell, and making new proteins. And like a finely tuned machine, proteins do their work very well. Proteins are ubiquitous in all of life and must date back to the very early stages of evolution. So evolution predicts that proteins evolved when life first appeared, or not long after. But despite enormous research efforts, the science clearly shows that such protein evolution is astronomically unlikely.

One reason the evolution of proteins is so difficult is that most proteins are extremely specific designs in an otherwise rugged fitness landscape. This means it is difficult for natural selection to guide mutations toward the needed proteins. In fact, four different studies, done by different groups and using different methods, all report that roughly 10^70 evolutionary experiments would be needed to get close enough to a workable protein before natural selection could take over to refine the protein design. For instance, one study concluded that 10^63 attempts would be required for a relatively short protein. (Reidhaar-Olson) And a similar result (10^65 attempts required) was obtained by comparing protein sequences. (Yockey) Another study found that from 10^64 to 10^77 attempts are required (Axe), and another study concluded that 10^70 attempts would be required. (Hayashi) In that case the protein was only a part of a larger protein which was otherwise intact, thus making for an easier search. Furthermore, these estimates are optimistic because the experiments searched only for single-function proteins, whereas real proteins perform many functions.

This conservative estimate of 10^70 attempts required to evolve a simple protein is astronomically larger than the number of attempts that are feasible. And explanations of how evolution could achieve such a large number of searches, or somehow obviate this requirement, require the preexistence of proteins and so are circular. For example, one paper estimated that evolution could have made 10^43 such attempts. But the study assumed the entire history of the Earth is available, rather than the limited time window that evolution actually would have had. Even more importantly, it assumed the preexistence of a large population of bacteria (it assumed the Earth was completely covered with bacteria). And of course, bacteria are full of proteins. Clearly such bacteria would not exist before the first proteins evolved. (Dryden) Even with these helpful and unrealistic assumptions, the result was twenty-seven orders of magnitude short of the requirement.
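The gap described above is simple log-arithmetic. A minimal illustrative check, using only the two figures quoted in the text:

```python
import math

# Figures quoted in the text:
attempts_required = 1e70   # conservative estimate to find a workable protein
attempts_feasible = 1e43   # Dryden et al.'s generous estimate of available trials

# The gap between what is needed and what is available, in orders of magnitude
shortfall = math.log10(attempts_required) - math.log10(attempts_feasible)
print(f"Shortfall: {shortfall:.0f} orders of magnitude")  # Shortfall: 27 orders of magnitude
```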

Given these several significant problems, the chances of evolution finding proteins from a random start are, as one evolutionist explained, “highly unlikely.” (Tautz) Or as another evolutionist put it, “Although the origin of the first, primordial genes may ultimately be traced back to some precursors in the so-called ‘RNA world’ billions of years ago, their origins remain enigmatic.” (Kaessmann)

References

Axe, D. 2004. “Estimating the prevalence of protein sequences adopting functional enzyme folds.” J Molecular Biology 341:1295-1315.

Dryden, David, Andrew Thomson, John White. 2008. “How much of protein sequence space has been explored by life on Earth?.” J. Royal Society Interface 5:953-956.

Hayashi, Y., T. Aita, H. Toyota, Y. Husimi, I. Urabe, T. Yomo. 2006. “Experimental Rugged Fitness Landscape in Protein Sequence Space.” PLoS ONE 1:e96.

Kaessmann, H. 2010. “Origins, evolution, and phenotypic impact of new genes.” Genome Research 20:1313-1326.

Reidhaar-Olson J., R. Sauer. 1990. “Functionally acceptable substitutions in two alpha-helical regions of lambda repressor.” Proteins 7:306-316.

Tautz, Diethard, Tomislav Domazet-Lošo. 2011. “The evolutionary origin of orphan genes.” Nature Reviews Genetics 12:692-702.

Yockey, Hubert. 1977. “A calculation of the probability of spontaneous biogenesis by information theory.” J Theoretical Biology 67:377-398.

A Darwin of the gaps fallacy?

Francis Collins' Junk DNA Arguments Pushed Into Increasingly Small Gaps in Scientific Knowledge
Casey Luskin May 2, 2011 8:36 AM 

Recently I wrote an article explaining that both atheistic and theistic evolutionists have relied heavily on "junk DNA" -- specifically pseudogenes -- to argue against intelligent design (ID). In his 2006 book The Language of God, leading theistic evolutionist Francis Collins made such an argument, claiming that caspase-12 is a functionless pseudogene and asking, "why would God have gone to the trouble of inserting such a nonfunctional gene in this precise location?" (p. 139) Logan Gage and I responded, citing research which suggested this purported "pseudogene" is functional in many humans. But Collins went much further in The Language of God. He claimed that huge portions of our genome are repetitive junk: "Mammalian genomes are littered with such AREs [ancient repetitive elements]," wrote Collins, "with roughly 45 percent of the human genome made up of such genetic flotsam and jetsam." (p. 136) Collins frames his argument in theological terms, writing: "Unless one is willing to take the position that God has placed these decapitated AREs in these precise positions to confuse and mislead us, the conclusion of a common ancestor for humans and mice is virtually inescapable." (pp. 136-137)

Such arguments are dangerous for those who make them, because they are based upon our lack of knowledge of these types of DNA. They amount to "evolution of the gaps" reasoning--because as we learn more and more about biology, we're discovering more and more evidence of function for so-called "junk" DNA. The argument that much DNA is functionless junk, and thereby evidence for evolution, is relegated to gaps in our knowledge--gaps which are increasingly shrinking over time as science progresses.
But what if such DNA has function? If such DNA isn't functionless junk, this may be another instance where, in Collins' own words, a designer could have "used successful design principles over and over again." (p. 111) In fact, as explained in this rebuttal to Collins, multiple functions have been discovered for repetitive DNA:

In 2002, evolutionary biologist Richard Sternberg surveyed the literature and found extensive evidence for function in AREs. Sternberg's article concluded that "the selfish DNA narrative and allied frameworks must join the other 'icons' of neo-Darwinian evolutionary theory that, despite their variance with empirical evidence, nevertheless persist in the literature." As listed in Sternberg's paper, known genomic/epigenetic roles of REs include:
satellite repeats forming higher-order nuclear structures;
satellite repeats forming centromeres;
satellite repeats and other REs involved in chromatin condensation;
telomeric tandem repeats and LINE elements;
subtelomeric nuclear positioning/chromatin boundary elements;
non-TE interspersed chromatin boundary elements;
short, interspersed nuclear elements or SINEs as nucleation centers for methylation;
SINEs as chromatin boundary/insulator elements;
SINEs involved in cell proliferation;
SINEs involved in cellular stress responses;
SINEs involved in translation (may be connected to stress response);
SINEs involved in binding cohesin to chromosomes; and
LINEs involved in DNA repair.
Other genetic research continues to uncover functions for allegedly functionless types of repetitive DNA, including SINE, LINE, and ALU elements. Sternberg, along with leading geneticist James A. Shapiro, concludes elsewhere that "one day, we will think of what used to be called 'junk DNA' as a critical component of truly 'expert' cellular control regimes."

(Casey Luskin and Logan Gage, "A Reply to Francis Collins's Darwinian Arguments for Common Ancestry of Apes and Humans," in Intelligent Design 101: Leading Experts Explain the Key Issues (Kregel, 2008) (internal citations removed).)

Collins wrote The Language of God in 2006, and it makes a number of other "junk" DNA-type arguments for common ancestry between humans and other species, many of which are rebutted here, showing much evidence of function for such so-called "junk" DNA. Suffice to say, after a closer analysis many of Collins' "junk" DNA arguments for common ancestry turned out to be highly suspect, or simply inaccurate.

Collins Retreats on Junk DNA
Since writing The Language of God, Dr. Collins seems to have realized that it's potentially dangerous and inaccurate to argue that much non-coding DNA is junk. As Jonathan M. explains here, Collins takes a much softer tone toward junk DNA in his 2010 book The Language of Life:

The discoveries of the past decade, little known to most of the public, have completely overturned much of what used to be taught in high school biology. If you thought the DNA molecule comprised thousands of genes but far more "junk DNA", think again.
(Francis Collins, The Language of Life: DNA and the Revolution in Personalized Medicine, pp. 5-6 (Harper, 2010).)

That sure sounds a lot different from his "45 percent of the human genome [is] made up of such genetic flotsam and jetsam" comment in 2006 in The Language of God. In fact, in his 2010 book, Collins seems to be strongly deemphasizing the amount of "junk" in the genome. Collins goes on to elaborate on just how much DNA isn't junk--disavowing the notion that even "gene deserts" are junk:

The exons and introns of protein-coding genes add up together to about 30 percent of the genome. Of that 30 percent, 1.5 percent are coding exons and 28.5 percent are removable introns. What about the rest? It appears there are also long "spacer" segments of DNA that lie between genes and that don't code for protein. In some instances, these regions extend across hundreds of thousands or even millions of base pairs, in which case they are referred to rather dismissively as "gene deserts." These regions are not just filler, however. They contain many of the signals that are needed to instruct a nearby gene about whether it should be on or off at a given developmental time in a given tissue. Furthermore, we are learning that there may be thousands of genes hanging out in these so-called deserts that don't code for protein at all. They are copied into RNA, but those RNA molecules are never translated--instead, they serve some other important functions.
(Francis Collins, The Language of Life: DNA and the Revolution in Personalized Medicine, p. 9 (Harper, 2010).)
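The percentages Collins quotes can be tallied directly; a trivial check of the quoted breakdown:

```python
# Genome fractions as quoted from Collins (The Language of Life, p. 9)
coding_exons = 0.015     # ~1.5% of the genome is protein-coding exons
introns = 0.285          # ~28.5% is removable introns
gene_fraction = coding_exons + introns   # "about 30 percent"

intergenic = 1.0 - gene_fraction         # spacer DNA and "gene deserts"
print(f"Genes: {gene_fraction:.0%}, intergenic: {intergenic:.0%}")  # Genes: 30%, intergenic: 70%
```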

But, it turns out Collins' Darwinian viewpoint won't allow him to completely divorce himself from junk-DNA thinking, as he makes an ambiguous statement that some unspecified portion of repetitive DNA remains junk:

Our genome is littered with repetitive sequences that have been inserted during a series of ancient assaults by various families of DNA parasites. Once they gain access to the genome, these "jumping genes" are capable of making copies of themselves, and then inserting those copies randomly throughout the genome. Roughly 50 percent of the human genome has had this history. However, in a nice demonstration of how natural selection can operate on all sorts of opportunities, a small fraction of these jumping genes have actually landed in a place where they have provided some advantage to the host. Thus, even some DNA we used to call "junk" is useful.
(Francis Collins, The Language of Life: DNA and the Revolution in Personalized Medicine, pp. 9-10 (Harper, 2010).)

If, according to Collins, natural selection explains function for junk DNA, then perhaps this is a good example of how Darwinian evolution both predicts we'll find junk and predicts that we won't find junk. Not exactly a very helpful theory. Moreover, as we saw above, much of the so-called repetitive "junk" is not functionless after all.

In the end, it's clear that Collins' 2010 book is a significant retreat on the claim that junk DNA dominates our genome. He even admits that noncoding DNA is "capable of carrying out a host of important functions":

It turns out that only about 1.5 percent of the human genome is involved in coding for protein. But that doesn't mean the rest is "junk DNA." A number of exciting new discoveries about the human genome should remind us not to become complacent in our understanding of this marvelous instruction book. For instance, it has recently become clear that there is a whole family of RNA molecules that do not code for protein. These so-called non-coding RNAs are capable of carrying out a host of important functions, including modifying the efficiency by which other RNAs are translated. In addition, our understanding of how genes are regulated is undergoing dramatic revision, as the signals embedded in the DNA molecule and the proteins that bind to them are rapidly being elucidated. The complexity of this network of regulatory information is truly mind-blowing, and has given rise to a whole new branch of biomedical research, sometimes referred to as "systems biology."
(Francis Collins, The Language of Life: DNA and the Revolution in Personalized Medicine, p. 293 (Harper, 2010).)

Collins Renews the Junk DNA Argument for Evolution
But Collins' 2010 book The Language of Life is not his most recent book. His most recent book is The Language of Science and Faith, co-written with BioLogos vice president Karl Giberson, and it once again focuses on making junk-DNA arguments for evolution.

Collins and Giberson look at the vitamin C GULO 'pseudogene' found in humans and other primates (as well as some nonprimate species), and they contend that it is "not remotely plausible" that "God inserted a piece of broken DNA into our genomes." They conclude that this "has established conclusively that the data fits a model of evolution from a common ancestor," but has "ruled out" common design. (Karl Giberson and Francis Collins, The Language of Science and Faith, p. 43 (InterVarsity Press, 2011).)

Giberson also cited this same pseudogene in a recent op-ed on CNN.com where he argued:

In particular, humans share an unfortunate "broken gene" with many other primates, including chimpanzees, orangutans, and macaques. ... How can different species have identical broken genes? The only reasonable explanation is that they inherited it from a common ancestor.
So it seems that Collins and Giberson still want to make junk-DNA arguments for evolution, but the discovery of function for so much so-called "junk" DNA in recent years has reduced them to citing a single example of a purported "broken" pseudogene in humans and other primates. Their gap in which to argue that noncoding DNA is "broken" has shrunk dramatically.

But as we've seen in recent posts, such as "Et tu, Pseudogenes? Another Type of 'Junk' DNA Betrays Darwinian Predictions," "Is 'Pseudogene' a Misnomer?," or "'Junk' RNA Found to Encode Peptides That Regulate Fruit Fly Development," the notion that pseudogenes are merely "broken DNA" is coming under heavy fire from new scientific discoveries. Jonathan Wells has a whole chapter discussing functions for pseudogenes in his new book The Myth of Junk DNA. It seems that the gap is becoming so small that not even pseudogenes are a safe argument for "junk" DNA anymore.

Conclusion
Francis Collins and Karl Giberson are choosing to rely quite heavily on the argument that pseudogenes are junk, "broken DNA." In fact, this singular pseudogene is their centerpiece evidence for common descent and macroevolution in their new book, The Language of Science and Faith. Giberson is so confident that this argument is right that in his recent CNN.com op-ed he's betting "Jesus would believe in evolution and so should you." But if history is to be our guide, then it would seem that this is a dangerous argument to make: The more we are learning about biology, genetics, and biochemistry, the more we are finding function for non-coding DNA, including pseudogenes.

Time will tell, but it's revealing that Giberson and Collins are reduced to citing smaller and smaller gaps in our knowledge as regards "junk" DNA to argue for evolution. Professor Giberson may boast that "Jesus would believe in evolution and so should you" -- but perhaps he should worry more about what direction the science is pointing rather than making religious arguments for evolution.

Is Macroevolution simply Microevolution squared?

Karl Giberson and Francis Collins Commit Berra's Blunder While Arguing for Macroevolution
Casey Luskin May 19, 2011 10:15 AM

In their new book The Language of Science and Faith, Karl Giberson and Francis Collins argue that "the distinction between micro and macro evolution is arbitrary." (p. 45, emphases in original) As a result, they assert that "macroevolution is simply microevolution writ large: add up enough small changes and we get a large change." (p. 45) What's most surprising is not that they make this claim (which is common in evolutionary writings), but the examples--or lack thereof--they give to back it up.
Their main illustration for macroevolution is the evolution of the automobile. "[N]obody could have imagined how Henry Ford's primitive Model T automobile could have turned into Toyota's Prius hybrid," they write, because "it would have been impossible for the engineers at Ford to develop all the remarkable engineering necessary to turn a Model T into a Prius in one year. The electronic enhancements alone took decades to invent and develop." (pp. 45-46)

Giberson and Collins have of course just committed what Phillip Johnson calls "Berra's blunder." Here's a snippet of Professor Berra's original blunder:

[I]f you compare a 1953 and a 1954 Corvette, side by side, then a 1954 and a 1955 model, and so on, the descent with modification is overwhelmingly obvious. ... the evidence is so solid and comprehensive that it cannot be denied by reasonable people.

(Tim Berra, Evolution and the Myth of Creationism, pp. 117-119 (Stanford University Press, 1990).)

If anything here is "overwhelmingly obvious," it's that Corvettes did not evolve by Darwinian mechanisms, but were intelligently designed. Phillip Johnson elaborates on Berra's blunder:

Of course, every one of those Corvettes was designed by engineers. The Corvette sequence - like the sequence of Beethoven's symphonies to the opinions of the United States Supreme Court - does not illustrate naturalistic evolution at all. It illustrates how intelligent designers will typically achieve their purposes by adding variations to a basic design plan. Above all, such sequences have no tendency whatever to support the claim that there is no need for a Creator, since blind natural forces can do the creating. On the contrary, they show that what biologists present as proof of "evolution" or "common ancestry" is just as likely to be evidence of common design.

(Phillip Johnson, Defeating Darwinism by Opening Minds, p. 63 (InterVarsity Press, 1997).)

The same goes for Prius hybrids: all of the innovations that led engineers to develop hybrids from Ford Model T's were intelligently designed, and did not arise by random mutations.

Direct Evidence Fails, So They Use Indirect Evidence
So the first argument used by Giberson and Collins to show that macroevolution is simply microevolution "writ large" seems to have failed. In fact, to their credit they acknowledge that "[w]e don't observe such macroevolutionary changes because they take such a long time" and therefore must use our "imaginations" to understand macroevolution. (pp. 46-47) Thus, they seek to provide indirect evidence of macroevolution, and in the next section ask, "Is there Proof of Macroevolution?"

The answer they provide, of course, is 'yes.' But guess what their evidence is? They fall back to again relying on pseudogenes: "The example of the broken vitamin C gene that we looked at earlier is a case in point" (p. 49), they write. As we already saw here and here, this is an incredibly weak argument, especially given that we're continually finding more and more functions for pseudogenes.

Big Claims, Small Evidence
Giberson and Collins claim that "[m]ountains of data arrive on a daily basis ... providing compelling evidence for macroevolution," (p. 49) but aside from a weak and assumption-based argument based upon a single pseudogene, Giberson and Collins do not specify exactly what that evidence is.

In a previous article we saw that Giberson and Collins essentially expected readers to take eye-evolution on faith. Now it seems that they also want their readers to take it on faith that "the distinction between micro and macro evolution is arbitrary," because they provide no empirical evidence to back up this claim other than a highly suspect and dangerous argument that a particular pseudogene is functionless "broken DNA."

Having failed to provide empirical data backing macroevolution, Giberson and Collins end their chapter on the evidence for evolution by claiming: "All that evolution requires is enough generations to accumulate the sort of tiny differences that separate offspring from their parents and almost any transformation can be achieved." (p. 52)

Presto chango--evolution sounds so easy! But according to Darwin, evolution requires more than just "enough generations." Darwin acknowledged that evolution also requires a continuous evolutionary pathway:

If it could be demonstrated that any complex organ existed which could not possibly have been formed by numerous, successive, slight modifications, my theory would absolutely break down.

And it's here that macroevolution hits a wall, for there are many complex structures which cannot be built over "numerous, successive, slight modifications." Giberson and Collins promise to give "Straight Answers to Genuine Questions," but that's not the sort of analysis I am finding in their book The Language of Science and Faith.

Yet more on the supposed common ancestry of man and ape

More Points on ERVs
Jonathan M. May 28, 2011 10:00 AM

In my previous two articles (here and here), I explored some of the background information concerning the integration of retroviral elements into primate genomes and the various arguments for common descent which are based on them. I explored, in some detail, the evidence for common descent based on the shared placement of retroviral sequences. In this final article, I will discuss the two remaining points which are raised in the popular-level article which I have been examining.

Shared Mutations?
Regarding shared "mistakes" between primate genomes, this argument again assumes that mutations are random and are unlikely to occur convergently. Cuevas et al. (2002), however, have documented, in retroviruses, the occurrence of molecular convergences at 12 variable sites in independent lineages. Some of these convergent mutations even took place in intergenic regions (changes in which are normally thought to be selectively neutral) and also at synonymous sites. The authors also note that this observation is fairly widespread among HIV-1 virus clones in humans and in SHIV strains isolated from pig-tailed macaques, rhesus monkeys, and humans.

As the authors note,
One of the most amazing features illustrated in Figure 1 is the large amount of evolutionary convergences observed among independent lineages. Twelve of the variable sites were shared by different lineages. More surprisingly, convergences also occurred within synonymous sites and intergenic regions. Evolutionary convergences during the adaptation of viral lineages under identical artificial environmental conditions have been described previously (Bull et al. 1997; Wichman et al. 1999; Fares et al. 2001). However, this phenomenon is observed not only in the laboratory. It is also a relatively widespread observation among human immunodeficiency virus (HIV)-1 clones isolated from patients treated with different antiviral drugs; parallel changes are frequent, often following a common order of appearance (Larder et al. 1991; Boucher et al. 1992; Kellam et al. 1994; Condra et al. 1996; Martinez-Picado et al. 2000). Subsequent substitutions may confer increasing levels of drug resistance or, alternatively, may compensate for deleterious pleiotropic effects of earlier mutations (Molla et al. 1996; Martinez-Picado et al. 1999; Nijhuis et al. 1999). Also, molecular convergences have been observed between chimeric simian-human immunodeficiency viruses (strain SHIV-vpu+) isolated from pig-tailed macaques, rhesus monkeys, and humans after either chronic infections or rapid virus passage (Hofmann-Lehmann et al. 2002).
I could cite several other similar studies. For another case example, see Bull et al. (1997).

LTRs And Phylogeny
The other argument offered by the article pertains to primate phylogenies in relation to long terminal repeat (LTR) sequences. Because LTRs are identical at the time of integration, it is argued, if the 5' and 3' LTR sequences are very different with respect to one another, this should correspond with an older insertion. The problem is that the pattern is nothing like as neat and tidy as many Darwinists would like us to think.

One of the main difficulties associated with trying to construct phylogenies based on the divergence between the 5' and 3' LTRs is that it is predicated on the critical supposition that the 5' and 3' LTR sequences are acquiring mutations independently of one another. However, the phenomenon of cross-LTR gene conversion can result in a much smaller degree of divergence, thereby rendering this method for inferring time since integration suspect.

The authors note,
We found that gene conversion plays a significant role in the molecular evolution of LTRs in primates and rodents, but the extent is quite different. In rodents, most LTRs are subject to extensive gene conversion that reduces the divergence, so that the divergence-based method results in a serious underestimation of the insertion time. In primates, this effect is limited to a small proportion of LTRs. The most likely explanation of the difference involves the minimum length of the interacting sequence (minimal efficient processing segment [MEPS]) for interlocus gene conversion. An empirical estimate of MEPS in human is 300-500 bp, which exceeds the length of most of the analyzed LTRs. In contrast, MEPS for mice should be much smaller. Thus, MEPS can be an important factor to determine the susceptibility of LTRs to gene conversion, although there are many other factors involved. It is concluded that the divergence method to estimate the insertion time should be applied with special caution because at least some LTRs undergo gene conversion. [emphasis added]
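The divergence-dating logic under discussion can be sketched in a few lines. The substitution rate and divergence figures below are illustrative assumptions, not values from the cited study:

```python
def ltr_insertion_age(divergence, subst_rate):
    """Estimate years since integration from 5'/3' LTR divergence.

    Both LTRs are identical at integration and are assumed to accumulate
    mutations independently, so divergence d = 2 * rate * t, i.e.
    t = d / (2 * rate). Cross-LTR gene conversion homogenizes the two
    LTRs, shrinking d, so this estimate understates the true age.
    """
    return divergence / (2.0 * subst_rate)

# Illustrative numbers: 2% divergence at ~2.2e-9 substitutions/site/year
age = ltr_insertion_age(0.02, 2.2e-9)
print(f"Apparent insertion age: {age / 1e6:.1f} million years")  # ~4.5 million years
```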
Summary
To summarise, we have observed over the last three blog posts that the case for primate common ancestry is not nearly as cut and dried as many evolutionary biologists would like to make out. While one can find a handful of ERVs which occupy the same loci, further inspection reveals that they are often independent events.


In the absence of a feasible naturalistic mechanism to account for how evolution from a common ancestor could have occurred, how can we be so sure that it did occur? In such a case, one ought reasonably to expect some quite spectacular evidence for common ancestry. Unfortunately for Darwinists, however, the evidence for common ancestry is paper-thin.

On the supposed common ancestry of man and ape.

Do Shared ERVs Support Common Ancestry?
Jonathan M. May 26, 2011 10:52 AM

In my previous article, I discussed the background of one of the most commonly made arguments for primate common ancestry. In this article, I want to examine the first of the three layers of evidence offered by a popular-level article written about this subject.

The author of the article under discussion tells us,
When we examine the collective genome of Homo sapiens, we find that a portion of it consists of ERVs (IHGS Consortium, 2001). We also find that humans share most of them with Chimpanzees, as well as the other members of Hominidae (great apes), the members of Hylobatidae (gibbons), and even the members of Cercopitheciodae (old world monkeys) (Kurdyukov et al., 2001; Lebedev et al., 2000; Medstrand and Mager, 1998; Anderssen et al., 1997; Steinhuber et al., 1995). Since humans don't and/or can't regularly procreate and have fertile offspring with members of these species, and thus don't make sizable contributions to their gene pools, and vice versa, their inheritance cannot have resulted from unions of modern species. As previously mentioned, parallel integration is ruled out by the highly random target selection of integrase. And even if it was far more target-specific than observed, it would require so many simultaneous insertion and endogenizations that the evolutionary model would still be tremendously more parsimonious. This leaves only one way an ERV could have been inherited: via sexual reproduction of organisms of a species that later diverged into the one the organisms that share the ERV belong to, i.e. an ancestral species--simply put, humans and the other primates must share common ancestry.
Just how target-specific are these ERV integrations? In the portion of the article headed "common creationist responses," we are told that,
...while proviral insertion is not purely random, it is also not locus specific; due to the way it directly attacks the 5' and 3' phosphodiester bonds, with no need to ligate (Skinner et al., 2001). So relative to pure randomness, insertion is non-random, but relative to locus specificity, insertion is highly random.
Really?

Let's take a few moments to do what any good student of biology would do -- and briefly survey some of the literature.

In one relevant study, Barbulescu et al. (2001) report that,
We identified a human endogenous retrovirus K (HERV-K) provirus that is present at the orthologous position in the gorilla and chimpanzee genomes, but not in the human genome. Humans contain an intact preintegration site at this locus. [emphasis added]
It seems that the most plausible explanation for this is an independent insert in the gorilla and chimpanzee lineages. Notice that the intact preintegration site at the pertinent locus in humans precludes the possibility of the HERV-K provirus having been inserted into the genome of the common ancestor of humans, chimpanzees and gorillas, and subsequently lost from the human genome by processes of genetic recombination. Though there are other possible candidate hypotheses for this observation (such as incomplete lineage sorting), in the context of other indications of locus-specific site preference, this data is, at the very least, suggestive that these inserts may in fact be independent events.

But there's more.

Another study, by Sverdlov (1998) reports,

But although this concept of retrovirus selectivity is currently prevailing, practically all genomic regions were reported to be used as primary integration targets, however, with different preferences. There were identified 'hot spots' containing integration sites used up to 280 times more frequently than predicted mathematically. [emphasis added]
In addition, Yohn et al. (2005) report that,

Horizontal transmissions between species have been proposed, but little evidence exists for such events in the human/great ape lineage of evolution. Based on analysis of finished BAC chimpanzee genome sequence, we characterize a retroviral element (Pan troglodytes endogenous retrovirus 1 [PTERV1]) that has become integrated in the germline of African great ape and Old World monkey species but is absent from humans and Asian ape genomes.
I could continue in a similar vein for some time. Other classes of retroelement also show fairly specific target-site preferences. For example, Levy et al. (2009) report that Alu retroelements routinely preferentially insert into certain classes of already-present transposable elements, and do so with a specific orientation and at specific locations within the mobile element sequence. Moreover, a study published in Science by Li et al. (2009) found that, in the water flea genome, introns routinely insert into the same loci, leading the internationally acclaimed evolutionary biologist Michael Lynch to note,

Remarkably, we have found many cases of parallel intron gains at essentially the same sites in independent genotypes. This strongly argues against the common assumption that when two species share introns at the same site, it is always due to inheritance from a common ancestor.
Finally, Daniels and Deininger (1985) suggest that,

...a common mechanism exists for the insertion of many repetitive DNA families into new genomic sites. A modified mechanism for site-specific integration of primate repetitive DNA sequences is provided which requires insertion into dA-rich sequences in the genome. This model is consistent with the observed relationship between galago Type II subfamilies suggesting that they have arisen not by mere mutation but by independent integration events.
Such target-site preferences are also documented here, here, and here.

Why might these ERV site-preferences exist? Presumably because these sites are most conducive to the virus's successful reproduction (e.g., the need for expression of the ERV's regulatory elements; the activity of the host's DNA repair systems; etc.). Mitchell et al. (2004) suggest "that virus-specific binding of integration complexes to chromatin features likely guides site selection."

Out of the tens of thousands of ERV elements in the human genome, roughly how many are known to occupy the same sites in humans and chimpanzees? According to this Talk-Origins article, at least seven; call it fewer than a dozen. Given the sheer number of these retroviruses in our genome, and accounting for the evidence of integration preferences and site biases which I have documented above, what are the odds of finding a handful of ERV elements that have independently inserted themselves into the same locus?
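As a back-of-the-envelope illustration of how insertion-site bias changes those odds, the sketch below compares the expected number of loci hit independently in two lineages under a uniform-insertion model and under a hot-spot model (using the 280-fold preference figure quoted above from Sverdlov). Every other parameter here (number of usable loci, number of insertions per lineage, fraction of hot spots) is a hypothetical round number chosen for illustration, not a measured value.

```python
def expected_shared(insertions, sites, hot_frac=0.0, hot_weight=1.0):
    """Expected number of loci hit in BOTH of two independent lineages,
    each making `insertions` random draws over `sites` possible loci.
    A fraction `hot_frac` of loci receive insertions `hot_weight` times
    more often than the rest (hot_weight=1 recovers the uniform model)."""
    total_w = sites * (hot_frac * hot_weight + (1 - hot_frac))
    p_hot = 1 - (1 - hot_weight / total_w) ** insertions   # P(a hot locus is hit)
    p_cold = 1 - (1 - 1 / total_w) ** insertions           # P(an ordinary locus is hit)
    # A locus is "shared" if both lineages independently hit it.
    return sites * (hot_frac * p_hot ** 2 + (1 - hot_frac) * p_cold ** 2)

SITES = 100_000_000    # hypothetical count of usable insertion loci
INSERTIONS = 30_000    # hypothetical fixed insertions per lineage

uniform = expected_shared(INSERTIONS, SITES)
biased = expected_shared(INSERTIONS, SITES, hot_frac=0.001, hot_weight=280)
```

Under these made-up numbers the hot-spot model predicts many times more coincident loci than the uniform model; the point is only the direction of the effect, not any particular magnitude.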

A Nested Hierarchy?
What about this "nested hierarchy" of which we are told?

We are (incorrectly) told that "There is only one, solitary known deviation of the distributional nested hierarchy; a relatively recently endogenized/fixed ERV called HERV-K-GC1."

This claim, however, is false.

In addition to the case mentioned, Yohn et al. (2005) report:
We performed two analyses to determine whether these 12 shared map intervals might indeed be orthologous. First, we examined the distribution of shared sites between species (Table S3). We found that the distribution is inconsistent with the generally accepted phylogeny of catarrhine primates. This is particularly relevant for the human/great ape lineage. For example, only one interval is shared by gorilla and chimpanzee; however, two intervals are shared by gorilla and baboon; while three intervals are apparently shared by macaque and chimpanzee. Our Southern analysis shows that human and orangutan completely lack PTERV1 sequence (see Figure 2A). If these sites were truly orthologous and, thus, ancestral in the human/ape ancestor, it would require that at least six of these sites were deleted in the human lineage. Moreover, the same exact six sites would also have had to have been deleted in the orangutan lineage if the generally accepted phylogeny is correct. Such a series of independent deletion events at the same precise locations in the genome is unlikely (Figure S3).

[...]

Several lines of evidence indicate that chimpanzee and gorilla PTERV1 copies arose from an exogenous source. First, there is virtually no overlap (less than 4%) between the location of insertions among chimpanzee, gorilla, macaque, and baboon, making it unlikely that endogenous copies existed in a common ancestor and then became subsequently deleted in the human lineage and orangutan lineage. Second, the PTERV1 phylogenetic tree is inconsistent with the generally accepted species tree for primates, suggesting a horizontal transmission as opposed to a vertical transmission from a common ape ancestor. An alternative explanation may be that the primate phylogeny is grossly incorrect, as has been proposed by a minority of anthropologists.

As irritating to the evolutionary model as it might be, there are, in fact, a significant number of deviations from the orthodox phylogeny.


In the final part of this blog series, I will discuss the argument based on "shared mistakes" in these ERV elements, as well as the argument based on degrees of mutational divergence between the retroviral 5' and 3' long terminal repeats (LTRs).

Mindless intelligence? Or Darwinists' claim to have finally got that free lunch they've been seeking.

You Can't Ascribe Intelligence to an Unguided Process
Evolution News & Views February 4, 2016 3:55 AM

There's a new movement afoot among some evolutionary biologists to co-opt the term "intelligent design" for Darwin. It's not the same as Dawkins's famous line that "Biology is the study of complicated things that give the appearance of having been designed for a purpose." In this newer view, it's not just an appearance; it's a reality. The process of evolution itself is learning how to get smarter as it goes.

Richard A. Watson at the University of Southampton proposes this idea at The Conversation in a piece titled, "Intelligent design without a creator? Why evolution may be smarter than we thought." This is not your grandfather's Darwinism:

Charles Darwin's theory of evolution offers an explanation for why biological organisms seem so well designed to live on our planet. This process is typically described as "unintelligent" -- based on random variations with no direction. But despite its success, some oppose this theory because they don't believe living things can evolve in increments. Something as complex as the eye of an animal, they argue, must be the product of an intelligent creator.

I don't think invoking a supernatural creator can ever be a scientifically useful explanation. But what about intelligence that isn't supernatural? Our new results, based on computer modelling, link evolutionary processes to the principles of learning and intelligent problem solving -- without involving any higher powers. This suggests that, although evolution may have started off blind, with a couple of billion years of experience it has got smarter. [Emphasis added.]

With its provocative title, the article mischaracterizes intelligent design as a supernatural explanation involving "higher powers." In reality, ID is a scientific theory that only appeals to types of causes now known to be in operation, based on our uniform experience with causes that can produce complex specified information. Inferences to design can be made without any appeal to the "supernatural," as in the case of inferring intelligence as the cause of Mt. Rushmore.

That said, Watson does seem to have looked into ID, or at least the scholarship of our colleague, the science historian Michael Flannery. Watson writes:

Alfred Russel Wallace (who suggested a theory of natural selection at the same time as Darwin) later used the term "intelligent evolution" to argue for divine intervention in the trajectory of evolutionary processes. If the formal link between learning and evolution continues to expand, the same term could become used to imply the opposite.

That's an interesting connection, but as far as we're aware, it wasn't Wallace who introduced the term "intelligent evolution" in this context. Professor Flannery did so in the title of a book, seeking in a brief phrase to summarize Wallace's thinking and what it implies. Still, it's commendable that Watson appears to have done some extra reading to broaden his horizon.

What about the contention that evolution gets smarter over time? Watson's argument begins with the analogy of neural networks that "learn" to make connections that lead to greater rewards. Those, needless to say, are designed (see "Designless Logic: Is a Neural Net a Budding Brain?"). Can he make the transition to mindless processes, or is this another case of Darwin comparing artificial selection to natural selection?

But what about evolution, can it get better at evolving over time? The idea is known as the evolution of evolvability. Evolvability, simply the ability to evolve, depends on appropriate variation, selection and heredity -- Darwin's cornerstones. Interestingly, all of these components can be altered by past evolution, meaning past evolution can change the way that future evolution operates.

The notion of evolvability has been around for some time, Watson notes. In fact, Michael Behe has given a more rigorous definition of it in an article here at Evolution News. What's new in the notion of evolvability is the application of learning theory. Watson hopes this will give it a "much needed theoretical foundation." In his research, he has worked to compare genes in regulatory networks with synapses in neural networks.

Our work shows that the evolution of regulatory connections between genes, which govern how genes are expressed in our cells, has the same learning capabilities as neural networks. In other words, gene networks evolve like neural networks learn. While connections in neural networks change in the direction that maximises rewards, natural selection changes genetic connections in the direction that increases fitness. The ability to learn is not itself something that needs to be designed -- it is an inevitable product of random variation and selection when acting on connections.

The exciting implication of this is that evolution can evolve to get better at evolving in exactly the same way that a neural network can learn to be a better problem solver with experience. The intelligent bit is not explicit "thinking ahead" (or anything else un-Darwinian); it is the evolution of connections that allow it to solve new problems without looking ahead.

As an example of what he means, he discusses limbs. Random variation might change each limb separately, but if a regulatory network changed them all together, the next solution would be easier. Say, for instance, that greater height would increase fitness. Having an upstream regulator change all four limbs together poses an easier search problem than changing them separately. This is how evolution could evolve to "learn" better ways to solve problems over time.
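Watson's limb example can be caricatured in a few lines of code. The toy hill-climber below (my own sketch, not Watson's actual model) pits two mutation schemes against the same selection rule: one mutates each of four "limb lengths" independently, the other mutates a single upstream "regulator" that shifts all four limbs together. When the optimum happens to require coordinated change, the regulated scheme gets much closer in the same number of trials. All the numbers are arbitrary.

```python
import random

TARGET = 8.0    # hypothetical optimum: every limb at length 8

def fitness(limbs):
    # Negative squared distance from the optimum; 0 is perfect.
    return -sum((l - TARGET) ** 2 for l in limbs)

def hill_climb(modular, steps=60, seed=0):
    """Accept a mutation only if it improves fitness (naive selection)."""
    rng = random.Random(seed)
    limbs = [1.0] * 4
    best = fitness(limbs)
    for _ in range(steps):
        if modular:
            shift = rng.gauss(0, 0.5)          # one regulator moves all limbs at once
            trial = [l + shift for l in limbs]
        else:
            trial = list(limbs)                # each limb varies independently
            trial[rng.randrange(4)] += rng.gauss(0, 0.5)
        if fitness(trial) > best:
            limbs, best = trial, fitness(trial)
    return best

# Average final fitness over several independent runs.
modular_avg = sum(hill_climb(True, seed=s) for s in range(20)) / 20
independent_avg = sum(hill_climb(False, seed=s) for s in range(20)) / 20
```

Note the asymmetry is built in by hand: the target was deliberately chosen so that coordinated change pays off, which is precisely the kind of post-hoc framing criticized below.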

Watson is now ready to show how this kind of "intelligent design" is purely natural and requires no "divine intervention."

So, when an evolutionary task we guessed would be difficult (such as producing the eye) turns out to be possible with incremental improvement, instead of concluding that dumb evolution was sufficient after all, we might recognise that evolution was very smart to have found building blocks that make the problem look so easy.

To recap, Watson says that evolution is "smarter than we thought." It's not a clunky, blind opportunist tinkering at random. It can learn. It can find easier ways to increase fitness, and therefore get better at evolving over time. No intervening intelligence is required; as evolution learns, the organism becomes more evolvable. The more difficult problems (like arriving at an eye) are bound to be solved.

Notice, however, that Watson's "neural networks" are already designed entities. Once again, a Darwinian evolutionist has snuck information in the side door while talking about the magical power of material forces to produce rabbits from hats (see "Arrival of the Fittest: Natural Selection as an Incantation"). Gene regulatory networks, Stephen Meyer has shown, require more information to rewire. As for Watson's example of synchronized limb evolution, that's a post-hoc rationalization. If animals had four very different limbs, it's likely he would be ready with a good story about how evolution "learned" to do that.

If "learning" was a law of nature for material phenomena, we should expect to find all kinds of nonliving systems acting similarly. Picture a flow of water on a mildly sloping plain. The water will find the easiest way down, and will "learn" to flow that way, carving a channel deeper over time. But does that increase its "fitness"? Are the limestone terraces we discussed here more fit than random limestone blocks? Is wind that "learns" to drop sand grains on a dune more fit than wind that scatters sand across a beach? Intuitively, something seems amiss. This kind of thinking would make Mars more fit because of its canyons and dunes.

Information is a concept unfortunately lacking in Watson's proposal. Clearly, to build an animal from matter would require vast increases in information. The genome of a Cambrian body plan is extraordinarily more information-rich than that of amino acids in a primordial soup. A human brain, we recently pointed out, has the memory capacity of the World Wide Web. Can material substances "learn" to build libraries of complex specified information? Only in a Darwinist's dreams. Our uniform experience locates that ability in free-acting minds with real intelligence, not in material forces.

The only way Watson can tell his story is by personifying evolution, endowing it with learning ability. This is equivalent to calling a canyon smarter as it gets deeper, or the wind intelligent as it learns to pile sand higher. In all our experience, though, whenever we find a material entity employing complex specified information to act in an intelligent way (as in a computer or robot), we know that intelligence was its ultimate cause. It is a logical inference to ascribe an intelligent cause to intelligence in animals as well.


Unless they are willing to relegate their own intelligence to mindless material forces, advocates of the "evolvability" theory are well advised to avoid shooting themselves in the foot. How can Watson trust his own mind, if it is the product of mindless matter? Particles and forces are dumb. They do not act with goals, thinking through concepts to arrive at logical conclusions. They do not learn things. The concept of learning implies pre-existing intelligence.

Well past the slippery slope.

Brave New World, Here We Come
Wesley J. Smith February 2, 2016 10:17 AM

The West seems congenitally incapable of restraining Brave New World. In the cause of eliminating suffering by any means necessary, no technology -- including lethal -- is apparently off limits.

Now, the we-never-say-no UK Embryo Authority -- which once approved attempts at human cloning with cow eggs -- has told CRISPR gene editors to feel free to edit away. From the BBC story:

It is the first time a country has considered the DNA-altering technique in embryos and approved it.

The research will take place at the Francis Crick Institute in London and aims to provide a deeper understanding of the earliest moments of human life. It will be illegal for the scientists to implant the modified embryos into a woman.

But the field is attracting controversy over concerns it is opening the door to designer -- or GM -- babies.

"Concerns"? That is the ultimate point!

Nascent human life is considered so much clay to be researched upon and destroyed. That will lead to fetal farming one day, in which fetuses will be similarly treated and destroyed.


This is 21st-century eugenics -- and it will, in the end, become just as oppressive and anti-human equality as the original version.

P.S. Dismaying but not surprising: the logic is that it is better to go light on the morals lest the money and brainpower gravitate to the competition.


Wednesday 3 February 2016

File under "Well said" XX

I hate war as only a soldier who has lived it can, only as one who has seen its brutality, its futility, its stupidity.
Dwight D. Eisenhower

The debate is done?: Pros and cons.

Is I.D true science?: Pros and Cons

The thumb print of Jehovah V

Poll shows teaching pros and cons of evolution uncontroversial with public.

For Darwin's Birthday, Poll Shows Broad Support for Teaching Evidence For and Against Darwin's Theory

 Evolution News & Views February 1, 2016 3:31 AM

 

 

Just in time for Charles Darwin's birthday on February 12, a new nationwide survey reveals that 81 percent of American adults believe that "when teaching Darwin's theory of evolution, biology teachers should cover both scientific evidence that supports the theory and scientific evidence critical of the theory."
Only 19 percent of Americans believe that "biology teachers should cover only scientific evidence that supports the theory."
"Americans agree by an overwhelming margin that students should learn about all of the scientific evidence relating to Darwinian evolution, pro and con," said Dr. John West, Vice President of Discovery Institute. "This is a common-sense approach. Most people understand that it's not good education to present a one-sided review of the data, especially in science."
"There is growing peer-reviewed research that questions the adequacy of the Darwinian mechanism of random mutation and natural selection," added Discovery Institute biologist Ann Gauger. Gauger holds a PhD in developmental biology from the University of Washington, and she has served in the past as a post-doctoral Fellow at Harvard University.
Support for teaching the scientific evidence for and against Darwin's theory is overwhelming regardless of age, gender, religious affiliation, geography, party affiliation, and household income.
  • 79 percent of men and 83 percent of women support teaching the evidence for and against Darwin's theory.

  • 85 percent of theists, 65 percent of atheists, and 79 percent of agnostics support this approach.

  • 79 percent of Democrats support teaching the evidence for and against Darwin's theory, and so do 82 percent of independents and 85 percent of Republicans.

  • 85 percent of middle-aged Americans (ages 45-59) support teaching the evidence for and against Darwin's theory, and so do 81 percent of young adults (ages 18-29) and senior citizens (ages 60 and older).
The poll was conducted by Discovery Institute using SurveyMonkey Audience, which randomly sampled the adult members of its nationally representative panel of more than 6 million U.S. residents. Survey responses were collected from January 5-9, 2016, and the survey included 2,117 completed responses for this question.
The SurveyMonkey platform has been utilized for public opinion surveys by NBC News, the Los Angeles Times, and other media organizations. More information on how SurveyMonkey Audience recruits respondents is available here.
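For context on how precise an 81 percent figure from 2,117 responses is, a standard 95 percent margin-of-error calculation (my addition, not part of the survey report) works out to under two percentage points:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Half-width of the normal-approximation 95% confidence interval
    for a sample proportion `p` observed in `n` responses."""
    return z * math.sqrt(p * (1 - p) / n)

moe = margin_of_error(0.81, 2117)   # roughly +/- 1.7 percentage points
```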

Darwinism Vs. the real world XXVII

The Body as a Battlefield: Proteins of the Innate Immune System


Editor's note: Physicians have a special place among the thinkers who have elaborated the argument for intelligent design. Perhaps that's because, more than evolutionary biologists, they are familiar with the challenges of maintaining a functioning complex system, the human body. With that in mind, Evolution News is delighted to offer this series, "The Designed Body." For the complete series, see here. Dr. Glicksman practices palliative medicine for a hospice organization.

 Since life takes place in the context of nature, it must not only exist in accordance with physical and chemical laws, but must also protect itself from many of the organisms in its environment. There are a wide variety of microbes that our senses cannot detect and that are always trying to enter our body so they can multiply.
The first line of defense against infection by these microorganisms is the skin and the epithelial tissues that line the respiratory, gastrointestinal, and genitourinary tracts. Without any one of them, our earliest ancestors could not have survived long enough to reproduce. However, if microbes penetrate into the tissues below, whether through injury to the body or through the capabilities of the microorganism itself, they come up against the second line of defense: the immune system.
As we've seen already in this series, the immune system can be divided into two parts: the innate immune system that each of us is born with and the adaptive immune system that develops over time as we are exposed to the environment. Each of these systems has its unique cells and proteins, needed for the body to defend against microbial invasion. In my last two articles we looked at some of the more important immune cells of the innate system: the mast cells, macrophages, and dendritic cells, which are the first responders in the tissues, and the neutrophils that travel in the blood and respond to the signal to come to the battlefield. Now we will look at the proteins of the innate immune system and how they work to bring other immune cells to the field of battle, make neutrophils and macrophages more effective, and fight invading microbes.
The plasma proteins of innate immunity, which leak into the tissues when inflammation takes place, are collectively known as the complement system, or sometimes simply the complement, because they complement (complete) the function of the immune system's cellular components. The complement system consists of thirty or more proteins that, like the clotting factors, are mostly produced in the liver and enter the blood in an inactive form.
Also, just as with clotting, there is more than one pathway for activation and once it begins, it progresses quickly in a cascading fashion, like falling dominoes. Finally, just as with the coagulation cascade, activation of the complement system requires that two key enzymatic steps take place to unleash its power. Since inappropriate activation of the complement can result in significant injury, the body must make sure that it only turns on when it's needed and stays or turns off when it's not.
Just as the final common pathway for coagulation involves mainly two clotting factors (prothrombin and fibrinogen), so too, activation of the complement system involves mainly two complement proteins called C3 and C5. There are thought to be three chemical pathways by which foreign molecules on the surface of invading microbes trigger complement activation.
All three of these pathways converge to form an enzyme called C3 convertase. C3 convertase, as its name implies, is an enzyme that breaks specific bonds within hundreds of molecules of C3 and converts them into two fragments called C3a and C3b. (It sounds like the scientists who came up with these names must have been brought up on Dr. Seuss's book The Cat in the Hat Comes Back. Remember Little Cats C, D, E, et al.?)
The smaller fragment, C3a, binds to specific receptors on mast cells, triggering them to release histamine to bring about inflammation and call more immune cells and proteins to the battlefield. The larger fragment, C3b, usually does one of two things. It can attach to foreign proteins on microbes, allowing neutrophils and macrophages to better identify and attach to them using specific complement receptors, so that they can engulf and digest them. Or it can join with C3 convertase to form another enzyme called C5 convertase, which breaks C5 into two fragments called C5a and C5b.
Like C3a, the smaller fragment C5a triggers inflammation by attaching to complement receptors on mast cells, causing them to release chemicals like histamine. C5a also helps neutrophils and monocytes (macrophages) pass through the capillaries and attracts them to the field of battle by chemotaxis. The larger fragment, C5b, acts as an anchor to which several specific complement proteins attach to form what is called the Membrane Attack Complex (MAC). The MAC is a weapon made up of these complement proteins that literally drills a hole through the cell membrane of the microbe to kill it.
However, just as in clotting, where inappropriate activation of the system is very problematic, so too the body must be able to control the explosive power of the complement system. To control hemostasis, the body has to have enough anti-clotting factors that can resist coagulation unless significant injury and bleeding takes place. Here as well, to control the activation of the complement, the body has to have enough inhibiting proteins to resist the formation of both C3 and C5 convertase unless a significant infection is present.
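The two enzymatic checkpoints plus their inhibitors can be caricatured as a two-stage threshold amplifier. This is a toy numerical model of my own devising, not biochemistry: each stage multiplies its input by a gain factor, while an inhibitor pool soaks up a fixed amount of activated product, so stray low-level activation dies out while a genuine trigger is amplified enormously. The gain and inhibitor values are arbitrary.

```python
def cascade(trigger, gain=100.0, inhibitor=50.0):
    """Two-stage threshold cascade, loosely analogous to the C3/C5 steps.
    Each stage amplifies its input by `gain`; an inhibitor pool absorbs
    the first `inhibitor` units of activated product at each stage."""
    c3_stage = max(0.0, trigger * gain - inhibitor)   # "C3 convertase" step
    c5_stage = max(0.0, c3_stage * gain - inhibitor)  # "C5 convertase" step
    return c5_stage

quiet = cascade(0.2)   # sub-threshold activation is silenced by the inhibitors
loud = cascade(5.0)    # a real trigger is amplified thousands-fold
```

The qualitative behavior, a system that stays quiet until the stimulus clears the inhibitor threshold and then responds explosively, is the "falling dominoes" dynamic described above.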
When activated, the proteins of the complement system provide the body's immune defense with significant assistance and firepower to fight against resistant pathogenic microbes. Activated complement proteins increase inflammation (C3a, C5a), attract phagocytes to the battlefield (C5a), help them attach to microbes for phagocytosis (C3b), and directly kill microbes (C5b, MAC).
In addition, to prevent tissue damage, the body must have enough inhibiting proteins so that the complement only turns on when it's needed and stays or turns off when it's not. Deficiency of a specific complement protein or one of their inhibitors is rare and usually manifests as either recurrent infection or serious allergic or autoimmune disease. This means that if our earliest ancestors hadn't had enough of most of the proteins that make up the complement system, they never could have survived long enough to reproduce.
Evolutionary biologists observe that certain of the components of the complement system are present in some earlier forms of life and they conclude that its development can be explained by gene duplication. However, not only is the system irreducibly complex, requiring all of the parts to work properly, but there has to be enough of each of the components and their inhibitors as well.
In other words, the body requires a natural survival capacity to produce enough of each component, the control of which evolutionary biologists can't explain and neither can medical science. Now that you know the components of the innate immune system and how they work together to help defend the body from infection, we'll look at the adaptive immune system.

Monday 25 January 2016

Using design to debunk design?

Intelligent Design Lab is Going Where no Evolution Simulation has Gone Before:
Robert Crowther June 10, 2008 8:46 AM

Over the past decade or so there has been much hype about computer simulations of Darwinian evolution. The most hyped is Avida at the MSU Digital Evolution Laboratory. Avida researchers claim their work is not a simulation, but actually is Darwinian evolution in action. They describe it like this:

In Avida, a population of self-replicating computer programs is subjected to external pressures (such as mutations and limited resources) and allowed to evolve subject to natural selection. This is not a mere simulation of evolution -- digital organisms in Avida evolve to survive in a complex computational environment and will adapt to perform entirely new traits in ways never expected by the researchers, some of which seem highly creative.
According to MSU's Robert Pennock: "Avida is not a simulation of evolution; it is an instance of it."
You can't ignore the fact that ...

... previous computer simulations of evolution, such as Avida, were carefully prescribed and tightly constrained by the environment created by their programmers. For example, Avida shows how organisms can advance in an environment where they are solving problems, but problems that were set up for them to solve. The digital organisms produced there can only do so much, or go so far, as they are constrained by the environment the programmer has designed.
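The criticism that such digital organisms solve "problems that were set up for them to solve" is easy to see in miniature. The toy (1+1) mutation-selection loop below (my own sketch, unrelated to Avida's actual code) evolves a bit string toward a target the programmer chose in advance; the fitness function itself is the experimental design.

```python
import random

def evolve_to_target(target, mut_rate=0.05, max_gens=5000, seed=0):
    """Minimal (1+1) mutation/selection loop: keep a child if it matches
    the programmer-chosen `target` at least as well as its parent.
    Returns the generation at which the target was reached (or max_gens)."""
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in target]
    score = sum(p == t for p, t in zip(parent, target))
    for gen in range(max_gens):
        if score == len(target):
            return gen
        # Each bit flips independently with probability mut_rate.
        child = [1 - b if rng.random() < mut_rate else b for b in parent]
        child_score = sum(c == t for c, t in zip(child, target))
        if child_score >= score:    # selection, allowing neutral drift
            parent, score = child, child_score
    return max_gens

gens = evolve_to_target([1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1, 0, 0, 1, 0])
```

Change the target and the same loop "evolves" something else; the information about what counts as success is supplied entirely by the experimenter, which is the point at issue.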

Should the Avida team be working in quarantine? Lenski argues that Avida itself acts as a quarantine, because its organisms can exist only in its computer language. "They're living in an alien world," Lenski says. "They may be nasty predators from Mars, but they'd drop dead here." Life is a different environment than that programmed for Avida's digital organisms.
Enter the new evolutionary computation software just released by Biologic Institute. The program, Stylus, was developed by molecular biologist Douglas Axe and software engineer Brendan Dixon, and announced last week in a peer-reviewed publication in PLoS ONE.

Stylus, however, goes well beyond previous computer simulations. Axe describes it this way:

Like the structures of life, the structures of language are used to solve real problems at a high level. And the high level solutions in both worlds depend on a succession of solutions at lower levels.
In life, body plans serve the needs of particular modes of life, organs serve the needs of particular body plans, tissues serve the needs of particular organs, cells serve the needs of particular tissues, protein functions serve the needs of particular cells, protein structures serve the needs of particular protein functions, protein sequences serve the needs of particular structures, and genes serve the needs of these particular protein sequence requirements.


In a similarly hierarchical way, texts of various kinds serve the needs of particular communication objectives, sections serve the needs of particular texts, paragraphs serve the needs of particular sections, sentences serve the needs of particular paragraphs, phrases serve the needs of particular sentences, and words serve the needs of particular phrases.

What about letters serving the needs of words? Well, the problem with letter-based texts is that they are only sequences, whereas structures figure prominently in the functions of proteins. Protein sequences must form functional three-dimensional structures in order to work, whereas alphabetic sequences function directly as sequences.

But not all written languages are alphabetic. Chinese writing, in particular, employs structural characters that are analogous in some interesting ways to protein structures. Like folded proteins, these written characters perform the low level functions from which higher functions can be achieved.

Why is this important? Well, for one thing, if realism is important, it shows how far Avida falls short as an "instance of evolution." And for another, it is going to open new avenues of research into how much or how little organisms can evolve, and whether it really is possible to go from the simplest building blocks of life to the more complex and necessary functions of life without any guiding intelligence at all.

Avida, Stylus. Stylus, Avida. Out with the old, in with the new.

A clash of titans VI