
Tuesday, 2 May 2017

Why slain myths become undead rather than stay buried.

Who Will Debunk The Debunkers?
By Daniel Engber


In 2012, network scientist and data theorist Samuel Arbesman published a disturbing thesis: What we think of as established knowledge decays over time. According to his book “The Half-Life of Facts,” certain kinds of propositions that may seem bulletproof today will be forgotten by next Tuesday; one’s reality can end up out of date. Take, for example, the story of Popeye and his spinach.

Popeye loved his leafy greens and used them to obtain his super strength, Arbesman’s book explained, because the cartoon’s creators knew that spinach has a lot of iron. Indeed, the character would be a major evangelist for spinach in the 1930s, and it’s said he helped increase the green’s consumption in the U.S. by one-third. But this “fact” about the iron content of spinach was already on the verge of being obsolete, Arbesman said: In 1937, scientists realized that the original measurement of the iron in 100 grams of spinach — 35 milligrams — was off by a factor of 10. That’s because a German chemist named Erich von Wolff had misplaced a decimal point in his notebook back in 1870, and the goof persisted in the literature for more than half a century.

By the time nutritionists caught up with this mistake, the damage had been done. The spinach-iron myth stuck around in spite of new and better knowledge, wrote Arbesman, because “it’s a lot easier to spread the first thing you find, or the fact that sounds correct, than to delve deeply into the literature in search of the correct fact.”

Arbesman was not the first to tell the cautionary tale of the missing decimal point. The same parable of sloppy science, and its dire implications, appeared in a book called “Follies and Fallacies in Medicine,” a classic work of evidence-based skepticism first published in 1989.[1] It also appeared in a volume of “Magnificent Mistakes in Mathematics,” a guide to “The Practice of Statistics in the Life Sciences” and an article in an academic journal called “The Consequence of Errors.” And that’s just to name a few.

All these tellings and retellings miss one important fact: The story of the spinach myth is itself apocryphal. It’s true that spinach isn’t really all that useful as a source of iron, and it’s true that people used to think it was. But all the rest is false: No one moved a decimal point in 1870; no mistake in data entry spurred Popeye to devote himself to spinach; no misguided rules of eating were implanted by the sailor strip. The story of the decimal point manages to recapitulate the very error that it means to highlight: a fake fact, but repeated so often (and with such sanctimony) that it takes on the sheen of truth.

In that sense, the story of the lost decimal point represents a special type of viral anecdote or urban legend, one that finds its willing hosts among the doubters, not the credulous. It’s a rumor passed around by skeptics — a myth about myth-busting. Like other Russian dolls of distorted facts, it shows us that, sometimes, the harder that we try to be clear-headed, the deeper we are drawn into the fog.


No one knows this lesson better than Mike Sutton. He must be the world’s leading meta-skeptic: a 56-year-old master sleuth who first identified the myth about the spinach myth in 2010 and has since been working to debunk what he sees as other false debunkings. Sutton, a criminology professor at Nottingham Trent University, started his career of doubting very young: He remembers being told when he was still a boy that all his favorite rock stars on BBC’s “Top of the Pops” were lip-synching and that some weren’t even playing their guitars. Soon he began to wonder at the depths of this deception. Could the members of Led Zeppelin be in on this conspiracy? Was Jimmy Page a lie? Since then, Sutton told me via email, “I have always been concerned with establishing the veracity of what is presented as true, and what is something else.”

As a law student, Sutton was drawn to stories like that of Popeye and the inflated iron count in spinach, which to him demonstrated both the perils of “accepted knowledge” and the importance of maintaining data quality. He was so enamored of the story, in fact, that he meant to put it in an academic paper. But in digging for the story’s source, he began to wonder if it was true. “It drew me in like a problem-solving ferret to a rabbit hole,” he said.

Soon he’d gone through every single Popeye strip ever drawn by its creator, E.C. Segar, and found that certain aspects of the classic story were clearly false. Popeye first ate spinach for his super power in 1931, Sutton found, and in the summer of 1932 the strip offered this iron-free explanation: “Spinach is full of vitamin ‘A,’” Popeye said, “an’ tha’s what makes hoomans strong an’ helty.” Sutton also gathered data on spinach production from the U.S. Department of Agriculture and learned that it was on the rise before Segar’s sailor-man ever started eating it.

What about the fabled decimal point? According to Sutton’s research, a German chemist did overestimate the quantity of iron in spinach, but the mistake arose from faulty methods, not from poor transcription of the data.[2] By the 1890s, a different German researcher had concluded that the earlier estimate was many times too high. Subsequent analyses arrived at something closer to the correct, still substantial value — now estimated to be 2.71 milligrams of iron per 100 grams of raw spinach, according to the USDA. By chance, the new figure was indeed about one-tenth of the original, but the difference stemmed not from misplaced punctuation but from the switch to better methodology. In any case, it wasn’t long before Columbia University analytical chemist Henry Clapp Sherman laid out the problems with the original result. By the 1930s, Sutton argues, researchers knew the true amount of iron in spinach, but they also understood that not all of it could be absorbed by the human body.[3]

The decimal-point story only came about much later. According to Sutton’s research, it seems to have been invented by the nutritionist and self-styled myth-buster Arnold Bender, who floated the idea with some uncertainty in a 1972 lecture. Then in 1981, a doctor named Terence Hamblin wrote up a version of the story without citation for a whimsical, holiday-time column in the British Medical Journal. The Hamblin article, unscholarly and unsourced, would become the ultimate authority for all the citations that followed. (Hamblin graciously acknowledged his mistake after Sutton published his research, as did Arbesman.)

In 2014, a Norwegian anthropologist named Ole Bjorn Rekdal published an examination of how the decimal-point myth had propagated through the academic literature. He found that bad citations were the vector. Instead of looking for its source, those who told the story merely plagiarized a solid-sounding reference: “(Hamblin, BMJ, 1981).” Or they cited someone in between — someone who, in turn, had cited Hamblin. This loose behavior, Rekdal wrote, made the transposed decimal point into something like an “academic urban legend,” its nested sourcing more or less equivalent to the familiar “friend of a friend” of schoolyard mythology.

Emerging from the rabbit hole, Sutton began to puzzle over what he’d found. This wasn’t just any sort of myth, he decided, but something he would term a “supermyth”: A story concocted by respected scholars and then credulously disseminated in order to promote skeptical thinking and “to help us overcome our tendency towards credulous bias.” The convolution of this scenario inspired him to look for more examples. “I’m rather a sucker for such complexity,” he told me.


Complicated and ironic tales of poor citation “help draw attention to a deadly serious, but somewhat boring topic,” Rekdal told me. They’re grabby, and they’re entertaining. But I suspect they’re more than merely that: Perhaps the ironies themselves can help explain the propagation of the errors.

It seems plausible to me, at least, that the tellers of these tales are getting blinkered by their own feelings of superiority — that the mere act of busting myths makes them more susceptible to spreading them. It lowers their defenses, in the same way that the act of remembering sometimes seems to make us more likely to forget. Could it be that the more credulous we become, the more convinced we are of our own debunker bona fides? Does skepticism self-destruct?


Sutton told me over email that he, too, worries that contrarianism can run amok, citing conspiracy theorists and anti-vaxxers as examples of those who “refuse to accept the weight of argument” and suffer the result. He also noted the “paradox” by which a skeptic’s obsessive devotion to his research — and to proving others wrong — can “take a great personal toll.” A person can get lost, he suggested, in the subterranean “Wonderland of myths and fallacies.”

In the last few years, Sutton has himself embarked on another journey to the depths, this one far more treacherous than the ones he’s made before. The stakes were low when he was hunting something trivial, the supermyth of Popeye’s spinach; now Sutton has been digging in more sacred ground: the legacy of the great scientific hero and champion of the skeptics, Charles Darwin. In 2014, after spending a year working 18-hour days, seven days a week, Sutton published his most extensive work to date, a 600-page broadside on a cherished story of discovery. He called it “Nullius in Verba: Darwin’s Greatest Secret.”

Sutton’s allegations are explosive. He claims to have found irrefutable proof that neither Darwin nor Alfred Russel Wallace deserves the credit for the theory of natural selection, but rather that they stole the idea — consciously or not — from a wealthy Scotsman and forest-management expert named Patrick Matthew. “I think both Darwin and Wallace were at the very least sloppy,” he told me. Elsewhere he’s been somewhat less diplomatic: “In my opinion Charles Darwin committed the greatest known science fraud in history by plagiarizing Matthew’s” hypothesis, he told the Telegraph. “Let’s face the painful facts,” Sutton also wrote. “Darwin was a liar. Plain and simple.”

Some context: The Patrick Matthew story isn’t new. Matthew produced a volume in the early 1830s, “On Naval Timber and Arboriculture,” that indeed contained an outline of the famous theory in a slim appendix. In a contemporary review, the noted naturalist John Loudon seemed ill-prepared to accept the forward-thinking theory. He called it a “puzzling” account of the “origin of species and varieties” that may or may not be original. In 1860, several months after publication of “On the Origin of Species,” Matthew would surface to complain that Darwin — now quite famous for what was described as a discovery born of “20 years’ investigation and reflection” — had stolen his ideas.

Darwin, in reply, conceded that “Mr. Matthew has anticipated by many years the explanation which I have offered of the origin of species, under the name of natural selection.” But then he added, “I think that no one will feel surprised that neither I, nor apparently any other naturalist, had heard of Mr. Matthew’s views.”

That statement, suggesting that Matthew’s theory was ignored — and hinting that its importance may not even have been quite understood by Matthew himself — has gone unchallenged, Sutton says. It has, in fact, become a supermyth, cited to explain that even big ideas amount to nothing when they aren’t framed by proper genius.

Sutton thinks that story has it wrong, that natural selection wasn’t an idea in need of a “great man” to propagate it. After all his months of research, Sutton says he found clear evidence that Matthew’s work did not go unread. No fewer than seven naturalists cited the book, including three in what Sutton calls Darwin’s “inner circle.” He also claims to have discovered particular turns of phrase — “Matthewisms” — that recur suspiciously in Darwin’s writing.

In light of these discoveries, Sutton considers the case all but closed. He’s challenged Darwin scholars to debates, picked fights with famous skeptics such as Michael Shermer and Richard Dawkins, and even written letters to the Royal Society, demanding that Matthew be given priority over Darwin.

But if his paper on the spinach myth convinced everyone who read it — even winning an apology from Terence Hamblin, one of the myth’s major sources — the work on Darwin barely registered. Many scholars ignored it altogether. A few, such as Michael Weale of King’s College, simply found it unconvincing. Weale, who has written his own book on Patrick Matthew, argued that Sutton’s evidence was somewhat weak and circumstantial. “There is no ‘smoking gun’ here,” he wrote, pointing out that at one point even Matthew admitted that he’d done little to spread his theory of natural selection. “For more than thirty years,” Matthew wrote in 1862, he “never, either by the press or in private conversation, alluded to the original ideas … knowing that the age was not suited for such.”


When Sutton is faced with the implication that he’s taken his debunking too far — that he’s tipped from skepticism to crankery — he lashes out. “The findings are so enormous that people refuse to take them in,” he told me via email. “The enormity of what has, in actual fact, been newly discovered is too great for people to comprehend. Too big to face. Too great to care to come to terms with — so surely it can’t be true. Only, it’s not a dream. It is true.” In effect, he suggested, he’s been confronted with a classic version of the “Semmelweis reflex,” whereby dangerous, new ideas are rejected out of hand.

Could Sutton be a modern-day version of Ignaz Semmelweis, the Hungarian physician who noticed in the 1840s that doctors were themselves the source of childbed fever in his hospital’s obstetric ward? Semmelweis had reduced disease mortality by a factor of 10 — a fully displaced decimal point — simply by having doctors wash their hands in a solution of chlorinated lime. But according to the famous tale, his innovations were too radical for the time. Ignored and ridiculed for his outlandish thinking, Semmelweis eventually went insane and died in an asylum. Arbesman, author of “The Half-Life of Facts,” has written about the moral of this story too. “Even if we are confronted with facts that should cause us to update our understanding of the way the world works,” he wrote, “we often neglect to do so.”

Of course, there’s always one more twist: Sutton doesn’t believe this story about Semmelweis. That’s another myth, he says — another tall tale, favored by academics, that ironically demonstrates the very point that it pretends to make. Citing the work of Sherwin Nuland, Sutton argues that Semmelweis didn’t go mad from being ostracized, and further that other physicians had already recommended hand-washing in chlorinated lime. The myth of Semmelweis, says Sutton, may have originated in the late 19th century, when a “massive nationally funded Hungarian public relations machine” placed biased articles into the scientific literature. Semmelweis scholar Kay Codell Carter concurs, at least insofar as Semmelweis was not, in fact, ignored by the medical establishment: From 1863 through 1883, he was cited dozens of times, Carter writes, “more frequently than almost anyone else.”

Yet despite all this complicating evidence, scholars still tell the simple version of the Semmelweis story and use it as an example of how other people — never them, of course — tend to reject information that conflicts with their beliefs. That is to say, the scholars reject conflicting information about Semmelweis, evincing the Semmelweis reflex, even as they tell the story of that reflex. It’s a classic supermyth!

And so it goes, a whirligig of irony spinning around and around, down into the depths. Is there any way to escape this endless, maddening recursion? How might a skeptic keep his sanity? I had to know what Sutton thought. “I think the solution is to stay out of rabbit holes,” he told me. Then he added, “Which is not particularly helpful advice.”

Footnotes

1. Its authors cite the story of the misplaced decimal point as an example of the “Bellman’s Fallacy” — a reference to a character from Lewis Carroll who says, “What I tell you three times is true.” Such mistakes, they wrote, illustrate “the ways in which truth may be obscured, twisted, or mangled beyond recognition, without any overt intention to do it harm.”
2. Another scholar with an interest in the spinach tale has found that in Germany, at least, the link between spinach and iron was being cited as conventional wisdom as early as 1853. This confusion may have been compounded by research that elided differences between dried and fresh spinach, Sutton says.
3. It’s long been suggested that high levels of oxalic acid — which are present in spinach — might serve to block absorption of iron, as they do for calcium, magnesium and zinc. Other studies find that oxalic acid has no effect on iron in the diet, though, and hint that some other chemical in spinach might be getting in the way.

How universal is the language of life?

Reply To Kenneth Miller On The Genetic Code
Discovery Institute's Center for Science & Culture
Discovery Institute


On Tuesday, September 25, 2001, Professor Kenneth Miller of Brown University issued a press release entitled "A 'Dying Theory' Fails Again," available here: 

http://www.ncseweb.org/resources/articles/3071_km-3.pdf

In this document, Miller claims that the Discovery Institute (DI) tried to "smear" PBS's Evolution series when the DI charged that program with making a false statement about the universality of the genetic code. Miller also claims that the DI failed to tell the public that "the very discoveries they cite provide elegant and unexpected support for Darwin's theories."

These claims are false. Miller's press release, however, provides an excellent teaching opportunity for the DI, not only to show why Miller's claims are false, but also to amplify our original objection. We shall explain why statements such as "the genetic code is universal" not only harm science -- by creating what Charles Darwin called "false facts" -- but also cheat the public, by concealing the real puzzles facing evolutionary theory. We conclude by touching on some of the deeper issues raised by patterns of evidence such as the genetic code.

We begin with the errors and misrepresentations in Miller's press release.

Miller completely misrepresents the significance of a diagram reproduced in his press release from another source (Knight et al. 2001, Figure 2). This is a serious mistake, as Miller rests his case against the DI on his misunderstanding of this diagram.

Miller equates genetic code variants to minor differences in dialects of the same spoken language (e.g., English). This comparison is erroneous and misleading.

Miller claims that the successes of biotechnology prove the universality of the code. This is untrue, and ignores the literature on experiments employing organisms with variant codes.

Let's consider each problem in more detail:

1. Miller completely misrepresents Knight et al.'s composite phylogeny of genetic codes.

In his press release, Miller writes:

"Look closely at the figure from this paper, and you'll see something remarkable. The variations from the standard code occur in regular patterns that can be traced directly back to the standard code, which sits at the center of the diagram."

This is false. The variant codes do not "occur in regular patterns," but appear independently in unrelated lineages. Knight et al. explain this pattern of convergent (i.e., non-homologous) appearance in the article itself:

"The genetic code varies in a wide range of organisms (FIG. 2 [reproduced in Miller's press release]), some of which share no obvious similarities. Sometimes the same change recurs in different lineages: for instance, the UAA and UAG codons have been reassigned from Stop to Gln in some diplomonads, in several species of ciliates and in the green alga Acetabularia acetabulum (reviewed in Ref. 5). Similarly, animal and yeast mitochondria have independently reassigned AUA from Ile to Met." [1] 

In their caption to Figure 2, Knight et al. note explicitly that variant codes have arisen "repeatedly and independently in different taxa." This pattern of convergent variation has generated much discussion in the primary literature. [2] If these are indeed convergent changes, they do not provide evidence of common descent at all, but rather would be misleading similarities that, taken by themselves, generate a false history of the organisms in question.

In short, Miller completely misrepresents the Knight et al. composite phylogeny. There is no "regular pattern" to the variant codes that maps congruently onto phylogenetic trees from other data. Thus, far from providing what Miller calls "unexpected confirmation of the evolution of the code from a single common ancestor," the pattern of variant codes represents a puzzle for a single tree of life. 

2. Variant genetic codes are not analogous to the differences between dialects of the same language.

In his press release, Miller writes:

"As evolutionary biologists were quick to realize, slight differences in the genetic code are similar to differences between the dialects of a single spoken language. The differences in spelling and word meanings between the American, Canadian, and British dialects of English reflect a common origin. Exactly the same is true for the universal language of DNA."

This is--at best--a wildly inaccurate analogy. From context and other clues, English speakers can discern that the words "center" and "centre," or "color" and "colour," refer to the same object. Meaning is preserved by context, and the reader moves along without a hitch.

But a gene sequence from a ciliated protozoan such as Tetrahymena (for instance), with the codons UAA and UAG in its open reading frame (ORF), cannot be interpreted correctly by the translation machinery of other eukaryotes having the so-called "universal" code. In Tetrahymena, UAA and UAG code for glutamine. In the universal code, these are stop codons. Thus the translation machinery of most other eukaryotes, when reading the Tetrahymena gene, would stop at UAA or UAG. Instead of inserting glutamine into the growing polypeptide chain, and continuing to translate the mRNA, release factors would bind to the codons, and the ribosomes would halt protein synthesis. The resulting protein would be truncated in length and very possibly non-functional. Unlike variant spellings of "center," therefore, context cannot preserve meaning. With the codons UAA and UAG (comparing Tetrahymena thermophila and other eukaryotes) no shared context exists.
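The consequence described above can be made concrete with a minimal sketch. The codon tables below are tiny illustrative subsets (not full genetic codes), and the mRNA is a hypothetical reading frame, but the reassignment itself — UAA/UAG meaning "stop" in the standard code and glutamine (Q) in Tetrahymena — follows the text:

```python
# Minimal sketch: translating the same mRNA under the standard code
# versus the Tetrahymena variant. Codon tables are illustrative
# subsets; the mRNA below is hypothetical, not a real gene.

STANDARD = {"AUG": "M", "GGC": "G", "UUU": "F", "CAA": "Q",
            "UAA": None, "UAG": None, "UGA": None}  # None = stop

# Tetrahymena reassigns UAA and UAG from "stop" to glutamine (Q);
# UGA remains its only stop codon.
TETRAHYMENA = dict(STANDARD, UAA="Q", UAG="Q")

def translate(mrna, table):
    """Translate codon by codon; None marks a stop (release) signal."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        aa = table[mrna[i:i + 3]]
        if aa is None:  # release factors bind, synthesis halts
            break
        protein.append(aa)
    return "".join(protein)

mrna = "AUGGGCUAAUUU"  # contains an in-frame UAA

print(translate(mrna, STANDARD))     # "MG"   -- truncated at UAA
print(translate(mrna, TETRAHYMENA))  # "MGQF" -- UAA read as glutamine
```

The same message yields a truncated product under one table and a full-length one under the other, which is the sense in which no shared context exists between the two codes.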

Knight et al. present a much better analogy for code changes:

"Any change in the genetic code alters the meaning of a codon, which, analogous to reassigning a key on a keyboard, would introduce errors into every translated message." [3]

Indeed, for two decades (see below), it was exactly this deeply-embedded feature of the genetic code that led to strong predictions about its necessary universality across all organisms. It was widely thought that any change to the genetic code of an organism would affect all the proteins produced by that organism, leading to deleterious consequences (e.g., truncated or misfolded proteins) or lethality. Once the code evolved in the progenitor of all life, it "froze," and all subsequent organisms would carry that code.

In any case, the differences between genetic codes are not properly analogous to minor differences among dialects of a single language. 

3. Miller's references to biotechnology do not accurately represent the experimental literature on variant genetic codes.

In his press release, Miller writes:

"...the entire biotechnology industry is built upon the universality of the genetic code. Genetically-modified organisms are routinely created in the lab by swapping genes between bacteria, plants, animals, and viruses. If the coded instructions in those genes were truly as different as the critics of evolution would have you believe, none of these manipulations would work."

But some manipulations--namely, those involving organisms with variant codes--do not work, unless the researchers themselves intervene to ensure function. 

Consider, for instance, the release factor from the ciliate Tetrahymena thermophila. Release factors (in eukaryotes, these proteins are abbreviated as "eRF" to distinguish them from prokaryotic release factors) catalyze the separation of completed polypeptide chains (nascent proteins) from the ribosomal machinery. Unlike other eukaryotic release factors, however, which recognize all three stop codons (UAA, UGA, and UAG), the Tetrahymena thermophila release factor recognizes only the UGA codon as "stop."

In 1999, Andrew Karamyshev and colleagues at the University of Tokyo isolated the release factor (Tt-eRF1) from Tetrahymena thermophila. But in order to express and purify the protein, Karamyshev et al. had to manipulate it genetically first. Why? The Tetrahymena thermophila gene for Tt-eRF1 contains 10 codons in its open reading frame that would be interpreted as "stop" by other organisms--whereas Tetrahymena thermophila reads these codons as glutamine:

"To express and purify the recombinant Tt-eRF1 protein under heterologous expression conditions [i.e., in a cell other than Tetrahymena--Karamyshev et al. used yeast cells], 10 UAA/UAG triplets within the coding sequence were changed to the glutamine codon CAA or CAG by site-directed mutagenesis." [4]
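The recoding step the quotation describes can be pictured in a few lines. The mapping (UAA→CAA, UAG→CAG, both glutamine) comes from the quoted passage; the function name and the short sequence below are hypothetical stand-ins, since the actual edit was done by site-directed mutagenesis on the real Tt-eRF1 gene:

```python
# Illustrative sketch of the recoding Karamyshev et al. performed:
# rewrite every in-frame UAA/UAG (glutamine in Tetrahymena, "stop"
# elsewhere) as a conventional glutamine codon so a standard-code
# host such as yeast can express the gene. Sequence is hypothetical.

RECODE = {"UAA": "CAA", "UAG": "CAG"}  # Gln codons in the standard code

def recode_orf(mrna):
    """Replace in-frame UAA/UAG codons; all other codons pass through."""
    codons = [mrna[i:i + 3] for i in range(0, len(mrna), 3)]
    return "".join(RECODE.get(c, c) for c in codons)

print(recode_orf("AUGUAAGGCUAG"))  # AUGCAAGGCCAG
```

Note that UGA, which Tetrahymena itself uses as a stop codon, is left untouched; only the reassigned codons need intervention before heterologous expression.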

Furthermore, Tt-eRF1 would not function when employed in combination with ribosomes (translation machinery) from other species:

"In spite of the overall conservative protein structure of Tt-eRF1 compared with mammalian and yeast eRF1s, the soluble recombinant Tt-eRF1 did not show any polypeptide release activity in vitro using rat or Artemia ribosomes." [5]

Thus, when using an organism with a variant code (Tetrahymena thermophila), researchers found that (1) they needed to modify (i.e., intelligently manipulate) the gene sequences so that they could be expressed by other organisms, and (2) a key component of the genetic code (namely, the release factor that terminates translation) would not function properly with the translation machinery of other organisms.

Experiments to change the identity of transfer RNA (tRNA)--another possible mechanism by which genetic codes might reassign codon "meanings"--have shown that the intermediate steps must be bridged by intelligent (directed) manipulation. In one such experiment, for instance, Margaret Saks, John Abelson, and colleagues at Caltech changed an E. coli arginine tRNA to specify a different amino acid, threonine. They accomplished this, however, only by supplying the bacterial cells (via a plasmid) with another copy of the wild-type threonine tRNA gene. This intelligently-directed intervention bridged the critical transition stage during which the arginine tRNA was being modified by mutations to specify threonine. [6]

Indeed, in reporting on an earlier experiment to modify tRNA, Abelson and colleagues noted that "if multiple changes are required to alter the specificity of a tRNA, they cannot be selected but they can be constructed" [7]--constructed, that is, by intelligent design.

We stress here that, in contrast to Miller's blithe dismissal of the difficulties raised for biotechnology by variant genetic codes, experts in the field caution that assuming a "universal" code may lead to serious problems. In a recent article on the topic entitled "Codon reassignment and the evolving genetic code: problems and pitfalls in post-genome analysis," Justin O'Sullivan and colleagues at the University of Kent observe:

"The emerging non-universal nature of the genetic code, coupled with the fact that few genetic codes have been experimentally confirmed, has several serious implications for the post-genome era. The production of biologically active recombinant molecules requires that careful consideration be given to both the expression system and the original host genome. The substitution of amino acids within a protein encoded by a nonstandard genetic code could alter the structure, function or antibody recognition of the final product." [8]

Thus, Miller's statements on biotechnology are highly misleading. Variant codes are not a minor matter easily overcome in experiments using different organisms.

We conclude by considering some of the deeper issues raised by Miller's press release. 

A little history and some basic logic

Not so very long ago, the universality of the genetic code was widely regarded as an important prediction (or confirmation) of the theory of common descent. Consider, for instance, an evolutionary biology textbook by the zoologist Mark Ridley, entitled The Problems of Evolution (Oxford University Press, 1985). In his first chapter, "Is Evolution True?" Ridley argues that common descent predicts a universal genetic code. His formulation of this argument mirrors dozens of similar arguments present in the biological literature from the mid-1960s to the mid-1980s:

"The outstanding example of a universal homology is the genetic code...The universality of the code is easy to understand if every species is descended from a common ancestor. Whatever code was used by the common ancestor would, through evolution, be retained. It would be retained because any change in it would be disastrous. A single change would cause all the proteins of the body, perfected over millions of years, to be built wrongly; no such body could live. It would be like trying to communicate, but having swapped letters around in words; if you change every 'a' for an 'x', for example, and tried talking to people, they would not make much sense of it. Thus we expect the genetic code to be universal if all species have descended from a common ancestor." [9]

Shortly after Ridley's argument was published in The Problems of Evolution, the evolutionary biologist Brian Charlesworth reviewed the book. He cautioned that Ridley was "less sound on the more modern aspects" of evolution, including the genetic code. Ridley's genetic code argument, Charlesworth worried,

"provides an opening for the creationists by asserting that the genetic code is universal, whereas it is now known that slight deviations from the standard code occur in mitochondria and in Mycoplasma." [10]

But how did Ridley create "an opening for the creationists," if the genetic code variants are as insignificant as Kenneth Miller suggests?

Here we should consider a basic feature of the logic of scientific prediction. If a theory, T, strongly predicts a particular outcome, O, but O is not observed, then one has grounds for doubting T. Of course, this logical schema greatly oversimplifies how scientists may actually behave when met with a failed prediction. One can shift or broaden the prediction--"T didn't really predict O, but actually O plus something else"--or one can throw doubt onto some theory other than T, and blame it, rather than T, for the failed prediction.

The problem is that both of these solutions weaken one's case for the theory T. Any theory that predicts an observational outcome and its negation is a theory without much empirical power. "It will rain today and it won't rain today" tells one everything and therefore nothing. If common descent predicts that the genetic code will be universal, except when it is not universal, then common descent does not actually specify any observations about the code.

One might also say that some other theory, linked conceptually to common descent, is responsible for the failed prediction of universality. In this move, the truth of common descent is preserved while another part of our biological knowledge pays the cost. Most biologists working on the evolution of the code have taken this route; Niles Lehman of SUNY-Albany, for instance, writes:

"Once thought universal, the specific relationships between amino acids and codons that are collectively known as the genetic code are now proving to be variable in many taxa. While this realization has been disappointing to some--the genetic code was often hailed as the ultimate evolutionary anchor in that its universality was perhaps the indisputable piece of evidence that all life shared a common ancestor at some point--it has also opened up a rich field of evolutionary analysis by forcing us to consider what sequence of molecular events in a cell could possibly allow for codon reassignment." [11]

Again, however, this move weakens the case for common descent. One preserves the truth of common descent only by cashing in one of the theory's predictions, namely, the universality of the code. "It seems we were wrong, after all, about the genetic code not being able to vary. So let's figure out how variant codes arise."

Well, how do variant codes arise? Kenneth Miller doesn't say, but that is not surprising. No one really knows, although not for lack of theories. Here we refer the curious reader to the superb review article by Knight, Freeland, and Landweber (2001), who list several different theories explaining codon change, none of which (they note) is unequivocally supported by the evidence.

Is it possible that the variant codes derived from a single common ancestor? Yes. 

It is also possible, of course, that they did not. Miller assumes that a single origin is the case, but there is a world of difference between assumptions and real knowledge.

These are matters for legitimate debate. What is not a matter for debate are the following facts:

The genetic code is not universal.

If the theory of common descent predicts a universal genetic code, then the theory predicts something that isn't so.

References

1. Robin D. Knight, Stephen J. Freeland, and Laura F. Landweber, "Rewiring the Keyboard: Evolvability of the Genetic Code," Nature Reviews Genetics 2 (2001):49-58; p. 49.

2. Catherine A. Lozupone, Robin D. Knight and Laura F. Landweber, "The molecular basis of nuclear genetic code change in ciliates," Current Biology 11 (2001):65-74; Patrick J. Keeling and W. Ford Doolittle, "Widespread and Ancient Distribution of a Noncanonical Genetic Code in Diplomonads," Molecular Biology and Evolution 14 (1997):895-901; A. Baroin-Tourancheau, N. Tsao, L.A. Klobutcher, R.E. Pearlman, and A. Adoutte, "Genetic code deviations in the ciliates: evidence for multiple and independent events," EMBO Journal 14 (1995):3262-3267.

3. Robin D. Knight, Stephen J. Freeland, and Laura F. Landweber, "Rewiring the Keyboard: Evolvability of the Genetic Code," Nature Reviews Genetics 2 (2001):49-58; p. 49. 

4. Andrew L. Karamyshev, Koichi Ito, and Yoshikazu Nakamura, "Polypeptide release factor eRF1 from Tetrahymena thermophila: cDNA cloning, purification and complex formation with yeast eRF3," FEBS Letters 457 (1999):483-488; p. 485.

5. Ibid., p. 487.

6. Margaret E. Saks, Jeffrey R. Sampson, and John Abelson, "Evolution of a Transfer RNA Gene Through a Point Mutation in the Anticodon," Science 279 (13 March 1998):1665-1670.

7. Jennifer Normanly, Richard C. Ogden, Suzanna J. Horvath & John Abelson, "Changing the identity of a transfer RNA," Nature 321 (15 May 1986):213-219.

8. Justin M. O'Sullivan, J. Bernard Davenport and Mick F. Tuite, "Codon reassignment and the evolving genetic code: problems and pitfalls in post-genome analysis," Trends in Genetics 17 (2001):20-22; p. 21. 

9. Mark Ridley, The Problems of Evolution (Oxford: Oxford University Press, 1985), pp. 10-11.

10. Brian Charlesworth, "Darwinism is alive and well," review of The Problems of Evolution, New Scientist 11 July 1985, p. 58. 


11. Niles Lehman, "Please release me, genetic code," Current Biology 11 (2001):R63-R66; p. R63.

Sunday, 30 April 2017

Psalms 8-14 American Standard Version.

8)1 O Jehovah, our Lord, How excellent is thy name in all the earth, Who hast set thy glory upon the heavens!

2Out of the mouth of babes and sucklings hast thou established strength, Because of thine adversaries, That thou mightest still the enemy and the avenger.

3When I consider thy heavens, the work of thy fingers, The moon and the stars, which thou hast ordained;

4What is man, that thou art mindful of him? And the son of man, that thou visitest him?

5For thou hast made him but little lower than God, And crownest him with glory and honor.

6Thou makest him to have dominion over the works of thy hands; Thou hast put all things under his feet:

7All sheep and oxen, Yea, and the beasts of the field,

8The birds of the heavens, and the fish of the sea, Whatsoever passeth through the paths of the seas.

9O Jehovah, our Lord, How excellent is thy name in all the earth!
American Standard Version, 1901
9)1I will give thanks unto Jehovah with my whole heart; I will show forth all thy marvellous works.

2I will be glad and exult in thee; I will sing praise to thy name, O thou Most High.

3When mine enemies turn back, They stumble and perish at thy presence.

4For thou hast maintained my right and my cause; Thou sittest in the throne judging righteously.

5Thou hast rebuked the nations, thou hast destroyed the wicked; Thou hast blotted out their name for ever and ever.

6The enemy are come to an end, they are desolate for ever; And the cities which thou hast overthrown, The very remembrance of them is perished.

7But Jehovah sitteth as king for ever: He hath prepared his throne for judgment;

8And he will judge the world in righteousness, He will minister judgment to the peoples in uprightness.

9Jehovah also will be a high tower for the oppressed, A high tower in times of trouble;

10And they that know thy name will put their trust in thee; For thou, Jehovah, hast not forsaken them that seek thee.

11Sing praises to Jehovah, who dwelleth in Zion: Declare among the people his doings.

12For he that maketh inquisition for blood remembereth them; He forgetteth not the cry of the poor.

13Have mercy upon me, O Jehovah; Behold my affliction which I suffer of them that hate me, Thou that liftest me up from the gates of death;

14That I may show forth all thy praise. In the gates of the daughter of Zion I will rejoice in thy salvation.

15The nations are sunk down in the pit that they made: In the net which they hid is their own foot taken.

16Jehovah hath made himself known, he hath executed judgment: The wicked is snared in the work of his own hands. Higgaion. Selah

17The wicked shall be turned back unto Sheol, Even all the nations that forget God.

18For the needy shall not alway be forgotten, Nor the expectation of the poor perish for ever.

19Arise, O Jehovah; let not man prevail: Let the nations be judged in thy sight.

20Put them in fear, O Jehovah: Let the nations know themselves to be but men. Selah

American Standard Version, 1901

10)1Why standest thou afar off, O Jehovah? Why hidest thou thyself in times of trouble?

2In the pride of the wicked the poor is hotly pursued; Let them be taken in the devices that they have conceived.

3For the wicked boasteth of his heart's desire, And the covetous renounceth, yea, contemneth Jehovah.

4The wicked, in the pride of his countenance, saith, He will not require it. All his thoughts are, There is no God.

5His ways are firm at all times; Thy judgments are far above out of his sight: As for all his adversaries, he puffeth at them.

6He saith in his heart, I shall not be moved; To all generations I shall not be in adversity.

7His mouth is full of cursing and deceit and oppression: Under his tongue is mischief and iniquity.

8He sitteth in the lurking-places of the villages; In the secret places doth he murder the innocent; His eyes are privily set against the helpless.

9He lurketh in secret as a lion in his covert; He lieth in wait to catch the poor: He doth catch the poor, when he draweth him in his net.

10He croucheth, he boweth down, And the helpless fall by his strong ones.

11He saith in his heart, God hath forgotten; He hideth his face; he will never see it.

12Arise, O Jehovah; O God, lift up thy hand: Forget not the poor.

13Wherefore doth the wicked contemn God, And say in his heart, Thou wilt not require it?

14Thou hast seen it; for thou beholdest mischief and spite, to requite it with thy hand: The helpless committeth himself unto thee; Thou hast been the helper of the fatherless.

15Break thou the arm of the wicked; And as for the evil man, seek out his wickedness till thou find none.

16Jehovah is King for ever and ever: The nations are perished out of his land.

17Jehovah, thou hast heard the desire of the meek: Thou wilt prepare their heart, thou wilt cause thine ear to hear;

18To judge the fatherless and the oppressed, That man who is of the earth may be terrible no more.

American Standard Version, 1901
11)1In Jehovah do I take refuge: How say ye to my soul, Flee as a bird to your mountain;

2For, lo, the wicked bend the bow, They make ready their arrow upon the string, That they may shoot in darkness at the upright in heart;

3If the foundations be destroyed, What can the righteous do?

4Jehovah is in his holy temple; Jehovah, his throne is in heaven; His eyes behold, his eyelids try, the children of men.

5Jehovah trieth the righteous; But the wicked and him that loveth violence his soul hateth.

6Upon the wicked he will rain snares; Fire and brimstone and burning wind shall be the portion of their cup.

7For Jehovah is righteous; he loveth righteousness: The upright shall behold his face.

American Standard Version, 1901
12)1Help, Jehovah; for the godly man ceaseth; For the faithful fail from among the children of men.

2They speak falsehood every one with his neighbor: With flattering lip, and with a double heart, do they speak.

3Jehovah will cut off all flattering lips, The tongue that speaketh great things;

4Who have said, With our tongue will we prevail; Our lips are our own: who is lord over us?

5Because of the oppression of the poor, because of the sighing of the needy, Now will I arise, saith Jehovah; I will set him in the safety he panteth for.

6The words of Jehovah are pure words; As silver tried in a furnace on the earth, Purified seven times.

7Thou wilt keep them, O Jehovah, Thou wilt preserve them from this generation for ever.

8The wicked walk on every side, When vileness is exalted among the sons of men.

American Standard Version, 1901
13)1How long, O Jehovah? wilt thou forget me for ever? How long wilt thou hide thy face from me?

2How long shall I take counsel in my soul, Having sorrow in my heart all the day? How long shall mine enemy be exalted over me?

3Consider and answer me, O Jehovah my God: Lighten mine eyes, lest I sleep the sleep of death;

4Lest mine enemy say, I have prevailed against him; Lest mine adversaries rejoice when I am moved.

5But I have trusted in thy lovingkindness; My heart shall rejoice in thy salvation.

6I will sing unto Jehovah, Because he hath dealt bountifully with me.

American Standard Version, 1901
14)1The fool hath said in his heart, There is no God. They are corrupt, they have done abominable works; There is none that doeth good.

2Jehovah looked down from heaven upon the children of men, To see if there were any that did understand, That did seek after God.

3They are all gone aside; they are together become filthy; There is none that doeth good, no, not one.

4Have all the workers of iniquity no knowledge, Who eat up my people as they eat bread, And call not upon Jehovah?

5There were they in great fear; For God is in the generation of the righteous.

6Ye put to shame the counsel of the poor, Because Jehovah is his refuge.

7Oh that the salvation of Israel were come out of Zion! When Jehovah bringeth back the captivity of his people, Then shall Jacob rejoice, and Israel shall be glad.
American Standard Version, 1901



Where's Occam's razor when you need it?

As if the Multiverse Wasn't Bizarre Enough ... Meet Many Worlds
Denyse O'Leary December 16, 2013 5:28 AM


In 1957 physicist Hugh Everett suggested the "Many Worlds" hypothesis as a proposed interpretation of quantum mechanics. He suggested that the universe constantly splits into different futures each time a subatomic particle goes one way as opposed to the other. Everett thus promptly exited theoretical physics.

Now, however, some hope that combining Everett's "many worlds" theory with the multiverse will strengthen current cosmology. New Scientist's Justin Mullins explains:
Two of the strangest ideas in modern physics -- that the cosmos constantly splits into parallel universes in which every conceivable outcome of every event happens, and the notion that our universe is part of a larger multiverse -- have been unified into a single theory.
In other words, not only is there an infinite number of universes, but they come into existence every time you turn right instead of left.
Today, such ideas come thicker and faster. We are told, "Our cosmos was 'bruised' in collisions with other, never-observed universes." Or we are living in a giant hologram.* Alternatively, a University of Washington team enterprisingly suggests that maybe the universe is fine-tuned because we are living in a computer simulation, one constructed by super-intelligent descendants who have gone back in time. Science writer Ray Villard offers:

Before you dismiss this idea as completely loony, the reality of such a Sim Universe might solve a lot of eerie mysteries about the cosmos. About two-dozen of the universe's fundamental constants happen to fall within the narrow range thought to be compatible with life. At first glance it seems as unlikely as balancing a pencil on its tip. Jiggle these parameters and life as we know it would have never appeared. Not even stars and galaxies. This is called the Anthropic principle ... We are also living at a very special time in the universe's history where it switched gears from decelerating to accelerating under the push of dark energy. This begs the question "why me why now?"
Before anyone attempts to offer an alternative to the Sim, he advises:
Biblical creationists can no doubt embrace these seeming cosmic coincidences as unequivocal evidence for their "theory" of Intelligent Design (ID). But is our "God" really a computer programmer rather than a bearded old man living in the sky?
Villard implies that only Biblical creationists think that fine-tuning points to a First Cause. That's a fiction he needs; his proposed alternative is even more bizarre than his caricatured mash-up of creationists and advocates of intelligent design.
Never mind, Lee Smolin and colleagues aim to take relativity to a whole new level, with space-time in their sights: "They say we need to forget about the home Einstein invented for us: we live instead in a place called phase space." Which, we are told, is a "curious eight-dimensional world that merges our familiar four dimensions of space and time and a four-dimensional world called momentum space." Smolin, incidentally, does not think that there is a scientific method, just scientific ethics, and that laws of nature can evolve over time, in Darwinian fashion. Physics now bows to Darwinian theory, where once it was the reverse.

The eternal cyclic universe is also back. Perimeter Institute cosmologist Neil Turok informs us:

I'm exploring the idea that the singularity was not the beginning of time. In this new view, time didn't have a beginning, and the Big Bang resulted from a collision of branes, sheetlike spaces that exist within a higher-dimensional reality. These collisions might happen repeatedly, creating an eternal, cyclic universe.
Indeed, we are told, time flows backward. Thinking it travels exclusively forward "may be not just an illusion but a lie ... " Discover's Zeeya Merali says. Max Tegmark suggests, "Perhaps we will gradually get used to the weird ways of our cosmos and find its strangeness to be part of its charm."
"Time need not end in the multiverse," burbles science writer Amanda Gefter, as if time or anything else would mean anything in a multiverse. Others suggest, maybe the universes are a wave function. And our own universe may exist inside a black hole. Not to worry, our "original" universe will eventually be populated by "a near-infinite number of advanced, virtual civilizations" featuring "autonomous, conscious beings."

And there is the usual, indeed endless, moralizing: Tegmark proclaims, "We humans have a well-documented tendency toward hubris, arrogantly imagining ourselves at center stage, with everything revolving around us." And cosmologist Raphael Bousso at the University of California, Berkeley, accuses the science community of the sin of "lying to ourselves," by refusing to assume that the "many worlds" theory may be true. Lack of clear evidence has, apparently, nothing to do with it.

By the way, there is life after death: Some clever beings might survive our universe's predicted demise, provided they develop suitable technologies. Apocalypses are also on offer: Stephen Hawking doubts humans will survive another thousand years without escaping Earth.

And God is back too, but not like you remember him. Agnostic physicist Paul Davies explains:

Far from doing away with a transcendent Creator, the multiverse theory actually injects that very concept at almost every level of its logical structure. Gods and worlds, creators and creatures, lie embedded in each other, forming an infinite regress in unbounded space.
So God turns out to be just one more note of cacophony in the transcendent goofiness.
Remember, all this got started just to explain away fine-tuning.

We are told that we are "on the brink of understanding everything," when our cosmology guarantees that we can understand nothing and there is nothing to understand anyway. Everything, you see, is true -- for fifteen seconds.

* Some say the hologram universe originated in an argument Stephen Hawking had with other physicists.