
Saturday, 19 January 2019

Toward a testable design filter?

Measuring Surprise — A Frontier of Design Theory
Evolution News @DiscoveryCSC


The sunlight shines bright on the cold winter’s morning as you begin your trek towards the retreat. Snow covers the ground and steam from your breath rises ahead of you. Accompanying you is Bertrand, your Russell terrier, who runs ahead of you jumping in the snow. Chasing a bird, he climbs over a hill as you call after him, but he is too focused on the pursuit to heed you. 

Clumsily chasing after him you come upon a strange-looking stone protruding from one of the rock faces. Its odd shape catches your eye, as does its relatively smooth surface. There appear to be runes carved into its surface, though you aren’t sure, since you don’t recognize the symbols or know of any literate ancient cultures from the area. 

You decide to leave the stone as you found it, but mark its location and pull a notepad from your backpack to sketch the stone with its symbols. Bertrand, tired from his chase, joins you and begins digging nearby, where he unearths what appears to be a piece of aged metal, again with symbols you do not recognize. The symbols differ from those carved in the rock, are more refined, and almost appear to be numeric. 

Gently moving more earth, you discover a second piece of twisted metal, and you add drawings of these pieces to your sketchbook, resisting the urge to take the pieces with you. After sketching, you continue your trek towards your retreat. On arriving, you contact the local university about your discovery, helping them to locate the artifacts on the following day.

You’ve come to the retreat to study. You’ve brought several books from your office, along with a manuscript on the subject of complex specified information. As you read the manuscript, you begin applying the ideas to your discovery in the hills. What could have created the carvings? 

The carvings look sustained (there are many of them) and deliberate, unlike creases created by splitting and pitting of surfaces over ages. You’re no geologist, but you are also no stranger to rock surfaces, possessing a mature mental model of the types of patterns that can be expected to appear on stone faces. The patterns are geometric but irregular, complex and without any apparent repetition, unlike other geological anomalies such as the Giant’s Causeway of Ireland. 

The runes were most likely carvings, made by people in some unknown past. Could you compute some estimates of how likely a series of runes like this (or in any other symbol system) would be to appear through a process of weathering? That seems like a challenging task, but the metal pieces present perhaps a less formidable challenge, since you are almost certain they represent numbers. 

You set out to discover whether you can quantify your intuition that the carvings are special, using the tool of specified complexity.

Unlikely Yet Structurally Organized

What is specified complexity? Almost a decade before the discovery of the structure of the DNA molecule, physicist Erwin Schrödinger predicted that hereditary material must be stored in what he called an aperiodic crystal, stable yet without predictable repetition, since predictable repetition would greatly reduce its information-carrying capacity (Schrödinger 1944). 

Starting from first principles, he reasoned that life would need an informational molecule that could take on a large number of possible states without strong bias towards any one particular state (thus making individual states improbable), yet needed structural stability to counteract the forces of Brownian motion within cells (thus making the molecule match a functional specification of being structurally organized). 

This combination of unlikely objects that simultaneously match a functional specification later came to be known as specified complexity (Dembski 1998; Dembski 2001; Dembski 2002; Dembski 2005; Ewert, Dembski, and Marks II 2012). Specified complexity has been proposed as a signal of design (Dembski 1998; Dembski 2001; Dembski 2002). An object exhibiting specified complexity is unlikely to have been produced by the probabilistic process under which it is being measured and it is also specified, matching some independently given pattern called a specification. More precisely, the degree to which an object meets some independently defined criterion in a way that not many objects do is the degree to which the object can be said to be specified. 

Because complex objects typically contain many parts, each of which makes the overall probability of the object being encountered less likely, the improbability aspect has historically been referred to as the complexity of the object (though, improbability would perhaps be more fitting). Therefore, specified complex objects are those that are both unlikely and functionally specified, often having to meet minimum thresholds in both categories.

Quantifying Surprise

Specified complexity allows us to measure how surprising random outcomes are, in reference to some probabilistic model. But there are other ways of measuring surprise. In Shannon’s celebrated information theory (Shannon 1948), improbability alone can be used to measure the surprise of observing a particular random outcome, using the quantity of surprisal, which is simply the negative logarithm (base 2) of the probability of observing the outcome, namely,

-log2 p(x)

where x is the observed outcome and p(x) is the probability of observing it under some distribution p. Unlikely outcomes generate large surprisal values, since they are in some sense unexpected.
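To make the definition concrete, here is a minimal sketch (ours, not from the article) of the surprisal calculation in Python:

```python
import math

def surprisal(p: float) -> float:
    """Surprisal in bits: the negative base-2 logarithm of an
    outcome's probability under some distribution."""
    return -math.log2(p)

# A fair coin flip (p = 1/2) carries 1 bit of surprisal;
# a specific face of a fair die (p = 1/6) carries more,
# since it is less likely.
print(surprisal(0.5))    # 1.0
print(surprisal(1 / 6))  # ~2.585
```

As the outcome's probability shrinks, its surprisal grows without bound, matching the intuition that rarer observations are more unexpected.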

But let us consider a case where all events in a set of possible outcomes are equally very unlikely. (This can happen when you have an extremely large number of equally likely outcomes, so that each of them individually has a small chance of occurring.) 

Under these conditions, asking “what is the probability that an unlikely event occurs?” yields the somewhat paradoxical answer that it is guaranteed to occur! Some outcome must occur, and since each of them is unlikely, an unlikely event (with large surprisal) is guaranteed to occur. Therefore, surprisal alone cannot tell us how likely we are to witness an outcome that surprises us.

As a concrete example, consider any sequence of one hundred coin flips generated by flipping a fair coin. Every sequence has an equal probability of occurring, giving the same surprisal for each possible sequence. Therefore a sequence of all heads has the exact same surprisal as a haphazard mixture of one hundred heads and tails, even though the former is surely more surprising than the latter under a fair coin model.
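A quick sketch (ours, not the article's) makes the point concrete: under a fair-coin model, every 100-flip sequence receives exactly the same surprisal.

```python
import math
import random

def sequence_surprisal(seq: str) -> float:
    """Surprisal in bits of a specific heads/tails sequence
    under a fair-coin model: each flip has probability 1/2."""
    p = 0.5 ** len(seq)
    return -math.log2(p)

all_heads = "H" * 100
mixed = "".join(random.choice("HT") for _ in range(100))

# Both sequences score exactly 100 bits of surprisal,
# even though only the first strikes us as special.
print(sequence_surprisal(all_heads))  # 100.0
print(sequence_surprisal(mixed))      # 100.0
```

Surprisal alone cannot separate the two; that is the gap the specification term is meant to fill.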

We need another way to capture what it means for an outcome to be special and surprising, one that would allow us to say a sequence of all heads generated by a fair coin is surprising, but a randomly mixed sequence of heads and tails is not. Specified complexity provides a mathematical means of doing so, by combining a surprisal term with a specification term, allowing us to precisely determine how surprising it is to witness an outcome of one hundred heads in a row assuming a fair coin.

Diving into Specified Complexity

How does specified complexity allow us to do this? A recently published paper in BIO-Complexity, “A Unified Model of Complex Specified Information” by machine learning researcher George D. Montañez, offers some insight. For a reader-friendly summary see, “BIO-Complexity Article Offers an Objective Method for Weighing Darwinian Explanations.”

The paper, which is mathematical in nature, ties together several existing models of specified complexity and introduces a canonical form for which objects exhibiting large specified complexity values are unlikely (surprising!) under any given distribution. Montañez builds on much previous work, fleshing out the equivalence between specified complexity testing and p-value hypothesis testing introduced by A. Milosavljević (Milosavljević 1993; Milosavljević 1995) and later William Dembski (Dembski 2005), and giving bounds on the probability of encountering large specified complexity values for existing specified complexity models. 

The paper defines new canonical specified complexity model variants, and gives a recipe for creating specified complexity models using specification functions of your choice. It lays out a framework for reasoning quantitatively about what it means for a probabilistic outcome to be genuinely surprising, and explores what implications this has for technology and for explanations of observed outcomes.
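As an illustration only, the canonical form can be sketched in code. This is our reading of the recipe, not the paper's own implementation: the names kappa, nu, and r follow common specified-complexity notation (kappa(x) = -log2(r · p(x) / nu(x)), with nu a specification function and r a normalizing constant), and the details should be checked against the paper itself.

```python
import math

def kappa(p_x: float, nu_x: float, r: float) -> float:
    """Sketch of a canonical-form specified complexity value in bits:
    kappa(x) = -log2(r * p(x) / nu(x)).
    Large values require x to be both improbable (small p_x) and
    strongly specified (nu_x large relative to r)."""
    return -math.log2(r * p_x / nu_x)

# Toy model: 100 fair coin flips, so every sequence has p(x) = 2**-100.
p_x = 2.0 ** -100

# Suppose a specification function nu, with total mass r = 1.0, that
# concentrates nearly all of that mass on the all-heads sequence.
print(kappa(p_x, nu_x=1.0, r=1.0))          # 100.0 bits: genuinely surprising
print(kappa(p_x, nu_x=2.0 ** -100, r=1.0))  # effectively 0 bits: unremarkable
```

Under this form, an unspecified sequence earns no specified complexity no matter how improbable it is, which is exactly the behavior the coin-flip example demands.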

We’ll have more to say about this important paper, which represents a frontier for the theory of intelligent design. Stay tuned.

Bibliography

Dembski, William A. 1998. The Design Inference: Eliminating Chance Through Small Probabilities. Cambridge: Cambridge University Press. https://doi.org/10.1017/CBO9780511570643.
———. 2001. “Detecting Design by Eliminating Chance: A Response to Robin Collins.” Christian Scholar’s Review 30 (3): 343–58.
———. 2002. No Free Lunch: Why Specified Complexity Cannot Be Purchased Without Intelligence. Lanham: Rowman & Littlefield.
———. 2005. “Specification: The Pattern That Signifies Intelligence.” Philosophia Christi 7 (2): 299–343. https://doi.org/10.5840/pc20057230.
Ewert, Winston, William A. Dembski, and Robert J. Marks II. 2012. “Algorithmic Specified Complexity.” Engineering and Metaphysics. https://doi.org/10.33014/isbn.0975283863.7.
Milosavljević, Aleksandar. 1993. “Discovering Sequence Similarity by the Algorithmic Significance Method.” In ISMB, 284–91.
———. 1995. “Discovering Dependencies via Algorithmic Mutual Information: A Case Study in DNA Sequence Comparisons.” Machine Learning 21 (1-2): 35–50.
Schrödinger, Erwin. 1944. What Is Life? The Physical Aspect of the Living Cell and Mind. Cambridge: Cambridge University Press.
Shannon, Claude Elwood. 1948. “A Mathematical Theory of Communication.” Bell System Technical Journal 27 (3): 379–423.
Photo credit: A stone carved with ancient runes, by Lindy Buckley, via Flickr (cropped).

The Iron Lady saved Britain? Pros and cons.

Darwinian apologists are pounding the table again.

Fact-Check: Louisiana's Science Education Act Does NOT Authorize Teaching Creationism
Sarah Chaffee 

In an article at Vox, a website that offers to "explain the news" for readers, Sean Illing shares an interview with science educator Amanda Glaze. Unfortunately, in "Teaching evolution in the South: an educator on the 'war for science literacy,'" he repeats the mistake of many media sources, mischaracterizing an academic freedom law as authorizing instructors to teach creationism.

I lived and taught in Louisiana until recently, and there you had a well-educated Republican governor [Bobby Jindal] who was backing a law that allowed creationism to be taught in public school science classes. And he had the overwhelming support of the state legislature.

This is incorrect. Permit me to explain to Sean Illing. The law that he refers to, the Louisiana Science Education Act (LSEA), does not authorize the teaching of creationism. Rather, it permits teachers to present the scientific evidence both for and against neo-Darwinism. (Illing also contests whether there is indeed a scientific debate -- more about the evidence and controversy here.)

The text of the law includes the following statement:

This Section shall not be construed to promote any religious doctrine, promote discrimination for or against a particular set of religious beliefs, or promote discrimination for or against religion or nonreligion.

As I mentioned in a previous article on the LSEA:

Let's be clear: If a teacher presents creationism and is sued, the LSEA will offer that teacher no protection.... In any event, teaching creationism in public schools is unconstitutional according to the Supreme Court (Edwards v. Aguillard, 482 U.S. 578).

Louisiana's academic freedom law serves the purpose of giving teachers who would like to present both sides of the scientific controversy over evolution the freedom to do so without fear of retaliation. But media accounts often fail to portray this clearly.

Romancing the theory?


Beauty ≠ truth

Scientists prize elegant theories, but a taste for simplicity is a treacherous guide. And it doesn’t even look good
Albert Einstein's theory of general relativity is a century old next year and, as far as the test of time is concerned, it seems to have done rather well. For many, indeed, it doesn’t merely hold up: it is the archetype for what a scientific theory should look like. Einstein’s achievement was to explain gravity as a geometric phenomenon: a force that results from the distortion of space-time by matter and energy, compelling objects – and light itself – to move along particular paths, very much as rivers are constrained by the topography of their landscape. General relativity departs from classical Newtonian mechanics and from ordinary intuition alike, but its predictions have been verified countless times. In short, it is the business.
Einstein himself seemed rather indifferent to the experimental tests, however. The first came in 1919, when the British physicist Arthur Eddington observed the Sun’s gravity bending starlight during a solar eclipse. What if those results hadn’t agreed with the theory? (Some accuse Eddington of cherry-picking the figures anyway, but that’s another story.) ‘Then,’ said Einstein, ‘I would have been sorry for the dear Lord, for the theory is correct.’
That was Einstein all over. As the Danish physicist Niels Bohr commented at the time, he was a little too fond of telling God what to do. But this wasn’t sheer arrogance, nor parental pride in his theory. The reason Einstein felt general relativity must be right is that it was too beautiful a theory to be wrong.
This sort of talk both delights today’s physicists and makes them a little nervous. After all, isn’t experiment – nature itself – supposed to determine truth in science? What does beauty have to do with it? ‘Aesthetic judgments do not arbitrate scientific discourse,’ the string theorist Brian Greene reassures his readers in The Elegant Universe (1999), the most prominent work of physics exposition in recent years. ‘Ultimately, theories are judged by how they fare when faced with cold, hard, experimental facts.’ Einstein, Greene insists, didn’t mean to imply otherwise – he was just saying that beauty in a theory is a good guide, an indication that you are on the right track.
Einstein isn’t around to argue, of course, but I think he would have done. It was Einstein, after all, who said that ‘the only physical theories that we are willing to accept are the beautiful ones’. And if he was simply defending theory against too hasty a deference to experiment, there would be plenty of reason to side with him – for who is to say that, in case of a discrepancy, it must be the theory and not the measurement that is in error? But that’s not really his point. Einstein seems to be asserting that beauty trumps experience come what may.
He wasn’t alone. Here’s the great German mathematician Hermann Weyl, who fled Nazi Germany to become a colleague of Einstein’s at the Institute for Advanced Study in Princeton: ‘My work always tries to unite the true with the beautiful; but when I had to choose one or the other, I usually chose the beautiful.’ So much for John Keats’s ‘Beauty is truth, truth beauty.’ And so much, you might be tempted to conclude, for scientists’ devotion to truth: here were some of its greatest luminaries, pledging obedience to a different calling altogether.
Was this kind of talk perhaps just the spirit of the age, a product of fin de siècle romanticism? It would be nice to think so. In fact, the discourse about aesthetics in scientific ideas has never gone away. Even Lev Landau and Evgeny Lifshitz, in their seminal but pitilessly austere midcentury Course of Theoretical Physics, were prepared to call general relativity ‘probably the most beautiful of all existing theories’. Today, popularisers such as Greene are keen to make beauty a selling point of physics. Writing in this magazine last year, the quantum theorist Adrian Kent speculated that the very ugliness of certain modifications of quantum mechanics might count against their credibility. After all, he wrote, here was a field in which ‘elegance seems to be a surprisingly strong indicator of physical relevance’.
We have to ask: what is this beauty they keep talking about?
Some scientists are a little coy about that. The Nobel Prize-winning physicist Paul Dirac agreed with Einstein, saying in 1963 that ‘it is more important to have beauty in one’s equations than to have them fit experiment’ (how might Greene explain that away?). Yet faced with the question of what this all-important beauty is, Dirac threw up his hands. Mathematical beauty, he said, ‘cannot be defined any more than beauty in art can be defined’ – though he added that it was something ‘people who study mathematics usually have no difficulty in appreciating’. That sounds rather close to the ‘good taste’ of his contemporaneous art critics; we might fear that it amounts to the same mixture of prejudice and paternalism.


Given this history of evasion, it was refreshing last November to hear the theoretical physicist Nima Arkani-Hamed spell out what ‘beauty’ really means for him and his colleagues. He was talking to the novelist Ian McEwan at the Science Museum in London, during the opening of the museum’s exhibition on the Large Hadron Collider. ‘Ideas that we find beautiful,’ Arkani-Hamed explained, ‘are not a capricious aesthetic judgment’:
It’s not fashion, it’s not sociology. It’s not something that you might find beautiful today but won’t find beautiful 10 years from now. The things that we find beautiful today we suspect would be beautiful for all eternity. And the reason is, what we mean by beauty is really a shorthand for something else. The laws that we find describe nature somehow have a sense of inevitability about them. There are very few principles and there’s no possible other way they could work once you understand them deeply enough. So that’s what we mean when we say ideas are beautiful.
Does this bear any relation to what beauty means in the arts? Arkani-Hamed had a shot at that. Take Ludwig van Beethoven, he said, who strove to develop his Fifth Symphony in ‘perfect accordance to its internal logical structure’.
Beethoven is indeed renowned for the way he tried out endless variations and directions in his music, turning his manuscripts into inky thickets in his search for the ‘right’ path. Novelists and poets, too, can be obsessive in their pursuit of the mot juste. Reading the novels of Patrick White or the late works of Penelope Fitzgerald, you get the same feeling of almost logical necessity, word by perfect word.
But you notice this quality precisely because it is so rare. What generally brings a work of art alive is not its inevitability so much as the decisions that the artist made. We gasp not because the words, the notes, the brushstrokes are ‘right’, but because they are revelatory: they show us not a deterministic process but a sensitive mind making surprising and delightful choices. In fact, pure mathematicians often say that it is precisely this quality that delights them in a great proof: not that it is correct but that it shows a personal, tangibly human genius taking steps in a direction we’d never have guessed.
‘The things that we find beautiful today we suspect would be beautiful for all eternity’: here is where Arkani-Hamed really scuppers the notion that the kind of beauty sought by science has anything to do with the major currents of artistic culture. After all, if there’s one thing you can say about beauty, it is that the beholder has a lot to do with it. We can still find beauty in the Paleolithic paintings at Lascaux and the music of William Byrd, while admitting that a heck of a lot of beauty really is fashion and sociology. Why shouldn’t it be? How couldn’t it be? We still swoon at Jan van Eyck. Would van Eyck’s audience swoon at Mark Rothko?
The gravest offenders in this attempted redefinition of beauty are, of course, the physicists. This is partly because their field has always been heir to Platonism – the mystical conviction of an orderly cosmos. Such a belief is almost a precondition for doing physics in the first place: what’s the point in looking for rules unless you believe they exist? The MIT physicist Max Tegmark now goes so far as to say that mathematics constitutes the basic fabric of reality, a claim redolent of Plato’s most extreme assertions in Timaeus.


But Platonism will not connect you with the mainstream of aesthetic thought – not least because Plato himself was so distrustful of art (he banned the lying poets from his Republic, after all). Better that we turn to Immanuel Kant. Kant expended considerable energies in his Critique of Judgment (1790) trying to disentangle the aesthetic aspects of beauty from the satisfaction one feels in grasping an idea or recognising a form, and it does us little good to jumble them up again. All that conceptual understanding gives us, he concluded, is ‘the solution that satisfies the problem… not a free and indeterminately final entertainment of the mental powers with what is called beautiful’. Beauty, in other words, is not a resolution: it opens the imagination.
Physicists might be the furthest gone along Plato’s trail, but they are not alone. Consider the many chemists whose idea of beauty seems to be dictated primarily by the molecules they find pleasing – usually because of some inherent mathematical symmetry, such as in the football-shaped carbon molecule buckminsterfullerene (strictly speaking, a truncated icosahedron). Of course, this is just another instance of mathematics-worship, yoking beauty to qualities of regularity that were not deemed artistically beautiful even in antiquity. Brian Greene claims: ‘In physics, as in art, symmetry is a key part of aesthetics.’ Yet for Plato it was precisely art’s lack of symmetry (and thus intelligibility) that denied it access to real beauty. Art was just too messy to be beautiful.
In seeing matters the other way around, Kant speaks for the mainstream of artistic aesthetics: ‘All stiff regularity (such as approximates to mathematical regularity) has something in it repugnant to taste.’ We weary of it, as we do a nursery rhyme. Or as the art historian Ernst Gombrich put it in 1988, too much symmetry ensures that ‘once we have grasped the principle of order… it holds no more surprise’. Artistic beauty, Gombrich believed, relies on a tension between symmetry and asymmetry: ‘a struggle between two opponents of equal power, the formless chaos, on which we impose our ideas, and the all-too-formed monotony, which we brighten up by new accents’. Even Francis Bacon (the 17th-century proto-scientist, not the 20th-century artist) understood this much: ‘There is no excellent beauty that hath not some strangeness in the proportion.’
Perhaps I have been a little harsh on the chemists – those cube- and prism-shaped molecules are fun in their own way. But Bacon, Kant and Gombrich are surely right to question their aesthetic merit. As the philosopher of chemistry Joachim Schummer pointed out in 2003, it is simply parochial to redefine beauty as symmetry: doing so cuts one off from the dominant tradition in artistic theory. There’s a reason why our galleries are not, on the whole, filled with paintings of perfect spheres.
Why shouldn’t scientists be allowed their own definition of beauty? Perhaps they should. Yet isn’t there a narrowness to the standard that they have chosen? Even that might not be so bad, if their cult of ‘beauty’ didn’t seem to undermine the credibility of what they otherwise so strenuously assert: the sanctity of evidence. It doesn’t matter who you are, they say, how famous or erudite or well-published: if your theory doesn’t match up to nature, it’s history. But if that’s the name of the game, why on earth should some vague notion of beauty be brought into play as an additional arbiter?


Because of experience, they might reply: true theories are beautiful. Well, general relativity might have turned out OK, but plenty of others have not. Take the four-colour theorem: the proposal that it is possible to colour any arbitrary patchwork in just four colours without any patches of the same colour touching one another. In 1879 it seemed as though the British mathematician Alfred Kempe had found a proof – and it was widely accepted for a decade, because it was thought beautiful. It was wrong. The current proof is ugly as heck – it relies on a brute-force exhaustive computer search, which some mathematicians refuse to accept as a valid form of demonstration – but it might turn out to be all there is. The same goes for Andrew Wiles’s proof of Fermat’s Last Theorem, first announced in 1993. The basic theorem is wonderfully simple and elegant, the proof anything but: 100 pages long and more complex than the Pompidou Centre. There’s no sign of anything simpler.
It’s not hard to mine science history for theories and proofs that were beautiful and wrong, or complicated and right. No one has ever shown a correlation between beauty and ‘truth’. But it is worse than that, for sometimes ‘beauty’ in the sense that many scientists prefer – an elegant simplicity, to put it in crude terms – can act as a fake trump card that deflects inquiry. In one little corner of science that I can claim to know reasonably well, an explanation from 1959 for why water-repelling particles attract when immersed in water (that it’s an effect of entropy, there being more disordered water molecules when the particles stick together) was so neat and satisfying that it continues to be peddled today, even though the experimental data show that it is untenable and that the real explanation probably lies in a lot of devilish detail.
Might it even be that the marvellous simplicity and power of natural selection strikes some biologists as so beautiful an idea – an island of order in a field otherwise beset with caveats and contradictions – that it must be defended at any cost? Why else would attempts to expose its limitations, exceptions and compromises still ignite disputes pursued with near-religious fervour?
The idea that simplicity, as distinct from beauty, is a guide to truth – the idea, in other words, that Occam’s Razor is a useful tool – seems like something of a shibboleth in itself. As these examples show, it is not reliably correct. Perhaps it is a logical assumption, all else being equal. But it is rare in science that all else is equal. More often, some experiments support one theory and others another, with no yardstick of parsimony to act as referee.
We can be sure, however, that simplicity is not the ultimate desideratum of aesthetic merit. Indeed, in music and visual art, there appears to be an optimal level of complexity below which preference declines. A graph of enjoyment versus complexity has the shape of an inverted U: there is a general preference for, say, ‘Eleanor Rigby’ over both ‘Baa Baa Black Sheep’ and Pierre Boulez’s Structures Ia, just as there is for lush landscapes over monochromes. For most of us, our tastes eschew the extremes.
Ironically, the quest for a ‘final theory’ of nature’s deepest physical laws has meant that the inevitability and simplicity that Arkani-Hamed prizes so highly now look more remote than ever. For we are now forced to contemplate no fewer than 10^500 permissible variants of string theory. It’s always possible that 10^500 minus one of them might vanish at a stroke, thanks to the insight of some future genius. Right now, though, the dream of elegant fundamental laws lies in bewildering disarray.


An insistence that the ‘beautiful’ must be true all too easily elides into an empty circularity: what is true must therefore be beautiful. I see this in the conviction of many chemists that the periodic table, with all its backtracking sequences of electron shells, its positional ambiguities for elements such as hydrogen and unsightly bulges that the flat page can’t constrain, is a thing of loveliness. There, surely, speaks the voice of duty, not genuine feeling. The search for an ideal, perfect Platonic form of the table amid spirals, hypercubes and pyramids has an air of desperation.
Despite all this, I don’t want scientists to abandon their talk of beauty. Anything that inspires scientific thinking is valuable, and if a quest for beauty – a notion of beauty peculiar to science, removed from art – does that, then bring it on. And if it gives them a language in which to converse with artists, rather than standing on soapboxes and trading magisterial insults like C P Snow and F R Leavis, all the better. I just wish they could be a bit more upfront about the fact that they are (as is their wont) torturing a poor, fuzzy, everyday word to make it fit their own requirements. I would be rather thrilled if the artist, rather than accepting this unified pursuit of beauty (as Ian McEwan did), were to say instead: ‘No, we’re not even on the same page. This beauty of yours means nothing to me.’
If, on the other hand, we want beauty in science to make contact with aesthetics in art, I believe we should seek it precisely in the human aspect: in ingenious experimental design, elegance of theoretical logic, gentle clarity of exposition, imaginative leaps of reasoning. These things are not vital for a theory that works, an experiment that succeeds, an explanation that enchants and enlightens. But they are rather lovely. Beauty, unlike truth or nature, is something we make ourselves.


Philip Ball will be appearing in London on July 7 to talk about this article. This event is organised by The Browser in association with Aeon and Prospect Magazine.

Saturday, 12 January 2019

Neodarwinism's star witness defects.

Genetics and Epigenetics — New Problems for Darwinism
Evolution News @DiscoveryCSC


New findings in genetics and epigenetics are creating new problems for evolution. The simplistic version of neo-Darwinism expects all variation to come from genetic mutations, which nature selects for fitness. Non-coding DNA was relegated to the junk pile — trash left over from natural selection, which favors DNA that codes for proteins. In a notion called subfunctionalization, copies of genes might be free to mutate and become new proteins, or decay into “pseudogenes,” one type of junk DNA. As usual, simplistic theories are often wrong. 

How Many Genes?

The Human Genome Project ended with a surprisingly low number of genes. But what if they missed some? Researchers at Yale have been finding genes that were misidentified as non-protein-coding due to the methods previous researchers used to annotate them. One of the newly identified genes, they say, plays a key role in the immune system. Are there others? 

The findings suggest many more protein-coding genes and functions may be discovered. “A large portion of important protein-coding genes have been missed by virtue of their annotation,” said first author Ruaidhri Jackson. Without vetting and identifying these genes, “we can’t fully understand the protein-coding genome or adequately screen genes for health and disease purposes.” 

The first sentence of their paper in Nature says, “The annotation of the mammalian protein-coding genome is incomplete.” They have identified a “large number of RNAs that were previously annotated as ‘non-protein coding,’” some of which are “potentially important transcripts” able to make protein. Restrictive methods in the past “may obscure the essential role of a multitude of previously undiscovered protein-coding genes.”

Epigenetics in Archaea

Do epigenetic inheritance and regulation work only in eukaryotes? No. Scientists at the University of Nebraska-Lincoln discovered that members of the “simple” kingdom of Archaea have it too. They watched microbes in Yellowstone hot springs inherit extreme acid resistance not through genetics, but through epigenetics.
    “The surprise is that it’s in these relatively primitive organisms, which we know to be ancient,” said Paul Blum, Charles Bessey Professor of Biological Sciences at Nebraska. “We’ve been thinking about this as something (evolutionarily) new. But epigenetics is not a newcomer to the planet.”

The discovery “raises questions … about how both eukaryotes and archaea came to adopt epigenetics as a method of inheritance.” Now they have to confront whether an even earlier common ancestor had it, or whether it evolved twice. “It’s a really interesting concept from an evolutionary perspective,” said a doctoral student involved in the research. Critics of neo-Darwinism might describe those alternatives differently from just “interesting.” Ridiculous, perhaps, or falsifying.

Epigenetics in Plants

Briefly, a paper in PNAS finds that “Partial maintenance of organ-specific epigenetic marks during plant asexual reproduction leads to heritable phenotypic variation.” Why do clones, with identical genomes, differ? The answer is epigenetics. 
   We found that phenotypic novelty in clonal progeny was linked to epigenetic imprints that reflect the organ used for regeneration. Some of these organ-specific imprints can be maintained during the cloning process and subsequent rounds of meiosis. Our findings are fundamental for understanding the significance of epigenetic variability arising from asexual reproduction and have significant implications for future biotechnological applications.
    
Non-Genetic Order

Here’s a cellular phenomenon that really is interesting, because it reveals a newly discovered structural order in the cell membrane. This structural order surely is inherited somehow, but may have little to do with genes. Biochemists had thought for a century that the inner space in the membrane is fluid and disordered, but techniques to probe that space have been difficult because the detergents used disrupt the membrane. Now, researchers at Virginia Commonwealth University, in conjunction with Nobel laureate Joachim Frank, used a new method without detergents. They were surprised — no, startled — to find an orderly hexagonal 3-D structure between the molecules in the lipid bilayer. Is there a reason for this orderly structure?
     Where earlier models had shown a fluid, almost structureless lipid layer — one often-cited research paper compared it to different weights of olive oil poured together — the VCU-led team was startled to find a distinct hexagonal structure inside the membrane. This led the researchers to propose that the lipid layer might act as both sensor and energy transducer within a membrane-protein transporter.

“The most surprising outcome is the high order with which lipid molecules are arranged, and the idea they might even cooperate in the functional cycle of the export channel,” said Joachim Frank, Ph.D., of Columbia University, a 2017 Nobel laureate in chemistry and co-author of the paper. “It is counterintuitive since we have learned that lipids are fluid and disordered in the membrane.”
Their paper in PNAS says nothing about genetics, so perhaps this order arises through physical interactions between the lipids and the protein channels. Whatever causes this orderly arrangement, it appears to interact with transmembrane channels, adapting to the conformational changes of the proteins, particularly a transporter called AcrB. Without the hexagonal mesh around the channel, with just a disordered fluid, the channel’s action might be less efficient, like a boxer beating the air without a sparring partner. Not only that, the hexagonal mesh also transmits the channel’s activity down the membrane to its neighbors. Fascinating!
       Through defined protein contacts, the lipid bilayer senses the conformational changes that occur in each TM [transmembrane] domain and then transduces effects of these changes through the lipid bilayer to neighboring protomers in a viscous interplay between cavity lipids and the AcrB trimer.
                 
Another Blow to the Central Dogma

Mauro Modesti gives his perspective on a new finding in Science, “A pinch of RNA spices up DNA repair.” The Central Dogma of genetics, which views DNA as the master molecule controlling everything downstream with no feedback, has been suffering since it was first taught in the 1960s. In the same issue of Science, a paper reveals that RNA plays an essential role in DNA repair. What does this mean? Modesti explains,
                   Pryor et al. report the surprising discovery that ribonucleotides are frequently incorporated at broken DNA ends, which enhances repair. This important finding overturns the central dogma of molecular biology by demonstrating that transient incorporation of ribonucleotides in DNA has a biological function.

Genetic Determinism Lives On

The idea that humans are pawns of their genes has a long history, mostly negative. Genetic determinism undermines free will and character, giving people something physical to blame for their problems. Materialists continue the bad habit, though, as shown in this paper in Nature Scientific Reports, “A genetic perspective on the relationship between eudaimonic- and hedonic well-being.” The news from the University of Amsterdam puts it bluntly: “Discovery of first genetic variants associated with meaning in life.” But can something so psychological, or even spiritual, be reduced to genes? 

They checked DNA samples of 220,000 individuals, and had them answer a questionnaire. The genetic variants, they say, “are mainly expressed in the central nervous system, showing the involvement of different brain areas.” 

“These results show that genetic differences between people not only play a role in differences in happiness, but also in differences in meaning in life. By meaning in life, we mean the search for meaning or purpose in life.”

Did these researchers ever learn that correlation is not causation? Did they inspect their own genes? Did they answer a questionnaire, saying that they felt eudaimonia when proposing genetic determinism? Did their genes determine their own philosophy of mind? If so, then how can anyone believe them? What are universities teaching scientists these days?

Simplistic notions of neo-Darwinism seemed more plausible before new techniques uncovered the evidence of splendid design going on in cells. If the trend continues, 2019 will be a great year for intelligent design.

Sacred cows?

In Europe, Animal Rights Are Steamrolling Religious Freedom
Wesley J. Smith

As Western society secularizes, religious liberty is in danger of becoming passé. Increasingly, jurisdictions are enacting laws in furtherance of legitimate social considerations that, concomitantly, shrivel the freedom of religious believers to live according to their personal faith precepts.


Western Europe is leading the way. Belgium now requires all food animals be stunned before slaughter, which prevents their meat from being declared kosher or halal — hence edible — in accordance with the religious requirements of Judaism and Islam.

A Terrible Bind

Of course, such animal welfare laws are absolutely appropriate. But, until recently, comity was also preserved by allowing limited religious exemptions. Those accommodations are now systematically being removed. From the New York Times story:

Most countries and the European Union allow religious exceptions to the stunning requirement, though in some places — like the Netherlands, where a new law took effect last year, and Germany — the exceptions are very narrow. Belgium is joining Sweden, Norway, Iceland, Denmark and Slovenia among the nations that do not provide for any exceptions.

That puts observant Jews and Muslims in a terrible bind. They can have food shipped from elsewhere, which is more expensive. But what if other countries also ban such practices, or their home countries forbid the import of kosher or halal meat? Believers would be forced to choose between eating meat and violating their religious beliefs.

Some secularists would be just fine with that since these laws don’t infringe their own freedoms, while those who are anti-religious would delight in forcing such hard choices upon believers.

“The Law Is Above Religion”

Some non-religionists even presume to tell the faithful what their rules do and don’t require:
   Ann De Greef, director of Global Action in the Interest of Animals, a Belgian animal rights group, insisted that stunning does not conflict with kosher and halal doctrine, and “they could still consider it ritual slaughtering,” but the religious authorities refuse to accept that.

“They want to keep living in the Middle Ages and continue to slaughter without stunning — as the technique didn’t yet exist back then — without having to answer to the law,” she said. “Well, I’m sorry, in Belgium the law is above religion and that will stay like that.”

That kind of religious intolerance is only going to appear in brighter hues going forward. There is great pressure, for example, to ban infant circumcision, a sacred and absolute requirement of Jews, also practiced as a religious duty by many Muslims. Efforts are also afoot to force doctors to participate in abortion and/or euthanasia — even when doctors consider such acts to be a grievous sin materially impacting their own eternal destinies. I am sure readers can think of many other examples.

Freedom cannot be a one-way street. Steamrolling traditional believers’ faith values is a recipe for tearing society apart.

American healthcare on its deathbed? Pros and cons.

How fish school humans on design.

Fish Teach Humans about Design
Evolution News & Views April 1, 2016 3:48 AM

Why do fish bob their heads back and forth as they swim? Is that wasted movement? Is it an inescapable consequence of undulatory motions during swimming? That's what many scientists used to think. What a team found out reminds us never to assume nature's methods are wasteful.

A new paper in Nature Communications summarizes the find: "Fish optimize sensing and respiration during undulatory swimming." That word optimize has design written all over it, especially when the fish optimizes three things at once:

Previous work in fishes considers undulation as a means of propulsion without addressing how it may affect other functions such as sensing and respiration. Here we show that undulation can optimize propulsion, flow sensing and respiration concurrently without any apparent tradeoffs when head movements are coupled correctly with the movements of the body. This finding challenges a long-held assumption that head movements are simply an unintended consequence of undulation, existing only because of the recoil of an oscillating tail. We use a combination of theoretical, biological and physical experiments to reveal the hydrodynamic mechanisms underlying this concerted optimization. [Emphasis added.]
Using a "bio-inspired physical model" with flow sensors, the team from Harvard and the University of Florida found that head bobbing actually improves swimming efficiency. Then they studied how the fish's lateral line sense improves with the resulting water flow. One might think that the extra motion of the head would confuse the lateral line sense, but the opposite is true.

We discovered that the motions associated with undulation can automatically enhance lateral line sensing on the head by minimizing self-generated stimuli. Fish move their heads in a way that minimizes pressure up to 50%, establishing a twofold greater sensitivity to an external stimulus than would otherwise be possible (Fig. 3a). At swimming speeds up to 2 L s−1, we found a heightened sensitivity around the anterior region of the head, which is where the majority of the encounters related to feeding and locomotion are initiated. We propose that during swimming, fish may not have to rely as extensively on the efferent system to distinguish between external and self-generated stimuli if they rotate their head in an appropriate phase with respect to side-to-side motion.
Simultaneously, this head motion increases the flow across the gills, enhancing respiration. This is the first time the coupling of motion with respiration has been demonstrated in fish, as it has been in birds, horses, and humans. Scientists used to view undulation and respiration as independent processes. No longer:

Here, we discover that fishes swimming with body undulations also show respiratory-locomotor coupling. Our pressure model reveals that undulation-generated pressures around the mouth and opercula oscillate dramatically. We found that fishes exploit these pressures by timing their respiratory movements accordingly, which likely minimizes the energetic cost of pumping the dense medium of water. High-speed, high-resolution video reveals that respiratory movements are tightly synchronized with head movements (Fig. 3b). When the pressure difference between the outside and inside of the mouth reaches 0.2 mm Hg, fishes open their mouth to allow water to flow in passively. Perhaps not coincidentally, this exact pressure difference is generated by the active buccal expansion of stationary fish. In this way, we hypothesize that swimming fishes exploit self-generated pressures to circumvent the work of buccal pumping.
This is really neat. The undulatory motion of swimming with the fins moves the head back and forth in phase such that the work of breathing is reduced, and the sensitivity of the lateral line is optimized. It's a three-for-one gain with no tradeoff in cost.

Life requires the successful, simultaneous execution of basic physiological functions. The coordination of these functions usually relies on distinct neural networks that run in parallel. Over the past several decades, a number of studies have demonstrated that the passive mechanical properties of the body can simplify individual functions, releasing them from the need for precise neural control. Here, we show that during aquatic axial undulation, head movements can allow seemingly disparate but fundamental functions to be coordinated simultaneously without tradeoffs.
Isn't evolution smart to pull this off? Actually, the authors didn't have much to say about evolution. Their only mention of evolution seems to falsify its expectations:

Given that the respiratory system is located in the head and the locomotory system is associated with the trunk, it is not unreasonable to assume that respiration and swimming would be decoupled. The contemporary viewpoint is that the origin of the lung enabled respiratory-locomotor coupling to evolve in terrestrial animals.
Here, we discover that fishes swimming with body undulations also show respiratory-locomotor coupling....

But why would evolution optimize two or three things at once? Selection for traits can only act on immediate benefit from a random mutation. Like they say, "it is not unreasonable to assume" that selection for benefit in one trait would be independent of selection for other traits. The "contemporary view" may be that the lung "enabled" coupling, but if that were a useful idea, they would have said more about it. They didn't. We know from experience, however, that when engineers succeed in optimizing multiple things at once without tradeoffs, they win prizes and promotions for intelligent work.

More evidence that this work supports intelligent design is seen in their desire to imitate it. "The power of this simple control architecture is that it can be universally applied to any size and species of undulating fish, as well as to autonomous, underwater vehicles," they note. Yet the salmon seen bobbing their heads in Living Waters beat engineers to it. Engineers can just imitate what they see and win a design prize.

Non-Clogging Filters

Another case of intelligent design was announced in a second paper in Nature Communications. Biologists from the College of William and Mary liked this design so much, they immediately thought of how to apply it. Notice that the design is found in birds and mammals as well as fish.

Suspension-feeding fishes such as goldfish and whale sharks retain prey without clogging their oral filters, whereas clogging is a major expense in industrial crossflow filtration of beer, dairy foods and biotechnology products. Fishes' abilities to retain particles that are smaller than the pore size of the gill-raker filter, including extraction of particles despite large holes in the filter, also remain unexplained. Here we show that unexplored combinations of engineering structures (backward-facing steps forming d-type ribs on the porous surface of a cone) cause fluid dynamic phenomena distinct from current biological and industrial filter operations. This vortical cross-step filtration model prevents clogging and explains the transport of tiny concentrated particles to the oesophagus using a hydrodynamic tongue. Mass transfer caused by vortices along d-type ribs in crossflow is applicable to filter-feeding duck beak lamellae and whale baleen plates, as well as the fluid mechanics of ventilation at fish gill filaments.
A hydrodynamic tongue -- what a concept! Fish "engineer" previously unknown flow patterns to transport the particles they need into their esophagus. Those humpback whales seen in Living Waters use this technique as they gulp krill with their huge mouths, and the small tropical fish do it with their gills. The ducks in Flight: The Genius of Birds do it with their beaks. Who taught a fish, a duck, and a whale about fluid dynamics? It must have been natural selection. Tell us, please, how that came about:

In addition to the ecological and evolutionary relevance, these problems are of substantial interest to industrial filtration engineers who seek to reduce the major operating expenses associated with clogging.
One reads with bated breath for an evolutionary explanation that never comes.

As more than 30,000 fish species possess branchial arches that may form d-type ribs, potential vortex formation in the slots between branchial arches has substantial implications for the fluid dynamics of fish feeding and ventilation throughout ontogeny and evolution. Vortical cross-step filtration could be applicable to feeding in a diversity of fish species. In addition, many filtration structures involved in vertebrate suspension feeding are composed of d-type ribs in crossflow, including fish gill rakers, tadpole gill filters, bird beak lamellae and whale baleen plates, suggesting that principles of vortical cross-step filtration could have widespread application.
And that's it. That's all they have to say about evolution. This beautifully designed trait, so envied by engineers, is found in all these unrelated animals. How? Because "principles of vortical cross-step filtration" work, and are found all over the animal kingdom, they must have evolved. Does that make any sense?

It should be clear to anyone that intelligent design did the heavy lifting in both papers. Evolution played no role in the experimental setup, the explanation, or the application in either case. As usual, evolution only tags along in the role of post-hoc narrative gloss.

The 'talking ape' vs. Darwin

Language as an Evolutionary Conundrum
David Klinghoffer February 26, 2016 6:10 AM

In Chapter 10 of his new book Evolution: Still a Theory in Crisis, Michael Denton argues for the proposition that language and the higher intellectual faculties -- the gifts that uniquely make us human -- arose by saltation. In other words, they are gifts -- sudden ones. Denton's view, as he makes clear, has precedents reaching from Alfred Russel Wallace to linguist Noam Chomsky.

In a nice coincidence, Chomsky and MIT colleague Robert C. Berwick are just out with a book of their own, from MIT Press, provocatively titled Why Only Us: Language and Evolution. To be sure, Chomsky and Berwick are not advocates of Denton's structuralist take on the theory of intelligent design. Still, their own argument for language by saltation is not hard to reconcile with Denton's view.

The recognition that language poses a problem for Darwinian gradualism is presumably what makes linguist Vyvyan Evans uneasy about the book, which Dr. Evans reviews in New Scientist:

Their argument goes like this. As our capability for grammar is genetically programmed, and as no other species has language, it stands to reason that language emerged fairly suddenly, in one fell swoop, because of a random mutation. This is what the authors refer to as the "gambler's-eye view" in contrast to a "gene's-eye view" of evolution. The sudden appearance of language occurred perhaps no more than 80,000 years ago, just before modern humans engaged in an out-of-Africa dispersion.

A sudden "random mutation"...

But to be convinced by this, the reader has to swallow a number of sub-arguments that are debatable at best. For one thing, the authors presume the Chomskyan model of human language -- that the rudiments of human grammar (or syntax) are unlearnable without an innate knowledge of grammar. This position seems less reasonable today than it once did.

Remember, as surly geneticist Dan Graur formulates the 12th and final of his principles of neo-Darwinism (Evolution News pointed this out yesterday), "Homo sapiens does not occupy a privileged position in the grand evolutionary scheme." A sudden gift, mutation, call it what you will, endowing our ancient ancestors alone with language is thus, on principle, to be disallowed. Language must be shared with other, non-human creatures. And so it is, Evans assures readers.

[R]esearch in primatology and animal behaviour suggests that some of the precursors for language do exist in other species, ranging from European starlings to chimpanzees -- with the latter using a sophisticated gestural form of communication in the wild. In fact, gesture may well have been the medium that incubated language until ancestral humans evolved the full-blown capacity for it.

Yet no one would confuse the most eloquent chimp "gestures" with modern sign language. That leaves in place the question of where language, whether communicating through hand or mouth, came from.

The "scientific consensus" cannot accept saltations of such a staggering kind:

Ultimately, Why Only Us is something of a curiosity. It takes a reverse engineering perspective on the question of how language evolved. It asks, what would language evolution amount to if the Chomskyan proposition of universal grammar were correct? The answer is language as a mutation that produces a phenotype well outside the range of variation previously existing in the population -- a macromutation. This flies in the face of the scientific consensus. Indeed, the book attempts to make a virtue of disagreeing with almost everyone on how language evolved.

Evans makes an interesting point. If the sudden mutation occurred in one person, it would provide no benefit since there would be no one to talk to. Did the "random mutation," the gift, then occur in a pair of individuals, living in the same time and place? Don't even think of going there. All parties to the argument are agreed on that. Evans:

The reader is asked to swallow the following unlikely implication of their logic: language didn't evolve for communication, but rather for internal thought. If language did evolve as a chance mutation, without precedent, then it first emerged in one individual. And what is the value of language as a communicative tool when there is no one else to talk to? Hence, the evolutionary advantage of language, once it emerged, must have been for something else: assisting thought.

For the spectator, it's not without pleasure to see evolutionists going at each other this way. Evans accuses Chomsky and Berwick of "reverse engineering" -- but more orthodox Darwinian "perspectives" do the very same thing. They assume the negation of the human exceptionalist view and impose that principle, as Evolution News suggested, on whatever is observed.

Every take on the origin of language that leaves the creative work entirely to one or more "random mutations" is doomed. We will be excerpting Denton's Chapter 10 in good time. Stay tuned.

Editor's note: Get your copy of Evolution: Still a Theory in Crisis now. For a limited time, you'll enjoy a 30 percent discount at CreateSpace by using the discount code QBDHMYJH.