
Saturday 10 November 2018

Someday we'll look back and laugh.

Our Hairlessness: Another Evolutionary Enigma Suggestive of Intelligent Design




Nina Jablonski, Penn State anthropologist and author of Skin: A Natural History, gives an interesting interview to CNN on how our presumed pre-human ancestors lost their fur. It's a bit of a puzzle because (per the conventional set of evolutionary assumptions and deductions) our ancestors were furry like chimps, and Jablonski reasons that they lost their fur so as to allow for improved sweating required by the innovation of our becoming excellent long-distance runners.
CNN: When did we first lose our fur and gain this pigmentation?

Jablonski: The human lineage evolved in Africa. If we start at a starting point of 6 to 7 million years ago, when humans first parted ways from the ancestors of chimpanzees, we have a lot of fossils that indicate that humans were walking on two legs, but they were not modern-looking. They were fairly short, and they still had quite ape-like body proportions: fairly long arms, relatively short legs. These were Australopithecus species of various kinds. They were good bipeds, but they were also capable tree-climbers. But when we look at their skeletons in detail, it's pretty clear that they were not active runners. They could walk on two legs but they weren't running or striding purposefully across the savanna most of the time, they were sort of living lives that are much like those of chimpanzees: fairly close to the edge of the forest, sometimes going into trees for protection, and then walking for short distances in the open to forage.

We hypothesize that, at that stage in our lineage's evolution, we still would have had quite a bit of body hair, because the reason we started to lose body hair is related to the need for controlling body heat.

It turns out that primates lose most of their heat through radiation from the surface of the body into the environment, and by evaporation of sweat. The hotter it is outside, the more important sweat becomes, especially if the animal is exercising vigorously and generating a lot of internal body heat. Internal body heat is good to a point, but you have to be able to liberate excess heat, otherwise your brain, organs and muscles get too hot.

Primates as a lineage almost exclusively use sweating for this purpose (versus other mechanisms such as panting). There have been a lot of hypotheses made about why we lost most of our body hair. And I definitely, and many colleagues of mine definitely are of the opinion -- based on the environmental, anatomical and genetic evidence at hand -- that we lost most of our body hair because of the needs of heat regulation.

But as Jablonski also points out, chimps beneath their fur have light-colored skin. Take away the fur and you have a light-colored animal that, in the hot African sun, would be extremely vulnerable to its damaging rays. So you need dark skin. But what would the evolutionary advantage of that be before the transition to going furless? None that's apparent. So which came first?
The running? But that requires the furless feature (not to mention a massive investment in tightly coordinated anatomical reengineering under the skin, as Ann Gauger discusses in Science and Human Origins). The furless feature, then? But that requires the dark skin. OK, so the dark skin came first? But that seems to somehow look forward to future usefulness before any evolutionary advantage comes into play, which in turn sounds dangerously teleological.
Darwinian evolution can't just put things like that in the bank, with a view toward their being helpful in some future stage of the evolving lineage. Such anticipation, on the other hand, is a hallmark of intelligent design as we're all familiar with it from daily life. Otherwise, blind Darwinian churning seems to have got very lucky in pulling off these three innovations simultaneously, just at the right time. That sounds more like an illustration of design innovation, doesn't it?

Another day, another missing link, or business as usual in Darwinville

The Evolution of "Ida": Darwinius masillae Fossil Downgraded From Ancestor to Pet
Casey Luskin

A few months ago, "Ida" was sitting on top of the world. She'd been lauded as the "eighth wonder of the world" whose "impact on the world of palaeontology" would be like "an asteroid falling down to Earth." Falling, indeed. On October 21, Nature published an article announcing that "[a] 37-million-year-old fossil primate from Egypt, described today in Nature, moves a controversial German fossil known as Ida out of the human lineage." Wired also published a story, noting that, "[f]ar from spawning the ancestors of humans, the 47 million-year-old Darwinius seems merely to have gone extinct, leaving no descendants," further quoting a paleontologist calling Ida "a third cousin twice removed ... only very distantly related to living and fossil anthropoids."

But Ida was given quite a ride by the mainstream media, while it lasted. Originally:

Famed BBC broadcaster Sir David Attenborough got involved, making a documentary titled Uncovering Our Earliest Ancestor: The Link, to explain why Ida is "the link that connects us directly with the rest of the animal kingdom." Co-sponsored by both the BBC and the History Channel, the program attracted a massive audience. ...
Good Morning America and Nightline covered the fossil.
National Geographic called her the "critical 'missing link' species."
ScienceDaily and a Discover magazine commentator praised Ida as our "47-million-year-old human ancestor."
Skynews told the public that "proof of this transitional species finally confirms Charles Darwin's theory of evolution."
With Google's eager assistance, Ida went viral: One of the leading search terms that day was "missing link found." Even the Drudge Report was reeled in by the media frenzy, briefly featuring Ida as the headline story.
(Casey Luskin, "The Big Ida: The Rise & Fall of Another Missing Link & Other Media Hype," Salvo 10 (Autumn, 2009).)

It only took a few months for Ida to go from celebrity-status "missing link" to just another extinct lower primate. As Nature is now reporting:
Teeth and ankle bones of the new Egyptian specimen show that the 47-million-year-old Ida, formally called Darwinius masillae, is not in the lineage of early apes and monkeys (haplorhines), but instead belongs to ancestors (adapiforms) of today's lemurs and lorises.
"Ida is as far away from the human lineage as you can get and still be considered a primate," says Christopher Beard, a palaeoanthropologist at the Carnegie Museum of Natural History in Pittsburgh, Pennsylvania, who was not involved in either research team.

(Rex Dalton, "Fossil primate challenges Ida's place," Nature, Vol. 461:1040 (October 21, 2009).)

The good news is that cooler heads now seem to be prevailing regarding Ida. Wired notes that the current reporting about disagreements over Ida is an improvement, "the sort of dialogue that was missing from Darwinius' overhyped debut."
Where else have we seen an "overhyped debut" of a fossil, without "dialogue"? Exhibit A: "Ardi" (Ardipithecus ramidus).

In fact, with its article titled "Humanity Has New 4.4 Million-Year-Old Baby Mama," Wired was one of the numerous major media outlets assisting in the overhyped debut of Ardi. But most of those abettors didn't say anything about the ambiguity and dissent over Ardi's reconstructed skeleton. It seems that other missing links also debut with a lot of hype and without much dialogue.

Calm, collected, and careful scientific analysis is going on somewhere in the background here, but little scientific dissent from the media's storyline is being disclosed to the public. Instead, we see that the media, working with certain evangelistic tribes within the academy, are unashamedly using these fossils as opportunities to push Darwin.

How long "Ardi" will retain favored link status is anyone's guess.

Sunday 28 October 2018

On the origins debate and Jehovah's transcendent technology.

Pearcey: “Stephen Hawking’s Final Salvo”

Before his death back in March, I found that every new prophetic statement from Stephen Hawking, invariably heralded by the media, was bound to make me sad. It seemed clear he was being exploited for his famous name to promote lugubrious causes having nothing to do with the genuine source of his scientific renown. And this still goes on, with Dr. Hawking having already passed on to the next world.

A new book is out under his name, Brief Answers to the Big Questions, “published posthumously with material pulled from interviews, essays, speeches, and questions often asked of the famous physicist,” as Center for Science & Culture fellow Nancy Pearcey writes at CNS News. She responds to the book’s most touted claim. The first chapter asks, “Is there a God?” Answer: No.

 “I think the universe was spontaneously created out of nothing, according to the laws of science.” After all, [Hawking] argues, “If you accept, as I do, that the laws of nature are fixed, then it doesn’t take long to ask: What role is there for God?”

An Open Cosmos

Pearcey replies:

Is Hawking right that scientific laws rule out any role for God? Despite being a brilliant physicist, he seemed unaware that his objection has already been answered — most famously by the popular literature professor C.S. Lewis, himself a former atheist, who taught at both Oxford and Cambridge University.

In his book Miracles, Lewis concedes that, at first glance, the regularity of nature does seem to rule out the possibility that God is able to act into the world.

But not so fast. Natural laws tell us only what happens if nothing interferes. People playing a game of pool are applying the laws of physics, which decree that when a billiard ball hits another one, the second ball will start moving. But the laws do not tell what will happen if a mischievous child grabs the ball.

The laws are still true, of course, but the child has interfered with the physics.

Humans interfere with natural processes all the time, yet we do not break any laws of nature. We cut down trees to make houses, we weave plant fiber into cloth, we smelt metals to build bridges, we turn sand into silicon chips for computers. Through technology, we are constantly accomplishing things that nature on its own could not produce.

But do we break a single law of nature? No.

“All interferences leave the law perfectly true,” Lewis points out. And it’s the same when God acts in the world: He does not need to break any scientific laws. The cosmos is open to the actions of creative humans and a creator God.

A better way to understand miracles, Lewis writes, is that they feed new events into the existing structure of nature: “The divine art of miracle is not an art of suspending the pattern to which events conform but of feeding new events into that pattern.”

Pearcey offers the helpful image of the universe not as an inviolable clockwork but as a musical instrument, implying the need for an artist’s hand in designing and creating the instrument and in playing it.

Dembski Expands on a Helpful Image

William Dembski has written about this in some detail. A musical instrument without someone to play it is incomplete:

Granted, if the universe is like a clockwork (cf. the design arguments of the British natural theologians), then it would be inappropriate for God, who presumably is a consummate designer, to intervene periodically to adjust the clock. Instead of periodically giving the universe the gift of “clock-winding and clock-setting,” God should simply have created a universe that never needed winding or setting. But what if instead the universe is like a musical instrument (cf. the design arguments of the Church Fathers, like Gregory of Nazianzus, who compared the universe to a lute — in this respect I much prefer the design arguments of the early Church to the design arguments of the British natural theologians)? Then it is entirely appropriate for God to interact with the universe by introducing design (or in this analogy, by skillfully playing a musical instrument). Change the metaphor from a clockwork to a musical instrument, and the charge of “withholding gifts” dissolves. So long as there are consummate pianists and composers, player-pianos will always remain inferior to real pianos. The incompleteness of the real piano taken by itself is therefore irrelevant here. Musical instruments require a musician to complete them. Thus, if the universe is more like a musical instrument than a clock, it is appropriate for a designer to interact with it in ways that affect its physical state.

The “clockwork” notion is one that ID critics adore because it makes design appear naïve and clumsy. If you look up intelligent design on Wikipedia, you’ll see that the editors have used the insides of a pocket watch to brand all their series of (highly misleading) articles on ID. If it were actually possible to edit Wikipedia, I’d substitute a different image; among musical instruments, a lute seems as good a choice as any.


How our Star sheds light on the origins debate

Denton Turns Sagan’s “Humdrum Star” on Its Head

On a new ID the Future episode, biologist Michael Denton talks with host Sarah Chaffee about the remarkable fitness of a range of properties seen in water and in light. Download the podcast or listen to it here. It’s particularly satisfying to hear Denton turn a frequently heard dismissal of our Sun on its head.

How Silly the Old People Were

Carl Sagan in Cosmos gave the idea perhaps its most iconic expression. In Episode 7, he lectures to a classroom of Brooklyn schoolchildren about how it was once thought that Earth occupied an “important” place in the cosmos. Tut-tut, how silly the old people were, before we realized that our solar system is not at the center of things but way out on the edge of the galaxy, thereby guaranteeing (this leap doesn’t quite follow) our cosmic insignificance. He goes on to ponder:

For as long as there have been humans we have searched for our place in the cosmos. Where are we? Who are we? We find that we live on an insignificant planet of a humdrum star lost in a galaxy tucked away in some forgotten corner of a universe in which there are far more galaxies than people.

Yet the fact that the Sun is “humdrum” or “ordinary” may be the most extraordinary thing about it. Dr. Denton discusses the fortunate circumstances characterizing visible light, needed for vision and photosynthesis. He concludes:

The big deal is that being an ordinary star means that the vast majority of stars in the cosmos put out their energy in the visual and infrared bands of the electromagnetic spectrum. So being “ordinary,” what this means is that the universe — it’s somewhat ironic and it seems a bit counterintuitive — it means the universe is profoundly fit for biological systems like ourselves. The Sun is an ordinary star, and that’s a very big deal. The universe, as I describe it in Children of Light, is flooded with the light of life.
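Denton’s claim is easy to sanity-check. Here is a back-of-envelope sketch using Wien’s displacement law (peak emission wavelength = b/T), with rough textbook stellar temperatures chosen purely for illustration:

```python
# Back-of-envelope check of where "ordinary" stars emit most strongly,
# using Wien's displacement law. Temperatures are rough textbook values.
WIEN_B = 2.897771955e-3  # Wien's displacement constant, metre-kelvins

stars = {
    "Sun (G-type), ~5772 K": 5772.0,
    "red dwarf (M-type), ~3000 K": 3000.0,
    "hot blue star (O-type), ~30000 K": 30000.0,
}

for name, temp_k in stars.items():
    peak_nm = WIEN_B / temp_k * 1e9  # convert metres to nanometres
    print(f"{name}: peak emission ~{peak_nm:.0f} nm")
```

The Sun’s output peaks near 502 nm, in the middle of the visible band, and red dwarfs, the most common stars of all, peak around 966 nm in the near infrared, consistent with Denton’s point about where ordinary stars put their energy.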

A Double Dilemma

“Flooded with the light of life”: What a beautiful way of putting it. You could add that it creates a double dilemma for materialists. The universe is “flooded with the light of life,” in Denton’s apt phrase. Yet so far as we know, life on Earth is a singularity. This is certainly counter to Carl Sagan’s expectations. Yet it’s possible that Sagan, Stephen Hawking, and others may prove right about the cosmos being home to many forms of extraterrestrial life, including intelligent forms. Only time will tell. If Sagan was correct, then Denton’s “ironic” observation about light’s insanely special fitness for life, life like ours, really comes to the fore. Light was finely tuned for life, whether in a multitude of homes in the cosmos or in just one. Either way, we’re prompted to ask the same question: Finely tuned by whom? And why?

Messrs. Denton and Berlinski v. Darwinism II

Listen: Why Does Darwinism Hang On?
Evolution News @DiscoveryCSC

On an episode of ID the Future, philosopher and author David Berlinski joins geneticist and researcher Michael Denton for continued discussion on the debate over Darwinian evolution. Evolution News editor David Klinghoffer asks: Why has the theory persisted? What weaknesses threaten its existence in the 21st century?


Listen in as Berlinski and Denton explain why the Darwinian mechanism is being widely questioned as a viable theory of the origin and development of life. As Berlinski puts it: “Applying Darwinian principles to problems of this level of complexity is like putting a Band-Aid on a wound caused by an atomic weapon. It’s just not going to work.”

Sunday 21 October 2018

And still even yet more on the real world's anti-Darwinian bias.

The Blink of an Eye — And More Wonders of Human Body Design
Evolution News @DiscoveryCSC

Darwinists insist on your body’s “poor design.” By contrast, a prediction of intelligent design for biology is that phenomena should appear more functionally complex and elegantly coordinated the closer one looks. Evolutionist cavils aside, this is certainly true for the human body.

Vision

We blink all day long, about once every five seconds. Why doesn’t the world go dark in those moments? How do we perceive a continuous image? “The brain seems to remember the percepts that have just happened,” an article in Science Daily says. “Scientists have now identified a brain area that plays a crucial role in perceptual memory.” At the German Primate Center, they found the spot where visual information is stored briefly during each eye blink. “Our research shows that the medial prefrontal cortex calibrates current visual information with previously obtained information and thus enables us to perceive the world with more stability, even when we briefly close our eyes to blink,” says the lead author of a paper in Current Biology. This implies a process of calibration, interpolation, and calculation happening literally in the blink of an eye.

“Ghost images” can be detected by the human vision system, an article in Live Science says. These are not direct images that we are familiar with, but rather computed images arrived at by multiplying the light in each pixel from a projected scene onto a background scene. The product produces the ghost image, but it can only be perceived under the right circumstances.

Experiments with participants viewing a superimposed checkerboard on a background photo showed that the ability to see the ghost image is a function of the eye’s refresh rate. The photo only became visible when a single-pixel detector collected the light from each pixel and then fed it into a projector at the right speed. Reporter Stephanie Pappas says this is akin to the optical illusion of seeing a movie when the frame rate matches the eye’s refresh rate. 

The reason this works, [Daniele] Faccio said, is that the human eye has a slow refresh rate. It’s not unlike the reason that movies work: When images are flickered on the screen faster than this refresh rate, it creates the illusion of smooth movement.

The eye “is very fast in acquiring the information,” Faccio said. “It’s just very slow in getting rid of it.”

The researchers figured out that the flickering patterns remained in the eye’s “memory” for about 20 milliseconds, slowly fading out over that time. If the 20-millisecond patterns overlap, the eye sums them up like a movie, allowing the ghost image to emerge.

Although ghost images are unlikely to appear in natural settings, the experiments provide a new way for neuroscientists to understand vision. For design advocates, they open windows into the moment-by-moment calculations that our eyes and brain have to perform to give us a smooth picture.
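The summation mechanism Faccio describes can be imitated in a few lines. Below is a deliberately crude toy model, not the researchers’ code: it assumes a hidden pattern, random flicker frames multiplied against it, and an eye whose response fades with a roughly 20-millisecond time constant. Each frame individually looks like noise, but the persistence-weighted sum correlates strongly with the hidden pattern:

```python
import numpy as np

# Toy model of ~20 ms visual persistence summing flickered patterns.
# Everything here (pattern, frame rate, decay model) is a hypothetical
# illustration of the mechanism described above, not the actual experiment.
rng = np.random.default_rng(0)

ghost = rng.random((8, 8))             # hidden pattern (hypothetical)
frame_ms, tau_ms, n_frames = 2.0, 20.0, 500
decay = np.exp(-frame_ms / tau_ms)     # exponential fade per frame

retina = np.zeros_like(ghost)
for _ in range(n_frames):
    flicker = rng.random(ghost.shape)  # random projected pattern
    retina = retina * decay + flicker * ghost  # persistence sums products

# The random flicker averages out, so the summed response tracks the
# hidden pattern; the printed correlation should be close to 1.
print(np.corrcoef(retina.ravel(), ghost.ravel())[0, 1])
```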

Olfaction

What we perceive as odors are collections of molecules. Why similar molecules produce very different perceptions of smell has long been a puzzle to physiologists, making odor classification difficult. Is there a way to classify odors, the way audible tones can be classified by frequency? The Salk Institute looked into this “unsolved problem,” hoping to find a pattern that might allow scientists to predict how a molecule (or combination of molecules) would smell. They found that mapping molecules in 2D was too simplistic. Patterns emerged only when they mapped the molecules onto a hyperboloid, a shape similar to a Pringles potato chip.

When the team looked at how the molecules clustered on this surface, they found there were pleasant and unpleasant directions, as well as directions that correlated with acidity or how easily odors evaporate from surfaces. These observations now make it easier to construct pleasant odor mixtures to use, for example, in artificial environments (such as a space station).
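For readers wondering what “mapping onto a hyperboloid” involves, here is a minimal sketch of the hyperboloid (Lorentz) model of hyperbolic geometry, the kind of curved surface in question. The sample points are invented; only the lift and distance formulas are standard:

```python
import numpy as np

# Minimal sketch of the hyperboloid (Lorentz) model of hyperbolic space.
# The sample points are hypothetical; this only shows the geometry used
# when data are embedded on a hyperboloid rather than a flat 2D plane.

def lorentz_inner(u, v):
    # Minkowski inner product: -u0*v0 + u1*v1 + ... + un*vn
    return -u[0] * v[0] + np.dot(u[1:], v[1:])

def to_hyperboloid(x):
    # Lift a Euclidean point x onto the upper sheet {p : <p, p> = -1}.
    x = np.asarray(x, dtype=float)
    return np.concatenate(([np.sqrt(1.0 + x @ x)], x))

def hyperbolic_distance(u, v):
    # Geodesic distance on the hyperboloid: d(u, v) = arccosh(-<u, v>).
    return np.arccosh(np.clip(-lorentz_inner(u, v), 1.0, None))

a = to_hyperboloid([0.1, 0.2])  # hypothetical "pleasant" odor direction
b = to_hyperboloid([2.0, 1.5])  # hypothetical "unpleasant" direction
print(hyperbolic_distance(a, b))
```

Hyperbolic space has far more “room” than a flat plane, which is why structure that looks jumbled in a 2D map can spread out into clean directions on the curved surface.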

The paper in Science Advances explains why this pattern makes sense in nature:

In the natural environment, the sense of smell, or olfaction, serves to detect toxins and judge nutritional content by taking advantage of the associations between compounds as they are created in biochemical reactions. This suggests that the nervous system can classify odors based on statistics of their co-occurrence within natural mixtures rather than from the chemical structures of the ligands themselves.

Meanwhile, at the Monell Center in Philadelphia, scientists learned something new about “mysterious sensory cells” in the nose adjacent to the olfactory sensory neurons, called microvillous cells. “The findings suggest that the so-called microvillous cells (MVCs) may protect the vulnerable olfactory epithelium by detecting and initiating defenses against viruses, bacteria, and other potentially harmful invaders.” MVCs may also help regenerate damaged cells in the olfactory epithelium. This suggests “multiple levels of protection in the airways,” says a co-author of the paper in PLOS ONE, which, fortunately, appears more interested in the function of these cells than in their evolution.

Mother’s Milk

Two news items show the benefits of human milk for newborns. Scientists at the University of Edinburgh find that “Breast milk may help babies’ brain development.” Pre-term babies showed better brain development when fed breast milk instead of formula. Brain scans were performed at 40 weeks on 47 pre-term babies who had been delivered at 33 weeks.

The team also collected information about how the infants had been fed while in intensive care — either formula milk or breast milk from either the mother or a donor.

Babies who exclusively received breast milk for at least three-quarters of the days they spent in hospital showed improved brain connectivity compared with others.

The effects were greatest in babies who were fed breast milk for a greater proportion of their time spent in intensive care.

HealthDay reports on a paper in the journal Pediatrics where scientists compared the “healthy weight trajectory” of babies who were fed directly from the breast, by pumping, or by formula. Healthy weight gain was best for those babies with direct breast feeding, scientists found. “Researchers stressed that breast milk, in any form, is better than formula,” the report said. “But they said the findings support the notion that the method of feeding matters, too.” Often this is difficult for mothers who have to work. A healthy society should promote this natural function of mother and baby.

Liver

The liver is appropriately named; it wants to live. It has an uncanny ability to regenerate itself: if damaged, it can regrow 70 percent of its mass in a few weeks and function like new. Researchers at the University of Illinois wondered how it does that at the molecular level. The secret involves signaling and alternative splicing that puts the liver back into a neonatal state:

“We found that the liver cells after birth use a specific RNA-binding protein called ESRP2 to generate the right assortment of alternatively spliced RNAs that can produce the protein products necessary for meeting the functional demands of the adult liver,” said graduate student Sushant Bangru, the lead author of the study. “When damaged, the liver cells lower the quantity of ESRP2 protein. This reactivates fetal RNA splicing in what is called the ‘Hippo signaling pathway,’ giving it instructions about how to restore and repopulate the liver with new and healthy cells.”

Conclusions

Many non-living objects, such as crystals and metals, look the same practically all the way down. Living things, by contrast, become more wondrous as you zoom in from organism to organ to tissue to cell to nucleus. New imaging techniques are bringing wonders into focus that were unimaginable for most of human history. If ever there was a heyday for intelligent design, this is it.

Messrs. Denton and Berlinski v. Darwinism.

Denton, Berlinski: Primary Objections to Neo-Darwinism
Evolution News @DiscoveryCSC

If you had an opportunity to confute and confound your Darwinist friends, but a limited amount of time in which to do it, what challenge would you put to them? Discovery Institute Senior Fellows David Berlinski and Michael Denton are both long-time critics of neo-Darwinism. On a classic episode of ID the Future, responding to this query from Evolution News editor David Klinghoffer, they discuss their primary objections to neo-Darwinian theory.

Download the podcast or listen to it here.

For Berlinski, a mathematician and author of The Deniable Darwin, the problem is quantitative and methodological. For Denton, a geneticist and author of the new Discovery Institute Press book Children of Light: The Astonishing Properties of Light that Make Us Possible, the problem is empirical. Don’t miss this engaging discussion.

Darwinian spin doctors' rose-colored spectacles are of no help with the Cambrian explosion.

"Molecular Clock" Can't Save Darwin from the Cambrian Explosion
Evolution News & Views October 28, 2015 3:59 AM


Current Biology has published yet another attempt to explain away the Cambrian explosion. On reading certain parts, you might think the authors, including Maximilian Telford, Philip Donoghue, and Ziheng Yang, have solved the problem. Indeed, their first Highlight in the paper summary claims, "Molecular clock analysis indicates an ancient origin of animals in the Cryogenian." (Cryogenian refers to the Precambrian "cold birth" era about 720 to 635 million years ago.) By itself that statement would be very misleading, because the title of the open-access paper is pessimistic: "Uncertainty in the Timing of Origin of Animals and the Limits of Precision in Molecular Timescales."

Yang appeared briefly in Stephen Meyer's book Darwin's Doubt with bad news. Meyer cited a paper Yang co-authored with Aris-Brosou in 2011 showing that molecular clock analyses are unreliable. They "found that depending on which genes and which estimation methods were employed, the last common ancestor of protostomes or deuterostomes (two broadly different types of Cambrian animals) might have lived anywhere between 452 million years and 2 billion years ago" (Meyer, p. 106).
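The arithmetic behind such wild swings is simple. Under a strict molecular clock, divergence time is genetic distance divided by twice the substitution rate, so any uncertainty in the assumed rate maps directly onto the inferred date. A toy sketch with invented numbers (not values from Yang's paper) shows how easily estimates can span from a few hundred million to a couple of billion years:

```python
# Toy strict molecular clock: t = d / (2r), where d is the genetic distance
# between two lineages and r is the substitution rate per site per year.
# The distance and rates below are invented for illustration only.

def divergence_time(distance, rate_per_site_per_year):
    # Both lineages accumulate substitutions, hence the factor of 2.
    return distance / (2.0 * rate_per_site_per_year)

d = 0.9  # hypothetical substitutions per site separating two phyla

for rate in (1.0e-9, 5.0e-10, 2.25e-10):  # plausible-looking rates
    t = divergence_time(d, rate)
    print(f"rate {rate:.2e}/site/yr -> divergence ~{t / 1e6:,.0f} million years ago")
```

A factor of four or five in the assumed rate, well within the disagreement among genes and estimation methods, is enough to move the date from the Ordovician deep into the Precambrian.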

Nothing has changed since that 2011 assessment. The bottom line after a lot of wrangling with numbers, strategies, and analyses is that all current methods of dating the ancestors of the Cambrian animals from molecular clocks are imprecise and uncertain. They cannot be trusted to defuse the explosion by rooting the animal ancestors earlier in the Precambrian.

Although a Cryogenian origin of crown Metazoa agrees with current geological interpretations, the divergence dates of the bilaterians remain controversial. Thus, attempts to build evolutionary narratives of early animal evolution based on molecular clock timescales appear to be premature. [Emphasis added.]
Check out the euphemisms. Translated into plain English, it means, "We can't tell our favorite evolutionary story because the clock is broken, but we're working on it."

In this new paper, they provide the latest and greatest analysis of molecular clock data so far. It's clear they believe that all the data place the root of the divergence in the Ediacaran or earlier, 100 million years or more before the Cambrian, but can they really defend their belief? They have to admit severe empirical limits:

Here we use an unprecedented amount of molecular data, combined with four fossil calibration strategies (reflecting disparate and controversial interpretations of the metazoan fossil record) to obtain Bayesian estimates of metazoan divergence times. Our results indicate that the uncertain nature of ancient fossils and violations of the molecular clock impose a limit on the precision that can be achieved in estimates of ancient molecular timescales.
Perhaps, a defender might interrupt, the precision, admittedly limited, is good enough. But then, there are those pesky fossils! The molecular clocks are fuzzily in agreement about ancestors in the Precambrian, but none of them have support from the very best observational evidence: the record of the rocks. Even the phyla claimed to exist before the explosion are contested:

Unequivocal fossil evidence of animals is limited to the Phanerozoic [i.e., the modern eon from Cambrian to recent, where animals are plentiful]. Older records of animals are controversial: organic biomarkers indicative of demosponges are apparently derived ultimately from now symbiotic bacteria; putative animal embryo fossils are alternately interpreted as protists; and contested reports of sponges, molluscs, and innumerable cnidarians, as well as putative traces of eumetazoan or bilaterian grade animals, all from the Ediacaran. Certainly, there are no unequivocal records of crown-group bilaterians prior to the Cambrian, and robust evidence for bilaterian phyla does not occur until some 20 million years into the Cambrian.
This severely limits their ability to "calibrate" the molecular clock. Meyer granted the possible existence of three Precambrian phyla (sponges, molluscs, and cnidarians). But there are twenty other phyla that make their first appearance in the Cambrian, many of them far more complex than sponges. What good are the molecular methods if you can't see any of the ancestors in the rocks?

The authors admit that the Precambrian strata were capable of preserving the ancestors if they existed.

No matter how imprecise, our timescale for metazoan diversification still indicates a mismatch between the fossil evidence used to calibrate the molecular clock analyses and the resulting divergence time estimates. This is not altogether surprising since, by definition, minimum constraints of clade ages anticipate their antiquity. Nevertheless, it is the extent of this prehistory that is surprising, particularly since the conditions required for exceptional fossil preservation, so key to evidencing the existence of animal phyla in the early Cambrian, obtained also in the Ediacaran.
The only way they can maintain their belief that the ancestors are way back earlier is to discount the fossil evidence as "negative evidence" and to put their trust in the molecular evidence. But how can they trust it, when the answers vary all over the place, depending on the methods used? One clever method is called "rate variation." Would you trust a clock that has a variable rate? How about one fast-ticking clock for one animal, and a slow-ticking clock for another?

When rate variation across a phylogeny is extreme (that is, when the molecular clock is seriously violated), the rates calculated on one part of the phylogeny will serve as a poor proxy for estimating divergence times in other parts of the tree. In such instances, divergence time estimation is challenging and the analysis becomes sensitive to the rate model used.
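To make the problem concrete, here is a hedged toy simulation (invented rates and dates, not the paper's data): ten lineages that truly diverged at the same moment, each evolving at its own rate, all dated with a single clock calibrated on one of them:

```python
import numpy as np

# Toy illustration of a "seriously violated" molecular clock. All numbers
# are hypothetical; the point is only that one calibrated rate is a poor
# proxy when each lineage ticks at its own speed.
rng = np.random.default_rng(1)

true_time = 550e6  # every lineage truly diverged 550 million years ago
true_rates = rng.lognormal(mean=np.log(5e-10), sigma=0.8, size=10)

# Observed genetic distance in each lineage under its own private rate:
distances = 2.0 * true_rates * true_time

# An analyst calibrates a single clock rate on lineage 0 ...
calibrated_rate = distances[0] / (2.0 * true_time)

# ... then applies that one rate to date every other lineage:
estimates = distances / (2.0 * calibrated_rate)
print("estimated ages (Myr):", np.round(estimates / 1e6))
```

Every simulated divergence is exactly 550 million years old, yet the single-rate estimates scatter across hundreds of millions of years, which is precisely why "relaxed clock" rate models, and the sensitivity they introduce, become unavoidable.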
They try their trees with steady rates and with varying rates ("relaxed clock models" -- amusing term). They try data partitioning. They try Bayesian analysis. None of them agree. Meyer discussed molecular clock problems in detail in Chapter 5 of Darwin's Doubt. There's nothing new here. "Here we show that the precision of molecular clock estimates of times has been grossly over-estimated," they conclude. "....An evolutionary timescale for metazoan diversification that accommodates these uncertainties has precision that is insufficient to discriminate among causal hypotheses." In the end, these evolutionists have to admit that fossils would be much, much better:

Above all, establishing unequivocal evidence for the presence of metazoan clades in the late Neoproterozoic, as well as for the absence in more ancient strata, will probably have more impact than any methodological advance in improving the accuracy and precision of divergence time estimates for deep metazoan phylogeny. Realizing the aim of a timescale of early animal evolution that is not merely accurate, but sufficiently precise to effect tests of hypotheses on the causes and consequences of early animal evolution, will require improved models of trait evolution and improved algorithms to allow analysis of genome-scale sequence data in tandem with morphological characters.
Wait a minute; isn't that what Darwin provided? -- a model of trait evolution? Wasn't it natural selection of gradual variations? Let's parse this interesting quote that mentions Darwin:

The timing of the emergence of animals has troubled evolutionary biologists at least since Darwin, who was sufficiently incredulous that he considered the abrupt appearance of animal fossils in the Cambrian as a challenge to his theory of evolution by natural selection. There has been, as a result, a long history of attempts to rationalize a rapid radiation of animals through theories of non-uniform evolutionary processes, such as homeotic mutations, removal of environmental restrictions on larger body sizes, through to the assembly of gene regulation kernels --- proposed both as an explanation for rapid rates of innovation followed by subsequent constraint against fundamental innovation of new body plans after the Cambrian. Indeed, there have been explicit attempts to accommodate rapid rates of phenotypic evolution in the early Cambrian, compatible with these hypotheses and a semi-literal (albeit phylogenetically constrained) reading of the fossil record.
And yet our results, as have others before them, suggest that there is no justification for invoking non-uniform mechanisms to explain the emergence of animals and their phylum-level body plans.

That phrase "semi-literal (albeit phylogenetically constrained) reading of the fossil record" is curious. How else are you supposed to read it? They are saying that you have to read the fossil record with Darwin-colored glasses to see it correctly.
But they're trying to have it both ways. They want a slow-and-gradual fuse leading up to the Cambrian explosion (disliking "non-uniform evolutionary processes"), which requires a non-literal reading of the fossil record with Darwin glasses on, but they can't take the molecular data literally either, because it is so method-dependent. You can almost hear them crying out for fossils. As Meyer's book shows, the fossil record is more explosive now than it was in Darwin's time.

The Information Enigma Again

Notice how they mention "the emergence of animals and their phylum-level body plans." How do you get the information to build a phylum-level body plan? Once again, these authors ignore the information issue completely. They say, "Much of the molecular genetic toolkit required for animal development originated deep in eukaryote evolutionary history," skirting past that with a lateral reference to a paper about a microbe that had no animal body plan. Talk of "emergence" just doesn't cut it. What is the source of the information to build an animal body plan composed of multiple new cell types and tissues, with 3-D organization and integrated systems like sensory organs, locomotion, and digestive tracts? Is there an evolutionist who will please answer Meyer's primary challenge?

As we've seen over and over again, many Darwinian evolutionists think they have done their job if they can just push the ancestry back in time. The fossil record doesn't allow it, but even if it did, it wouldn't solve the information problem. Calling it "emergence" is unsatisfactory. Calling it "innovation" is unsatisfactory. Calling it latent potential waiting for environmental factors like heat or oxygen is unsatisfactory. Answer the question: what is the source of the information to build twenty new animal body plans that appeared suddenly in the Cambrian without ancestors? We have an answer: intelligence. What's yours?

Sunday 14 October 2018

A clash of Titans. LXXVIII

A clash of Titans. LXXVII

Why Darwinism's crisis continues.

Education v. indoctrination.

Evolution Education — A Debater’s Perspective
Sarah Chaffee

I write a lot here about critical thinking in evolution education. Now, I want to address teaching the controversy from a pedagogical viewpoint. That is, I’m not going to touch on the scientific controversy over biological evolution. What I want to address is why one should teach evolution, or any subject, through critical thinking and not dogmatically. 


I’m a debate coach. I began competing in 2005 and have been involved in the competitive forensics world ever since. When I first read it, Discovery Institute’s Science Education Policy got my attention. It notes, in part: 

Discovery Institute seeks to increase the coverage of evolution in curriculum. It believes that evolution should be fully and completely presented to students, and they should learn more about evolutionary theory, including its unresolved issues. In other words, evolution should be taught as a scientific theory that is open to critical scrutiny, not as a sacred dogma that can’t be questioned. 

What, regardless of your viewpoint on evolution, does this style of teaching have to offer? What are the bonuses that such an approach brings with it?

Three Significant Benefits

First, the practice of critical analysis is just plain more interesting. Contrasting opposing viewpoints engages people, whether young or old. Defending a certain position in front of others develops curiosity. To stimulate the mind, there is nothing quite like researching an issue, knowing you know it, and being ready to explain it to others.

Second, critical thinking enables students to learn more. Debaters can easily spend an hour or two a day researching a topic. Compare this to your average university course — would a student spend that kind of time studying apart from completing required homework? 

One year, my debate topic centered on the Fourth Amendment. Now, I generally would have no inclination to spend hours and hours reading decisions from circuit courts and the Supreme Court — but I was excited about it because of debate.

Third, exposing an issue to critical scrutiny brings in elements of persuasion and public speaking. Even if we’re talking about a classroom setting rather than a debate round, students will be interested in raising questions and defending positions. Analysis allows students to have an opportunity to try to persuade others and to raise issues in front of a group. Not unlike in school sports, a healthy instinct for competition comes in. Learning through passive memorization and regurgitation has nothing to compare to that.

William Butler Yeats noted, “Education is not the filling of a pail, but the lighting of a fire.” Yes, filling a pail or lighting a fire: when it comes to evolution and many other subjects, that’s exactly the choice educators face.


On separating science from philosophy.

Atheist Fundamentalism and the Limits of Science
Michael Egnor October 30, 2007 4:36 PM

Juno Walker at Letters from Vrai has responded to my post Dr. Pigliucci and Fundamentalism in Science Education. Dr. Massimo Pigliucci published an essay in The McGill Journal of Education in which he made the absurd claim that effective science education would dissuade students from a belief in Heaven. I pointed out in my post that Heaven wasn't exactly a proper subject for the scientific method and that the assertion that science education was even applicable to a belief in Heaven was fundamentalism -- a kind of atheist fundamentalism. The conflation of methodological naturalism and philosophical naturalism -- science and atheism -- is no more acceptable pedagogy than the conflation of science and creationism. Atheism and creationism are philosophical inferences, and, irrespective of the truth of either faith, neither is consistent with the scientific method. The scientific method -- methodological naturalism -- is the data-driven study of nature. It's based on natural, not supernatural, claims. The irony is that the McGill Journal of Education published Dr. Pigliucci's atheist broadsheet for fundamentalism in science education, but would never publish a creationist broadsheet for fundamentalism in science education.

Walker cites Darwinist philosopher Barbara Forrest to defend the assertion that atheism is a scientifically justifiable inference. Dr. Forrest:
...the relationship between methodological and philosophical naturalism, while not one of logical entailment, is the only reasonable metaphysical conclusion, given (1) the demonstrated success of methodological naturalism, combined with (2) the massive amount of knowledge gained by it, (3) the lack of a method or epistemology for knowing the supernatural, and (4) the subsequent lack of evidence for the supernatural. The above factors together provide solid grounding for philosophical naturalism, while supernaturalism remains little more than a logical possibility.
Dr. Forrest is mistaken. The demonstrated success of methodological naturalism has no bearing on the truth or falsehood of philosophical naturalism, because the assertion of philosophical naturalism (there are no extra-natural things) is outside the purview of methodological naturalism (the study of natural things). Methodological naturalism is defined by its inability to adjudicate extra-natural questions.
Dr. Forrest's claim (3) that philosophical naturalism must be true because of "the lack of a method or epistemology for knowing the supernatural" is nonsense. The methods for knowing the supernatural are by definition beyond the scope of methodological naturalism and are properly philosophical methods, not scientific methods. Forrest's implicit assertion that there is no philosophical "method or epistemology for knowing the supernatural" is an assertion that two and a half millennia of Western philosophy don't exist. What of Platonic Forms, Thomist proofs for the existence of God, Anselm's and Descartes' and Plantinga's Ontological Arguments, and Kant's Argument From Morality? It's safe to say that most of Western philosophy addresses issues that transcend our direct experience of the natural world. Ironically, Forrest's use of the scientific method to assert that the supernatural world doesn't exist employs one of the few philosophical methodologies that can't address questions outside of the natural world.

Methodological naturalism -- the scientific method -- precludes all extra-natural philosophical constraints on interpretation of physical data. That's the point of methodological naturalism -- the method of data collection and interpretation must be without extra-natural assumptions. Colloquially, methodological naturalism is 'following the physical evidence, unencumbered by extra-natural inference.' The design inference is based on evidence about the natural world. It is a violation of methodological naturalism to categorically exclude the design inference based on the postulate that the supernatural does not exist. The scientific method hews to evidence, not to philosophical dogma.

The approach to science in the era before the scientific method, much like the approach of atheists and Darwinists today, was to apply a priori philosophical constraints to the study of natural phenomena. The ancients modeled planetary motion as perfect circles because of the philosophical assumption that heavenly bodies must move 'perfectly,' and non-circular motion was considered imperfect and thus impermissible. Johannes Kepler's laws of elliptical planetary motion were an early triumph of the scientific method because Kepler discarded philosophical dogma and considered only the evidence. Of course, Kepler was a devout Christian (as were nearly all Enlightenment scientists), and he interpreted the laws of planetary motion as God's geometrical plan for the universe. Philosophical constraints -- a priori constraints -- on interpretation of data are inconsistent with the scientific method, but philosophical reflection on the data isn't. Newton derived his laws of motion from mathematical considerations and from data, yet he believed that the fabric of space and time in which the laws acted was the mind of God. Philosophical reflection on scientific data -- including reflection on supernatural causation -- has a long and quite honorable history.

So what of Forrest's fourth claim: that the truth of philosophical naturalism is supported by "the subsequent lack of evidence for the supernatural"? It's a bizarre inference, as divorced from empirical evidence as could be imagined. The past several centuries of Western science have revealed a universe created ex nihilo, governed by astonishingly intricate mathematical laws accessible to the human mind and characterized by properties of forces and energy and matter so closely tied to the existence of human life that cosmologists have had to invoke the existence of countless other universes to elide the anthropic implications. Life itself depends on a code -- remarkably like a computer language -- to produce, run and replicate cellular components that are themselves best described as intricate nanotechnology.

Here's the atheist interpretation of this scientific evidence: atheism is the only permissible explanation. Atheists are entitled to their opinion, but they have no business teaching students that atheist fundamentalism defines the limits of science.

Mary Shelley is seeming more and more oracular.

With Gene Editing, Scientists Perilously Push Borders of Biotechnology

The gene editing technology CRISPR and other biological laboratory manipulations have been used to manufacture mice with two biological fathers and two biological mothers. From the STAT story:

For the first time, scientists said Thursday that they had bred mice with two genetic fathers, steering around biological hurdles that would otherwise prevent same-sex parents from having offspring.

The researchers also bred mouse pups with two genetic mothers. Those pups matured into adults and had pups of their own, outpacing previous efforts to create so-called bimaternal mice.

“This research shows us what’s possible,” Wei Li, a senior author of the study, said in a statement. Li conducted the work with colleagues at the Chinese Academy of Sciences.

Portentous Technologies

Such manipulations, if ever done in humans, could have a profound impact on human society down the generations.

Beyond the technical, legal, and ethical roadblocks that would prevent this type of research in people, experts pointed to another concern. If researchers created, say, a daughter from two mothers or two fathers, and if she were healthy and had children of her own, it is unknown what genetic ramifications might be passed onto the next generation.

These are extremely portentous technologies. But existing laws and regulations that govern the sector — which were created when our scientific prowess was less sophisticated — are quickly becoming inadequate to ensure that proper parameters are maintained to guide the development and direction of what I believe will become the most powerful technologies ever invented.

Relying on voluntary ethical guidelines created by scientists to maintain proper ethical and safety boundaries — pretty much the situation now beyond some public funding limitations — is not a policy. It is an abdication of public responsibility.


Look: When scientists split the atom, our leaders did not just sit around slack-jawed and let the sector develop as it would. They engaged. They created laws, regulations, and international protocols to govern our use of atomic energy to maximize the benefit and reduce the danger. Surely we should do no less with biotechnology, which will have far more profound and far reaching impacts on human history.