Search This Blog

Sunday 28 October 2018

On the origins debate and Jehovah's transcendent technology.

Pearcey: “Stephen Hawking’s Final Salvo”

Before his death back in March, I found that every new prophetic statement from Stephen Hawking, invariably heralded by the media, was bound to make me sad. It seemed clear that his famous name was being exploited to promote lugubrious causes having nothing to do with the genuine source of his scientific renown. And this still goes on, even with Dr. Hawking having passed on to the next world.

A new book is out under his name, Brief Answers to the Big Questions, “published posthumously with material pulled from interviews, essays, speeches, and questions often asked of the famous physicist,” as Center for Science & Culture fellow Nancy Pearcey writes at CNS News. She responds to the book’s most touted claim. The first chapter asks, “Is there a God?” Answer: No.

 “I think the universe was spontaneously created out of nothing, according to the laws of science.” After all, [Hawking] argues, “If you accept, as I do, that the laws of nature are fixed, then it doesn’t take long to ask: What role is there for God?”

An Open Cosmos

Pearcey replies:

Is Hawking right that scientific laws rule out any role for God? Despite being a brilliant physicist, he seemed unaware that his objection has already been answered — most famously by the popular literature professor C.S. Lewis, himself a former atheist, who taught at both Oxford and Cambridge University.

In his book Miracles, Lewis concedes that, at first glance, the regularity of nature does seem to rule out the possibility that God is able to act in the world.

But not so fast. Natural laws tell us only what happens if nothing interferes. People playing a game of pool are applying the laws of physics, which decree that when a billiard ball hits another one, the second ball will start moving. But the laws do not tell us what will happen if a mischievous child grabs the ball.

The laws are still true, of course, but the child has interfered with the physics.

Humans interfere with natural processes all the time, yet we do not break any laws of nature. We cut down trees to make houses, we weave plant fiber into cloth, we smelt metals to build bridges, we turn sand into silicon chips for computers. Through technology, we are constantly accomplishing things that nature on its own could not produce.

But do we break a single law of nature? No.

“All interferences leave the law perfectly true,” Lewis points out. And it’s the same when God acts in the world: He does not need to break any scientific laws. The cosmos is open to the actions of creative humans and a creator God.

A better way to understand miracles, Lewis writes, is that they feed new events into the existing structure of nature: “The divine art of miracle is not an art of suspending the pattern to which events conform but of feeding new events into that pattern.”

Pearcey offers the helpful image of the universe not as an inviolable clockwork but as a musical instrument, implying the need for an artist’s hand in designing and creating the instrument and in playing it.

Dembski Expands on a Helpful Image

William Dembski has written about this in some detail. A musical instrument without someone to play it is incomplete:

Granted, if the universe is like a clockwork (cf. the design arguments of the British natural theologians), then it would be inappropriate for God, who presumably is a consummate designer, to intervene periodically to adjust the clock. Instead of periodically giving the universe the gift of “clock-winding and clock-setting,” God should simply have created a universe that never needed winding or setting. But what if instead the universe is like a musical instrument (cf. the design arguments of the Church Fathers, like Gregory of Nazianzus, who compared the universe to a lute — in this respect I much prefer the design arguments of the early Church to the design arguments of the British natural theologians)? Then it is entirely appropriate for God to interact with the universe by introducing design (or in this analogy, by skillfully playing a musical instrument). Change the metaphor from a clockwork to a musical instrument, and the charge of “withholding gifts” dissolves. So long as there are consummate pianists and composers, player-pianos will always remain inferior to real pianos. The incompleteness of the real piano taken by itself is therefore irrelevant here. Musical instruments require a musician to complete them. Thus, if the universe is more like a musical instrument than a clock, it is appropriate for a designer to interact with it in ways that affect its physical state.

The “clockwork” notion is one that ID critics adore because it makes design appear naïve and clumsy. If you look up intelligent design on Wikipedia, you’ll see that the editors have used the insides of a pocket watch to brand their series of (highly misleading) articles on ID. If it were actually possible to edit Wikipedia, I’d substitute a different image; among musical instruments, a lute seems as good a choice as any.


How our Star sheds light on the origins debate

Denton Turns Sagan’s “Humdrum Star” on Its Head

On a new ID the Future episode, biologist Michael Denton talks with host Sarah Chaffee about the remarkable fitness of a range of properties seen in water and in light. Download the podcast or listen to it here. It’s particularly satisfying to hear Denton turn a frequently heard dismissal of our Sun on its head.

How Silly the Old People Were

Carl Sagan in Cosmos gave the idea perhaps its most iconic expression. In Episode 7, he lectures to a classroom of Brooklyn schoolchildren about how it was once thought that Earth occupied an “important” place in the cosmos. Tut-tut, how silly the old people were, before we realized that our solar system is not at the center of things but way out on the edge of the galaxy, thereby guaranteeing (this leap doesn’t quite follow) our cosmic insignificance. He goes on to ponder:

For as long as there have been humans we have searched for our place in the cosmos. Where are we? Who are we? We find that we live on an insignificant planet of a humdrum star lost in a galaxy tucked away in some forgotten corner of a universe in which there are far more galaxies than people.

Yet the fact that the Sun is “humdrum” or “ordinary” may be the most extraordinary thing about it. Dr. Denton discusses the fortunate circumstances characterizing visual light, needed for vision and photosynthesis. He concludes:

The big deal is that being an ordinary star means that the vast majority of stars in the cosmos put out their energy in the visual and infrared bands of the electromagnetic spectrum. So being “ordinary,” what this means is that the universe — it’s somewhat ironic and it seems a bit counterintuitive — it means the universe is profoundly fit for biological systems like ourselves. The Sun is an ordinary star, and that’s a very big deal. The universe, as I describe in Children of Light, is flooded with the light of life.

A Double Dilemma

“Flooded with the light of life”: what a beautiful way of putting it. You could add that it creates a double dilemma for materialists. The universe is “flooded with the light of life,” in Denton’s apt phrase. Yet so far as we know, life on Earth is a singularity. This is certainly counter to Carl Sagan’s expectations. Yet it’s possible that Sagan, Stephen Hawking, and others may prove right about the cosmos being home to many forms of extraterrestrial life, including intelligent forms. Only time will tell. If Sagan was correct, then Denton’s “ironic” observation about light’s uncannily special fitness for life, life like ours, really comes to the fore.

Light was finely tuned for life, whether in a multitude of homes in the cosmos or in just one. Either way, we’re prompted to ask the same question: finely tuned by whom? And why?

Mssrs. Denton and Berlinski v. Darwinism II

Listen: Why Does Darwinism Hang On?
Evolution News @DiscoveryCSC

On an episode of ID The Future, philosopher and author David Berlinski joins geneticist and researcher Michael Denton for continued discussion on the debate over Darwinian evolution. Evolution News editor David Klinghoffer asks: Why has the theory persisted? What weaknesses threaten its existence in the 21st century?


Listen in as Berlinski and Denton explain why the Darwinian mechanism is being widely questioned as a viable theory of the origin and development of life. As Berlinski puts it: “Applying Darwinian principles to problems of this level of complexity is like putting a Band-Aid on a wound caused by an atomic weapon. It’s just not going to work.”

Sunday 21 October 2018

And still even yet more on the real world's anti-Darwinian bias.

The Blink of an Eye — And More Wonders of Human Body Design
Evolution News @DiscoveryCSC

Darwinists insist on your body’s “poor design.” By contrast, a prediction of intelligent design for biology is that phenomena should appear more functionally complex and elegantly coordinated the closer one looks. Evolutionist cavils aside, this is certainly true for the human body.

Vision

We blink all day long, once about every 5 seconds. Why doesn’t the world go dark in those moments? How do we perceive a continuous image? “The brain seems to remember the percepts that have just happened,” an article in Science Daily says. “Scientists have now identified a brain area that plays a crucial role in perceptual memory.” At the German Primate Center, they found the spot where visual information is stored briefly during each eye blink. “Our research shows that the medial prefrontal cortex calibrates current visual information with previously obtained information and thus enables us to perceive the world with more stability, even when we briefly close our eyes to blink,” says the lead author of a paper in Current Biology. This implies a process of calibration, interpolation, and calculation happening literally in the blink of an eye.

“Ghost images” can be detected by the human vision system, an article in Live Science says. These are not the direct images we are familiar with, but rather computed images arrived at by multiplying the light in each pixel from a projected scene onto a background scene. This product yields the ghost image, but it can only be perceived under the right circumstances.

Experiments with participants viewing a superimposed checkerboard on a background photo showed that the ability to see the ghost image is a function of the eye’s refresh rate. The photo only became visible when a single-pixel detector collected the light from each pixel and then fed it into a projector at the right speed. Reporter Stephanie Pappas says this is akin to the optical illusion of seeing a movie when the frame rate matches the eye’s refresh rate. 

The reason this works, [Daniele] Faccio said, is that the human eye has a slow refresh rate. It’s not unlike the reason that movies work: When images are flickered on the screen faster than this refresh rate, it creates the illusion of smooth movement.

The eye “is very fast in acquiring the information,” Faccio said. “It’s just very slow in getting rid of it.”

The researchers figured out that the flickering patterns remained in the eye’s “memory” for about 20 milliseconds, slowly fading out over that time. If the 20-millisecond patterns overlap, the eye sums them up like a movie, allowing the ghost image to emerge.

Although ghost images are unlikely to appear in natural settings, the experiments provide a new way for neuroscientists to understand vision. For design advocates, they open windows into the moment-by-moment calculations that our eyes and brain have to perform to give us a smooth picture.

Olfaction

What we perceive as odors are collections of molecules. Why similar molecules produce very different perceptions of smell has long been a puzzle to physiologists, making odor classification difficult. Is there a way to classify odors, the way audible tones can be classified by frequency? The Salk Institute looked into this “unsolved problem,” hoping to find a pattern that might allow scientists to predict how a molecule (or combination of molecules) would smell. They found that mapping molecules in 2D was too simplistic. Patterns emerged only when they mapped the molecules onto a hyperboloid, a shape similar to a Pringles potato chip.

When the team looked at how the molecules clustered on this surface, they found there were pleasant and unpleasant directions, as well as directions that correlated with acidity or how easily odors evaporate from surfaces. These observations now make it easier to construct pleasant odor mixtures to use, for example, in artificial environments (such as a space station).

The paper in Science Advances explains why this pattern makes sense in nature:

In the natural environment, the sense of smell, or olfaction, serves to detect toxins and judge nutritional content by taking advantage of the associations between compounds as they are created in biochemical reactions. This suggests that the nervous system can classify odors based on statistics of their co-occurrence within natural mixtures rather than from the chemical structures of the ligands themselves.

Meanwhile, at the Monell Center in Philadelphia, scientists learned something new about “mysterious sensory cells” in the nose adjacent to the olfactory sensory neurons, called microvillous cells. “The findings suggest that the so-called microvillous cells (MVCs) may protect the vulnerable olfactory epithelium by detecting and initiating defenses against viruses, bacteria, and other potentially harmful invaders.” MVCs may also help regenerate damaged cells in the olfactory epithelium. This suggests “multiple levels of protection in the airways,” says a co-author of the paper in PLOS ONE, which, fortunately, appears more interested in the function of these cells than in their evolution.

Mother’s Milk

Two news items show the benefits of human milk for newborns. Scientists at the University of Edinburgh find that “Breast milk may help babies’ brain development.” Pre-term babies showed better brain development when fed breast milk instead of formula. Brain scans were performed at 40 weeks on 47 pre-term babies that had been delivered at 33 weeks.

The team also collected information about how the infants had been fed while in intensive care — either formula milk or breast milk from either the mother or a donor.

Babies who exclusively received breast milk for at least three-quarters of the days they spent in hospital showed improved brain connectivity compared with others.

The effects were greatest in babies who were fed breast milk for a greater proportion of their time spent in intensive care.

HealthDay reports on a paper in the journal Pediatrics where scientists compared the “healthy weight trajectory” of babies who were fed directly from the breast, or by pumping, or by formula. Healthy weight gain was best for babies fed directly from the breast, the scientists found. Researchers stressed that breast milk, in any form, is better than formula, the article notes, but said the findings support the notion that the method of feeding matters, too. Often this is difficult for mothers having to work. A healthy society should promote this natural function of mother and baby.

Liver

The liver is appropriately named; it wants to live. It has an uncanny ability to regenerate itself: if damaged, it can regrow 70 percent of its mass in a few weeks and function like new. Researchers at the University of Illinois wondered how it does that at the molecular level. The secret involves signaling and alternative splicing that puts the liver back into a neonatal state:

“We found that the liver cells after birth use a specific RNA-binding protein called ESRP2 to generate the right assortment of alternatively spliced RNAs that can produce the protein products necessary for meeting the functional demands of the adult liver,” said graduate student Sushant Bangru, the lead author of the study. “When damaged, the liver cells lower the quantity of ESRP2 protein. This reactivates fetal RNA splicing in what is called the ‘Hippo signaling pathway,’ giving it instructions about how to restore and repopulate the liver with new and healthy cells.”

Conclusions

Many non-living objects, such as crystals and metals, look the same practically all the way down. Living things, by contrast, become more wondrous as you zoom in from organism to organ to tissue to cell to nucleus. New imaging techniques are bringing wonders into focus that were unimaginable for most of human history. If ever there was a heyday for intelligent design, this is it.

Mssrs. Denton and Berlinski v. Darwinism.

Denton, Berlinski: Primary Objections to Neo-Darwinism
Evolution News @DiscoveryCSC

If you had an opportunity to confute and confound your Darwinist friends, but a limited amount of time in which to do it, what challenge would you put to them? Discovery Institute Senior Fellows David Berlinski and Michael Denton are both long-time critics of neo-Darwinism. On a classic episode of ID the Future, responding to this query from Evolution News editor David Klinghoffer, they discuss their primary objections to neo-Darwinian theory.

Download the podcast or listen to it here.

For Berlinski, a mathematician and author of The Deniable Darwin, the problem is quantitative and methodological. For Denton, a geneticist and author of the new Discovery Institute Press book Children of Light: The Astonishing Properties of Light that Make Us Possible, the problem is empirical. Don’t miss this engaging discussion.

Darwinian spin doctors' rose-colored spectacles are of no help with the Cambrian explosion.

"Molecular Clock" Can't Save Darwin from the Cambrian Explosion
Evolution News & Views October 28, 2015 3:59 AM


Current Biology has published yet another attempt to explain away the Cambrian explosion. On reading certain parts, you might think the authors, including Maximilian Telford, Philip Donoghue, and Ziheng Yang, have solved the problem. Indeed, their first Highlight in the paper summary claims, "Molecular clock analysis indicates an ancient origin of animals in the Cryogenian." (Cryogenian refers to the Precambrian "cold birth" era about 720 to 635 million years ago.) By itself that statement would be very misleading, because the title of the open-access paper is pessimistic: "Uncertainty in the Timing of Origin of Animals and the Limits of Precision in Molecular Timescales."

Yang appeared briefly in Stephen Meyer's book Darwin's Doubt with bad news. Meyer cited a paper Yang co-authored with Aris-Brosou in 2011 showing that molecular clock analyses are unreliable. They "found that depending on which genes and which estimation methods were employed, the last common ancestor of protostomes or deuterostomes (two broadly different types of Cambrian animals) might have lived anywhere between 452 million years and 2 billion years ago" (Meyer, p. 106).

Nothing has changed since then. The bottom line after a lot of wrangling with numbers, strategies, and analyses is that all current methods of dating the ancestors of the Cambrian animals from molecular clocks are imprecise and uncertain. They cannot be trusted to defuse the explosion by rooting the animal ancestors earlier in the Precambrian.

Although a Cryogenian origin of crown Metazoa agrees with current geological interpretations, the divergence dates of the bilaterians remain controversial. Thus, attempts to build evolutionary narratives of early animal evolution based on molecular clock timescales appear to be premature. [Emphasis added.]

Check out the euphemisms. Translated into plain English, it means, "We can't tell our favorite evolutionary story because the clock is broken, but we're working on it."

In this new paper, they provide the latest and greatest analysis of molecular clock data so far. It's clear they believe that all the data place the root of the divergence in the Ediacaran or earlier, 100 million years or more before the Cambrian, but can they really defend their belief? They have to admit severe empirical limits:

Here we use an unprecedented amount of molecular data, combined with four fossil calibration strategies (reflecting disparate and controversial interpretations of the metazoan fossil record) to obtain Bayesian estimates of metazoan divergence times. Our results indicate that the uncertain nature of ancient fossils and violations of the molecular clock impose a limit on the precision that can be achieved in estimates of ancient molecular timescales.

Perhaps, a defender might interrupt, the precision, admittedly limited, is good enough. But then, there are those pesky fossils! The molecular clocks are fuzzily in agreement about ancestors in the Precambrian, but none of them have support from the very best observational evidence: the record of the rocks. Even the phyla claimed to exist before the explosion are contested:

Unequivocal fossil evidence of animals is limited to the Phanerozoic [i.e., the modern eon from Cambrian to recent, where animals are plentiful]. Older records of animals are controversial: organic biomarkers indicative of demosponges are apparently derived ultimately from now symbiotic bacteria; putative animal embryo fossils are alternately interpreted as protists; and contested reports of sponges, molluscs, and innumerable cnidarians, as well as putative traces of eumetazoan or bilaterian grade animals, all from the Ediacaran. Certainly, there are no unequivocal records of crown-group bilaterians prior to the Cambrian, and robust evidence for bilaterian phyla does not occur until some 20 million years into the Cambrian.

This severely limits their ability to "calibrate" the molecular clock. Meyer granted the possible existence of three Precambrian phyla (sponges, molluscs, and cnidarians). But there are twenty other phyla that make their first appearance in the Cambrian, many of them far more complex than sponges. What good are the molecular methods if you can't see any of the ancestors in the rocks?

The authors admit that the Precambrian strata were capable of preserving the ancestors if they existed.

No matter how imprecise, our timescale for metazoan diversification still indicates a mismatch between the fossil evidence used to calibrate the molecular clock analyses and the resulting divergence time estimates. This is not altogether surprising since, by definition, minimum constraints of clade ages anticipate their antiquity. Nevertheless, it is the extent of this prehistory that is surprising, particularly since the conditions required for exceptional fossil preservation, so key to evidencing the existence of animal phyla in the early Cambrian, obtained also in the Ediacaran.

The only way they can maintain their belief that the ancestors are way back earlier is to discount the fossil evidence as "negative evidence" and to put their trust in the molecular evidence. But how can they trust it, when the answers vary all over the place, depending on the methods used? One clever method is called "rate variation." Would you trust a clock that has a variable rate? How about one fast-ticking clock for one animal, and a slow-ticking clock for another?

When rate variation across a phylogeny is extreme (that is, when the molecular clock is seriously violated), the rates calculated on one part of the phylogeny will serve as a poor proxy for estimating divergence times in other parts of the tree. In such instances, divergence time estimation is challenging and the analysis becomes sensitive to the rate model used.

They try their trees with steady rates and with varying rates ("relaxed clock models" -- amusing term). They try data partitioning. They try Bayesian analysis. None of them agree. Meyer discussed molecular clock problems in detail in Chapter 5 of Darwin's Doubt. There's nothing new here. "Here we show that the precision of molecular clock estimates of times has been grossly over-estimated," they conclude. "... An evolutionary timescale for metazoan diversification that accommodates these uncertainties has precision that is insufficient to discriminate among causal hypotheses." In the end, these evolutionists have to admit that fossils would be much, much better:

Above all, establishing unequivocal evidence for the presence of metazoan clades in the late Neoproterozoic, as well as for the absence in more ancient strata, will probably have more impact than any methodological advance in improving the accuracy and precision of divergence time estimates for deep metazoan phylogeny. Realizing the aim of a timescale of early animal evolution that is not merely accurate, but sufficiently precise to effect tests of hypotheses on the causes and consequences of early animal evolution, will require improved models of trait evolution and improved algorithms to allow analysis of genome-scale sequence data in tandem with morphological characters.

Wait a minute; isn't that what Darwin provided? -- a model of trait evolution? Wasn't it natural selection of gradual variations? Let's parse this interesting quote that mentions Darwin:

The timing of the emergence of animals has troubled evolutionary biologists at least since Darwin, who was sufficiently incredulous that he considered the abrupt appearance of animal fossils in the Cambrian as a challenge to his theory of evolution by natural selection. There has been, as a result, a long history of attempts to rationalize a rapid radiation of animals through theories of non-uniform evolutionary processes, such as homeotic mutations, removal of environmental restrictions on larger body sizes, through to the assembly of gene regulation kernels -- proposed both as an explanation for rapid rates of innovation followed by subsequent constraint against fundamental innovation of new body plans after the Cambrian. Indeed, there have been explicit attempts to accommodate rapid rates of phenotypic evolution in the early Cambrian, compatible with these hypotheses and a semi-literal (albeit phylogenetically constrained) reading of the fossil record.
And yet our results, as have others before them, suggest that there is no justification for invoking non-uniform mechanisms to explain the emergence of animals and their phylum-level body plans.

That phrase "semi-literal (albeit phylogenetically constrained) reading of the fossil record" is curious. How else are you supposed to read it? They are saying that you have to read the fossil record with Darwin-colored glasses to see it correctly.

But they're trying to have it both ways. They want a slow-and-gradual fuse leading up to the Cambrian explosion (disliking "non-uniform evolutionary processes"), which requires a non-literal reading of the fossil record with Darwin glasses on, but they can't take the molecular data literally either, because it is so method-dependent. You can almost hear them crying out for fossils. As Meyer's book shows, the fossil record is more explosive now than it was in Darwin's time.

The Information Enigma Again

Notice how they mention "the emergence of animals and their phylum-level body plans." How do you get the information to build a phylum-level body plan? Once again, these authors ignore the information issue completely. They say, "Much of the molecular genetic toolkit required for animal development originated deep in eukaryote evolutionary history," skirting past that with a lateral reference to a paper about a microbe that had no animal body plan. Talk of "emergence" just doesn't cut it. What is the source of the information to build an animal body plan composed of multiple new cell types and tissues, with 3-D organization and integrated systems like sensory organs, locomotion, and digestive tracts? Is there an evolutionist who will please answer Meyer's primary challenge?

As we've seen over and over again, many Darwinian evolutionists think they have done their job if they can just push the ancestry back in time. The fossil record doesn't allow it, but even if it did, it wouldn't solve the information problem. Calling it "emergence" is unsatisfactory. Calling it "innovation" is unsatisfactory. Calling it latent potential waiting for environmental factors like heat or oxygen is unsatisfactory. Answer the question: what is the source of the information to build twenty new animal body plans that appeared suddenly in the Cambrian without ancestors? We have an answer: intelligence. What's yours?

Sunday 14 October 2018

A clash of Titans. LXXVIII

A clash of Titans. LXXVII

Why Darwinism's crisis continues.

Education v. indoctrination.

Evolution Education — A Debater’s Perspective
Sarah Chaffee

I write a lot here about critical thinking in evolution education. Now, I want to address teaching the controversy from a pedagogical viewpoint. That is, I’m not going to touch on the scientific controversy over biological evolution. What I want to address is why one should teach evolution, or any subject, through critical thinking and not dogmatically. 


I’m a debate coach. I began competing in 2005 and have been involved in the competitive forensics world ever since. When I first read it, Discovery Institute’s Science Education Policy got my attention. It notes, in part: 

Discovery Institute seeks to increase the coverage of evolution in curriculum. It believes that evolution should be fully and completely presented to students, and they should learn more about evolutionary theory, including its unresolved issues. In other words, evolution should be taught as a scientific theory that is open to critical scrutiny, not as a sacred dogma that can’t be questioned. 

What, regardless of your viewpoint on evolution, does this style of teaching have to offer? What are the bonuses that such an approach brings with it?

Three Significant Benefits

First, the practice of critical analysis is just plain more interesting. Contrasting opposing viewpoints engages people, whether young or old. Defending a position in front of others develops curiosity. To stimulate the mind, there is nothing quite like researching an issue, knowing you know it, and being ready to explain it to others.

Second, critical thinking enables students to learn more. Debaters can easily spend an hour or two a day researching a topic. Compare this to your average university course — would a student spend that kind of time studying apart from completing required homework? 

One year, my debate topic centered on the Fourth Amendment. Now, I generally would have no inclination to spend hours and hours reading decisions from circuit courts and the Supreme Court — but I was excited about it because of debate.

Third, exposing an issue to critical scrutiny brings in elements of persuasion and public speaking. Even if we’re talking about a classroom setting rather than a debate round, students will be interested in raising questions and defending positions. Analysis gives students an opportunity to try to persuade others and to raise issues in front of a group. Not unlike in school sports, a healthy instinct for competition comes in. Passive memorization and regurgitation have nothing to compare with that.

William Butler Yeats noted, “Education is not the filling of a pail, but the lighting of a fire.” Yes, filling a pail or lighting a fire: when it comes to evolution and many other subjects, that’s exactly the choice educators face.


On separating science from philosophy.

Atheist Fundamentalism and the Limits of Science
Michael Egnor October 30, 2007 4:36 PM

Juno Walker at Letters from Vrai has responded to my post Dr. Pigliucci and Fundamentalism in Science Education. Dr. Massimo Pigliucci published an essay in The McGill Journal of Education in which he made the absurd claim that effective science education would dissuade students from a belief in Heaven. I pointed out in my post that Heaven wasn't exactly a proper subject for the scientific method and that the assertion that science education was even applicable to a belief in Heaven was fundamentalism -- a kind of atheist fundamentalism. The conflation of methodological naturalism and philosophical naturalism -- science and atheism -- is no more acceptable pedagogy than the conflation of science and creationism. Atheism and creationism are philosophical inferences, and, irrespective of the truth of either faith, neither is consistent with the scientific method. The scientific method -- methodological naturalism -- is the data-driven study of nature. It's based on natural, not supernatural, claims. The irony is that the McGill Journal of Education published Dr. Pigliucci's atheist broadsheet for fundamentalism in science education, but would never publish a creationist broadsheet for fundamentalism in science education.

Walker cites Darwinist philosopher Barbara Forrest to defend the assertion that atheism is a scientifically justifiable inference. Dr. Forrest:
...the relationship between methodological and philosophical naturalism, while not one of logical entailment, is the only reasonable metaphysical conclusion, given (1) the demonstrated success of methodological naturalism, combined with (2) the massive amount of knowledge gained by it, (3) the lack of a method or epistemology for knowing the supernatural, and (4) the subsequent lack of evidence for the supernatural. The above factors together provide solid grounding for philosophical naturalism, while supernaturalism remains little more than a logical possibility.
Dr. Forrest is mistaken. The demonstrated success of methodological naturalism has no bearing on the truth or falsehood of philosophical naturalism, because the assertion of philosophical naturalism (there are no extra-natural things) is outside the purview of methodological naturalism (the study of natural things). Methodological naturalism is defined by its inability to adjudicate extra-natural questions.
Dr. Forrest's claim (3) that philosophical naturalism must be true because of "the lack of a method or epistemology for knowing the supernatural" is nonsense. The methods for knowing the supernatural are by definition beyond the scope of methodological naturalism and are properly philosophical methods, not scientific methods. Forrest's implicit assertion that there is no philosophical "method or epistemology for knowing the supernatural" is an assertion that two and a half millennia of Western philosophy don't exist. What of Platonic Forms, Thomist proofs for the existence of God, Anselm's and Descartes' and Plantinga's Ontological Arguments, and Kant's Argument From Morality? It's safe to say that most of Western philosophy addresses issues that transcend our direct experience of the natural world. Ironically, Forrest's use of the scientific method to assert that the supernatural world doesn't exist employs one of the few philosophical methodologies that can't address questions outside of the natural world.

Methodological naturalism -- the scientific method -- precludes all extra-natural philosophical constraints on interpretation of physical data. That's the point of methodological naturalism -- the method of data collection and interpretation must be without extra-natural assumptions. Colloquially, methodological naturalism is 'following the physical evidence, unencumbered by extra-natural inference.' The design inference is based on evidence about the natural world. It is a violation of methodological naturalism to categorically exclude the design inference based on the postulate that the supernatural does not exist. The scientific method hews to evidence, not to philosophical dogma.

The approach to science in the era before the scientific method, much like the approach of atheists and Darwinists today, was to apply a priori philosophical constraints to the study of natural phenomena. The ancients modeled planetary motion as perfect circles because of the philosophical assumption that heavenly bodies must move 'perfectly,' and non-circular motion was considered imperfect and thus impermissible. Johannes Kepler's laws of elliptical planetary motion were an early triumph of the scientific method because Kepler discarded philosophical dogma and considered only the evidence. Of course, Kepler was a devout Christian (as were nearly all Enlightenment scientists), and he interpreted the laws of planetary motion as God's geometrical plan for the universe. Philosophical constraints -- a priori constraints -- on interpretation of data are inconsistent with the scientific method, but philosophical reflection on the data isn't. Newton derived his laws of motion from mathematical considerations and from data, yet he believed that the fabric of space and time in which the laws acted was the mind of God. Philosophical reflection on scientific data -- including reflection on supernatural causation -- has a long and quite honorable history.

So what of Forrest's fourth claim: that the truth of philosophical naturalism is supported by "the subsequent lack of evidence for the supernatural"? It's a bizarre inference, as divorced from empirical evidence as could be imagined. The past several centuries of Western science have revealed a universe created ex nihilo, governed by astonishingly intricate mathematical laws accessible to the human mind and characterized by properties of forces and energy and matter so closely tied to the existence of human life that cosmologists have had to invoke the existence of countless other universes to elide the anthropic implications. Life itself depends on a code -- remarkably like a computer language -- to produce, run and replicate cellular components that are themselves best described as intricate nanotechnology.

Here's the atheist interpretation of this scientific evidence: atheism is the only permissible explanation. Atheists are entitled to their opinion, but they have no business teaching students that atheist fundamentalism defines the limits of science.

Mary Shelley seems more and more oracular.

With Gene Editing, Scientists Perilously Push Borders of Biotechnology

The gene editing technology CRISPR and other biological laboratory manipulations have been used to manufacture mice with two biological fathers and two biological mothers. From the STAT story:

For the first time, scientists said Thursday that they had bred mice with two genetic fathers, steering around biological hurdles that would otherwise prevent same-sex parents from having offspring.

The researchers also bred mouse pups with two genetic mothers. Those pups matured into adults and had pups of their own, outpacing previous efforts to create so-called bimaternal mice.

“This research shows us what’s possible,” Wei Li, a senior author of the study, said in a statement. Li conducted the work with colleagues at the Chinese Academy of Sciences.

Portentous Technologies

Such manipulations, if ever done in humans, could have a profound impact on human society going down the generations.

Beyond the technical, legal, and ethical roadblocks that would prevent this type of research in people, experts pointed to another concern. If researchers created, say, a daughter from two mothers or two fathers, and if she were healthy and had children of her own, it is unknown what genetic ramifications might be passed on to the next generation.

These are extremely portentous technologies. But existing laws and regulations that govern the sector — which were created when our scientific prowess was less sophisticated — are quickly becoming inadequate to ensure that proper parameters are maintained to guide the development and direction of what I believe will become the most powerful technologies ever invented.

Relying on voluntary ethical guidelines created by scientists to maintain proper ethical and safety boundaries — pretty much the situation now beyond some public funding limitations — is not a policy. It is an abdication of public responsibility.


Look: When scientists split the atom, our leaders did not just sit around slack-jawed and let the sector develop as it would. They engaged. They created laws, regulations, and international protocols to govern our use of atomic energy to maximize the benefit and reduce the danger. Surely we should do no less with biotechnology, which will have far more profound and far-reaching impacts on human history.

Saturday 13 October 2018

Freedom of speech must include freedom to offend? Pros and Cons

Common descent v. Common design again.

From Ewert’s Dependency Graph Paper – A “Gut Punch” to Darwin’s Tree?

A “gut punch” to the Darwinian Tree of Life is the phrase tentatively applied by the Bradley Center’s Robert J. Marks to the new paper in BIO-Complexity by Winston Ewert, “The Dependency Graph of Life.” Dr. Marks kicks off a series of conversations with Dr. Ewert for ID the Future. I’m reminded again that Marks, among many other distinctions, was born to podcast. He’s really very good at it. His interview with Ewert is a winner, quite amusing and accessible, especially for such a potentially recondite subject. He really wants to help any listener understand what’s potentially “game-changing” about Ewert’s proposal. 


In the first episode they discuss the background of how Darwinian theory explains life’s nested hierarchy pattern, suggestive of the famed Tree. Conventional evolutionary thinking teaches that — despite many anomalies (e.g., echolocation popping up in both bats and dolphins) — common descent is the only explanation that accounts for what we see. Sure, it requires various ad hoc add-ons. But do you have anything better? Ewert may: not common descent but common design.


And what is a dependency graph? That’s to be the subject of their next conversation. I believe this is going to be “Dependency Graph Without Tears,” and not a moment too soon.
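To make the contrast concrete before that next conversation, here is a toy sketch (my own analogy, not Ewert's actual model, with invented module names) of why reused "modules" fit a dependency graph better than a strict tree. Each species is listed with the functional modules it draws on, the way software packages depend on shared libraries:

```python
# Toy illustration: species as nodes that "depend on" reusable modules.
# Module names here are invented for the example.
species_modules = {
    "bat":     {"mammal_core", "flight", "echolocation"},
    "dolphin": {"mammal_core", "aquatic", "echolocation"},
    "bird":    {"vertebrate_core", "flight"},
    "whale":   {"mammal_core", "aquatic"},
}

def shared_modules(catalog):
    """Return the modules that appear in more than one species."""
    counts = {}
    for modules in catalog.values():
        for m in modules:
            counts[m] = counts.get(m, 0) + 1
    return {m for m, n in counts.items() if n > 1}

# On a strict tree, each module should mark out a single branch. Modules
# shared across distant branches -- echolocation in bats and dolphins,
# flight in bats and birds -- are the anomalies a dependency graph
# represents directly, as one module node with multiple dependents.
print(sorted(shared_modules(species_modules)))
# → ['aquatic', 'echolocation', 'flight', 'mammal_core']
```

The point of the sketch is only the shape of the data: a tree forces every shared module into a nested hierarchy, while a dependency graph lets one module serve several unrelated "consumers," which is how designed software is actually organized.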

Sunday 7 October 2018

I.D is already mainstream? III

Space Archaeology — How About Cellular Archaeology?

From Abraham Loeb, chairman of the Astronomy Department at Harvard University, founding director of Harvard’s Black Hole Initiative and director of the Institute for Theory and Computation at the Harvard-Smithsonian Center for Astrophysics, in Scientific American:

How to Search for Dead Cosmic Civilizations

If they’re short-lived, we might be able to detect the relics and artifacts they left behind

The possibility [is intriguing] that we will find technological relics flying through our solar system with no detectable functionality, such as pieces of equipment that lost power over the millions of years of their travel and have turned into space junk.

How much debris exists in interstellar space would depend on the abundance of technological civilizations and the scope of their aspirations for space exploration… there might be plenty of relics out there in the Milky Way for us to explore.

Wow. Arthur C. Clarke has a great novel — Rendezvous with Rama — about a similar scenario. Loved it.

This opportunity establishes a potential foundation for a new frontier of space archaeology, namely the study of relics from past civilizations in space. Instead of using shovels to dig into the ground, this new frontier will be explored by using telescopes to survey the sky and dig into space.

In Search of Intelligent Design

Space archaeology — a fascinating and important approach to space exploration. It’s a careful analysis of objects in search of evidence of intelligent origin, and it need not be archaeological in the sense that the designing intelligence(s) may still be at work. Yet space archaeology is a great name for it. It may seem a bit like fiction, but there apparently is an actual potential artifact for study and a practical approach to actually doing space archaeology:

[I]nterestingly, the first artificial relic might have just been discovered over the past year when the Pan-STARRS sky survey identified the first interstellar object in the solar system, ‘Oumuamua. The abundance of interstellar asteroids with ‘Oumuamua’s kilometer-scale length was estimated a decade ago to be vanishingly small, making this discovery a complete surprise.
In addition, ‘Oumuamua is more elongated than any known asteroid in the solar system. But most intriguing is the fact that ‘Oumuamua deviated from the orbit one would have expected based on the sun’s gravitational field. Although such deviations could be associated with the rocket effect caused by outgassing as the sun heats water ice, there was no sign of any cometary tail behind ‘Oumuamua, and calculations imply, contrary to observations, that its spin period should have changed significantly under any cometary torque. Might ‘Oumuamua have an artificial engine? Even if it happens to be a piece of natural rock, as indicated by its lack of radio transmission, this rock appears to be very unusual on many counts.
The discovery of ‘Oumuamua should motivate us to keep searching for interstellar debris in the solar system. Interstellar objects may not be strictly one-time visitors. A small fraction of them may get trapped by the gravitational “fishing net” cast by the sun and Jupiter. Objects passing close enough to Jupiter could lose orbital energy through their gravitational interaction and stay bound to the solar system subsequently. Indeed, an asteroid occupying an orbit indicative of such origin, BZ509, was identified recently in a retrograde orbit around Jupiter.
It is impossible to use existing chemical rockets to chase down ‘Oumuamua because of its high speed, but one can contemplate missions to land on interstellar objects that are bound to the solar system. Although they represent a tiny minority of all the asteroids or comets in the solar system, their interstellar origin can be identified based on their unusual orbits around Jupiter or, in the case of comets, through their distinct (extrasolar) isotope abundance of oxygen, detectable by spectroscopic observations of their cometary tail.
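To get a feel for the scale of the anomaly Loeb describes, here is a back-of-the-envelope sketch of what "the orbit one would have expected based on the sun's gravitational field" means. The constants are standard values; the comparison figure for the measured non-gravitational acceleration is my rounded approximation of the published estimate near 1 AU:

```python
# Back-of-the-envelope: the Sun's inward gravitational acceleration at a
# given distance, the baseline against which 'Oumuamua's deviation was seen.
GM_SUN = 1.327e20   # standard gravitational parameter of the Sun, m^3/s^2
AU = 1.496e11       # astronomical unit, m

def solar_gravity(r_au):
    """Inward gravitational acceleration (m/s^2) at r_au astronomical units."""
    return GM_SUN / (r_au * AU) ** 2

g_1au = solar_gravity(1.0)  # ~5.9e-3 m/s^2 at Earth's distance
# The measured non-gravitational acceleration was roughly 5e-6 m/s^2 near
# 1 AU (my rounded figure) -- about 0.1% of solar gravity: small, yet large
# enough to show up in the astrometry and to demand an explanation such as
# outgassing or radiation pressure.
print(f"{g_1au:.2e}")
```

The falloff with distance squared is also why the anomalous acceleration was only measurable while the object was relatively close to the sun.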
A Fascinating Object
ʻOumuamua is a fascinating object, and certainly deserves further investigation. How could we discern design from non-design? It’s an issue central to archaeology, and obviously would be central to space archaeology. It would be great science to sort out criteria for detecting intelligent agency in an object in nature, especially in a situation in which we have no idea about the nature of the designer.
Finding evidence for space junk of artificial origin would provide an affirmative answer to the age-old question “Are we alone?” This would have a dramatic impact on our culture and add a new cosmic perspective to the significance of human activity. Finding a civilization dead due to war or climate change will hopefully convince us to get our act together and avoid a similar fate. But it would be even more remarkable if radar imaging or flyby photography near an interstellar relic within the solar system would show signs of an advanced technology that our civilization had not mastered as of yet.
“Advanced technology that our civilization had not mastered.” Like astonishingly intricate blueprints for replication, function, and maintenance, written in an elegant code akin to a language, with specificity, punctuation, and superimposed reading frames, running exquisite nanotechnology in trillions of individual units that work in delicate harmony and even, in some objects, give rise to self-awareness.
A Breathtaking Lack of Self-Awareness
If they found a tiny fraction of that evidence for design on ʻOumuamua, it would be the scientific discovery of the millennium. Yet we find design everywhere in living things, on an immense scale. There’s a breathtaking lack of self-awareness in the scientific community about intelligent design. Much of the most fascinating and cutting-edge science in many fields is design science, but ideological blinders prevent good scientists like Dr. Loeb from acknowledging that, like space archaeology, cellular archaeology is science at its best.