Search This Blog

Monday, 18 January 2016

Patients' rights tossed under the bus in the Lone Star State?

The Arrogance of "Doctor Knows Best"
Wesley J. Smith January 15, 2016 2:19 PM

The Texas Advance Directive Act (TADA) allows a hospital bioethics committee and doctors to veto wanted life-sustaining treatment if they believe the suffering thereby caused is unwarranted -- with the cost of care always in the unspoken background. It is a form of ad hoc health care rationing -- death panels, if you will -- that places the moral values and opinions of strangers over those of the patient and family.

Futile care theory would even allow strangers to veto the contents of a patient's written and expressly stated advance directive.

Texas Right to Life (among others) has been an adamant opponent of the law, attempting to get it repealed. This effort has been impeded repeatedly by the Texas Catholic Conference (see my article here), perhaps because the state's Catholic hospital association likes the law. Texas Alliance for Life (TAL) often carries the Catholic Conference's water on this matter, in agreement, ironically, with the utilitarian bioethics movement.

Why? It's a bit of a puzzlement. I don't doubt they think it is the right thing to do. But it should also be noted that hospitals benefit financially by refusing wanted but expensive treatment. Perhaps their social justice inclinations see limited resources as best spent on other patients.

In the wake of the Chris Dunn case, in which the patient -- conscious and aware -- clearly wanted life-sustaining treatment to continue, TAL defenders of futile care expose the "doctor knows best" arrogance of the futile care movement. From "Balancing the Rights of Patients and Doctors," in Public Discourse (my emphasis):

A person in possession of his mental faculties is not morally bound to choose treatments whose negative effects are disproportionate to any good that could come from them. By the law of transitivity, it would seem to follow that neither his doctor nor his surrogates are either. Some may say that patients are the only ones able to judge the proportionality of suffering due to life-sustaining treatments. In this case, those treatments decreased the ability of the patient to judge.

I have heard such excuses and rationalizations in futile care controversies again and again: The patient doesn't really know what is best; the family is acting on guilt; misplaced religious belief is forcing a wrong choice; they should leave such decisions to the "experts." Bah!

Besides, Catholic moral teaching -- at least, as I understand it -- allows the patient to decide when suffering being experienced supersedes the benefit being received. It does not give that decision to doctors or bioethicists. Thus, for example, St. John Paul II decided not to try to stay alive by any means necessary. He was not prevented from doing so by others as is done in futile care cases.

The article also exhibits some mendacity by omission when it discusses the refusal by other hospitals to take Dunn, while leaving out important facts:

It is telling that, even with the assistance of the hospital over several weeks to find another care provider, none would accept Chris's transfer, indicating that other doctors agreed with the attending physician's prognosis.

But patients caught up in futile care cases usually lose money for hospitals in our capitated funding system. Moreover, this whole Texas controversy began when Houston hospitals created a futile care policy and agreed to honor such determinations made by other institutions. Heads we win, tails you lose.

If continuing wanted treatment is the wrong thing to do, that should not be decided by a Star Chamber bioethics committee made up of colleagues who reflect corporate or institutional values, meeting in secret with no real transparency or accountability. Rather, if maintaining life when that is wanted is so egregious as to be inhumane, the controversy belongs in open court, with cross examination, an official record, and a right to appeal.


Bioethics committees have a very important role to play as mediating bodies in the event of treatment disputes. But they should never be empowered to become institutionally authorized, quasi-judicial death panels.

The Watchtower Society's commentary on Goodness.

GOODNESS:
The quality or state of being good; moral excellence; virtue. Goodness is solid through and through, with no badness or rottenness. It is a positive quality and expresses itself in the performance of good and beneficial acts toward others. The most common words for “good” in the Bible are the Hebrew tohv and the Greek a·ga·thosʹ; a·ga·thosʹ is usually used in a moral or religious sense.

Jehovah’s Goodness. Jehovah God is good in the absolute and consummate sense. The Scriptures say: “Good and upright is Jehovah” (Ps 25:8), and they exclaim: “O how great his goodness is!” (Zec 9:17) Jesus Christ, though he had this quality of moral excellence, would not accept “Good” as a title, saying to one who addressed him as “Good Teacher”: “Why do you call me good? Nobody is good, except one, God.” (Mr 10:17, 18) He thus recognized Jehovah as the ultimate standard of what is good.

When Moses asked to see His glory, Jehovah replied: “I myself shall cause all my goodness to pass before your face, and I will declare the name of Jehovah before you.” Jehovah screened Moses from looking upon his face, but as he passed by (evidently by means of his angelic representative [Ac 7:53]) he declared to Moses: “Jehovah, Jehovah, a God merciful and gracious, slow to anger and abundant in loving-kindness and truth, preserving loving-kindness for thousands, pardoning error and transgression and sin, but by no means will he give exemption from punishment.”—Ex 33:18, 19, 22; 34:6, 7.

Here goodness is seen to be a quality that involves mercy, loving-kindness, and truth but does not condone or cooperate in any way with badness. On this basis David could pray to Jehovah to forgive his sins ‘for the sake of Jehovah’s goodness.’ (Ps 25:7) Jehovah’s goodness, as well as his love, was involved in the giving of his Son as a sacrifice for sins. By this he provided a means for helping those who would want that which is truly good, and at the same time he condemned badness and laid the basis for fully satisfying justice and righteousness.—Ro 3:23-26.

A Fruit of the Spirit. Goodness is a fruit of God’s spirit and of the light from his Word of truth. (Ga 5:22; Eph 5:9) It is to be cultivated by the Christian. Obedience to Jehovah’s commands develops goodness; no man has goodness on his own merit. (Ro 7:18) The psalmist appeals to God as the Source of goodness: “Teach me goodness, sensibleness and knowledge themselves, for in your commandments I have exercised faith,” and, “You are good and are doing good. Teach me your regulations.”—Ps 119:66, 68.

Goodness Bestows Benefits. Goodness can also mean beneficence, the bestowing of beneficial things upon others. Jehovah desires to express goodness toward his people, as the apostle Paul prayed for the Christians in Thessalonica: “We always pray for you, that our God may count you worthy of his calling and perform completely all he pleases of goodness and the work of faith with power.” (2Th 1:11) Many are the examples of God’s abundant goodness to those who look to him. (1Ki 8:66; Ps 31:19; Isa 63:7; Jer 31:12, 14) Moreover, “Jehovah is good to all, and his mercies are over all his works.” (Ps 145:9) With a purpose he extends good to all, that his goodness may bring many to serve him and that they may thereby gain life. Likewise, any individual exercising goodness is a blessing to his associates.—Pr 11:10.

As servants of God and imitators of him, Christians are commanded to prove what is God’s good and perfect will for them (Ro 12:2); they are to cling to what is good (Ro 12:9), to do it (Ro 13:3), to work what is good (Ro 2:10), to follow after it (1Th 5:15), to be zealous for it (1Pe 3:13), to imitate what is good (3Jo 11), and to conquer evil with it (Ro 12:21). Their doing of good is to be especially extended to those related to them in the Christian faith; additionally, it is to be practiced toward all others.—Ga 6:10.


A Related Term. Similar to the Greek word for good (a·ga·thosʹ) is another word, ka·losʹ. The latter denotes that which is intrinsically good, beautiful, well adapted to its circumstances or ends (as fine ground, or soil; Mt 13:8, 23), and that which is of fine quality, including that which is ethically good, right, or honorable (as God’s name; Jas 2:7). It is closely related in meaning to good, but may be distinguished by being translated “fine,” “right,” “honest,” or “well.”—Mt 3:10; Jas 4:17; Heb 13:18; Ro 14:21.

Friday, 15 January 2016

On single neighbour nations.

On our neighbours' minds III

Animal Minds: In Search of the Minimal Self:

New Scientist suggested, as one of its big ideas for 2015, that the ability of humans to talk to animals would transform what it means to be human. Actually, it wouldn't. But the ability of animals to understand what humans are saying would transform what it means to be an animal.

In a 2009 issue of Nature, Johan J. Bolhuis and Clive D. L. Wynne asked a key question: Can evolution explain how minds work? They identified serious flaws in the studies of animal minds. One of them is interpreting animal behavior as if it were human behavior (anthropomorphism):

For instance, capuchin monkeys were thought to have a sense of fairness because they reject a slice of cucumber if they see another monkey in an adjacent cage, performing the same task, rewarded with a more-sought-after grape. Researchers interpreted a monkey's refusal to eat the cucumber as evidence of "inequity aversion" prompted by seeing another monkey being more generously rewarded. Yet, closer analysis has revealed that a monkey will still refuse cucumber when a grape is placed in a nearby empty cage. This suggests that the monkeys simply reject lesser rewards when better ones are available. Such findings have cast doubt on the straightforward application of Darwinism to cognition. Some have even called Darwin's idea of continuity of mind a mistake.

It is a mistake. Continuities can be merely apparent, not actual.

Consider, for example, the laptop computer vs. the typewriter. Both feature the QWERTYUIOP keyboard. That might suggest a physical continuity between the two machines. The story would run thus: Computer developers added more and more parts to the typewriter, and subtracted some, until they had transformed the typewriter into a laptop.

But of course, they didn't. They adapted a widely recognized keyboard layout to an entirely new type of machine. Continuities are created by history, not laws. If we don't know the history, we don't know whether a similarity reflects continuity or not.

Bolhuis and Wynne continue, "In other words, evolutionary convergence may be more important than common descent in accounting for similar cognitive outcomes in different animal groups."

Indeed. There is no specific type of brain uniquely associated with intelligent behavior in animals (other than humans). There is, however, convergence in intelligent behavior among vertebrates (crows) and invertebrates (octopuses).

Yet most invertebrate species do not stand out in intelligence. That fact should receive more attention than it does. The nature and origin of intelligence may be quite different from what researchers have supposed.

We have tentatively identified some patterns. Metabolism and anatomy may play a larger role than earlier suspected. For example, reptiles can show intelligent behavior when their metabolism permits, as can invertebrates with sophisticated appendages, such as octopuses and squid.

It is even worth asking whether individual animals demonstrate more intelligence if they live with humans. For one thing, they may live much longer and in more complex environments.

Some might protest that when humans eliminate the lethal razor of natural selection, "daily and hourly scrutinizing, throughout the world, every variation, even the slightest," we cause animals to become less intelligent.

But is intelligence highly selected in nature? As engineers know all too well, new solutions to any problem are accompanied by numerous failures. The "smart crow" and "smart primate" tests, for example, are devised by humans who systematically reward the animals for carefully designed feats of intelligence, but do not destroy them for failure. Blind nature rewards and penalizes more haphazardly than that.

Then there is the fact that intelligent animals often do not learn from each other. In some intelligent bird species, one bird can solve a problem but others do not learn the solution by copying that bird, even if it is obviously in their evolutionary interests to do so. Thus the species does not develop a body of knowledge. As each clever bird dies, all gains are wiped out. There is no vast history of solved problems, as there is in human civilization, for even the cleverest bird to build on.

Bolhuis and Wynne offer a sober prediction:

As long as researchers focus on identifying human-like behaviour in other animals, the job of classifying the cognition of different species will be forever tied up in thickets of arbitrary nomenclature that will not advance our understanding of the mechanisms of cognition. For comparative psychology to progress, we must study animal and human minds empirically, without naïve evolutionary presuppositions.

They're right, and here is a useful illustration of the problem: A recent article on the role of epigenetics in the mating chances of male fish refers to their social status. I questioned the use of the term "social status" in relation to the behavior of fish, and was promptly informed by a knowledgeable fish hobbyist that "All biologists understand what is meant by this."

If so, that's a problem. "Social status" is a term developed by human beings to describe a conscious experience among humans. But animals may not experience their "social status" in the same way we do. A bee may be fed "royal jelly," and become a queen -- but is she conscious of her status? Are the bees that tend her conscious of it? The insect mind may not even work in a way that enables such an understanding.

So where in this spectrum, ranging from merciful oblivion through acutely painful knowledge, do male fish fighting over mates fit? Do they experience the conflict as "selves"? We simply don't know, and that fact should inspire caution in our choice of terminology. Careless words can subvert careful questions.

Philosopher Vincent Torley, who wrote his thesis on animal mind, agrees respecting the bees, noting, "A neural representation of each individual's ranking within a group does not require its possessor to have the highly abstract notion of 'social status.' Indeed, a representation of a ranking would not require consciousness at all."

And to think that among human beings, a sense of social status is so finely honed that it can depend on concepts as abstract and immaterial as the numbers in a "Hollywood" or "power" zip code...

So What Sorts of Consciousness Might Animals Have?

Philosopher Thomas Nagel is famous for asking the question, "What Is It Like to Be a Bat?" (1974). He meant that "an organism has conscious mental states if and only if there is something that it is like to be that organism." If so, the bat experiences events, as opposed to merely being one of them.

Is the bat a "self"? A "self" is more than the mere drive to continue existing that distinguishes all life from non-life. Self must also be more than sentience (an earthworm's reaction to light, for example, need not be conscious). It implies the existence of not-self in a complex environment. It does not, however, imply immortality or a capacity for abstract thought.

Perhaps the simplest way of putting it would be that a dog not only wants something, but he knows what he wants and whether he has gotten it -- and may learn various skills along the way for getting it again, and intentionally remember them. We could call this intentionality.

Vincent Torley's thesis is titled, "The Anatomy of a Minimal Mind." I prefer to use the term "minimal self" for individual animal intelligence. As a layperson, I find it easier to understand; it does not raise so many complex questions as "mind."

For example, Middle Dog resents his position in a household because he wants to be Top Dog. I find his canine mind generally opaque. However, I can see that he consciously experiences his resentment, even if it might lack reason, moral sentiment, or empathy. And Middle Dog will know if he succeeds in his quest or not. (So, probably, will everyone else.)

Some, like philosopher Edward Feser, argue that animal minds cannot form concepts, whereas others claim that chimpanzees are entering the Stone Age.

Torley takes a middle view: Animals can, it appears, form concepts, in the sense of "same vs. different" or "more vs. less." But in the absence of language, they typically cannot process abstractions. Nor do intelligent animals create symbols, understand abstract rules, or probe beneath mere perceptions, all of which are everyday matters for humans.

They do not, for example, survey their own mental states ("Why do I think I should bark at the moon?"). Yet humans of average intelligence may often ask themselves, "Why am I doing this anyway?"

As Torley says, "A defender of animal rationality could still argue that non-human animals might still possess a very simple, primitive concept of 'self,' which is 'built into' their psyches":

I have argued that the key reason why we can reasonably impute mental states to these creatures, and describe them as having minimal minds, is that both their internal representations of the outside world (minimal maps) and their patterns of bodily movement robustly instantiate a key feature that was formerly thought to be the hallmark of mental states: intrinsic intentionality.

No "Tree of Intelligence" Pattern

Naturalism, as a philosophical commitment, requires us to start with the assumption that the human mind is merely the outcome of a long, slow, random process, winding through various forms of animal mind. This suggests we can learn a great deal about the human mind by studying animal minds.

The empirical evidence does not really support that view. Not only is the human mind more powerful by orders of magnitude, but animal minds show no consistent tree of intelligence pattern in their development that would clearly support the naturalist interpretation.


We do not yet have a theory that sheds light on why some animal species appear much more intelligent than others, leaping past conventional taxonomic classifications. But seeing past Darwin to the question of how information really originates may help us acquire one.

Darwinism Vs. the real world XXIV

The Immune System: An Army Inside You:
Howard Glicksman January 14, 2016 6:20 PM 

Editor's note: Physicians have a special place among the thinkers who have elaborated the argument for intelligent design. Perhaps that's because, more than evolutionary biologists, they are familiar with the challenges of maintaining a functioning complex system, the human body. With that in mind, Evolution News is delighted to offer this series, "The Designed Body." For the complete series, see here. Dr. Glicksman practices palliative medicine for a hospice organization.


The body is made up of matter organized into trillions of cells that make up its tissues and organs. Since all matter must follow the laws of nature, this means that the body must do the same. In earlier articles I have shown that the body must overcome the laws of nature to survive. The sodium-potassium pumps, for example, are needed to allow each cell to control its volume and chemical content by resisting diffusion and osmosis. There must be enough albumin in the blood to resist the natural force of hydrostatic pressure and maintain blood volume and flow to the tissues. The sympathetic nervous system must increase the cardiac output and peripheral vascular resistance to elevate the blood pressure sufficiently to counteract gravity when we stand up.

With the emergence of life, not only did the cellular and organic make-up of the body require specific innovations to overcome the laws of nature and survive, it also had to learn how to deal with what it encountered in its environment. Life does not take place in a vacuum or in the imaginations of evolutionary biologists. Hemostasis and the clots it forms allow the body to prevent itself from bleeding to death when it is bumped, scraped, or cut. And the bones, muscles, and nerves work together so the body can detect danger and avoid or defend itself from it.

In my last article I showed that the body is always being exposed to microorganisms, such as bacteria, viruses, and fungi, which are present in nature but are too small to see with the naked eye. If these microbes invade the body and become widespread, they can cause a lot of damage. We saw that the first line of defense against infection is the skin and the epithelial tissues that line the respiratory, gastrointestinal, and genitourinary tracts.

If microbes breach these passive barriers and enter the tissues, the second line of defense swings into action. This is called the immune system, and it consists of numerous different cells and proteins that work together to fight and usually defeat the invading force. For our earliest ancestors to survive long enough to reproduce, they would have needed this two-pronged defense. Neither the passive barriers that protect the underlying tissues, nor the immune system, is capable, on its own, of protecting the body from life-threatening infection. They both have to be present and in working order.

In ancient times, when invaders penetrated the surrounding protective wall of a town, the defenders generally had four important tasks to perform very quickly. The first was to detect and positively identify the enemy. The second was to sound the alarm so others could help join in the defense. The third was to provide information on the enemy to those in reserve. And the fourth was to repel, wound, or kill the intruders to protect the residents. Similarly, once microbes get past the epithelium and penetrate into the tissues below, the body's immune defense must have the ability to perform these same four important tasks as well.

The first requires that the cells and proteins of the immune system have a way of detecting the presence of the microbes and be able to identify them as an invading force that needs to be destroyed. In other words, are these cells host cells (self), or foreign cells (not self)? The job of the immune system is to kill invading microorganisms, so it had better be sure that what it's encountering is indeed foreign and in need of destruction, otherwise it may end up killing its own cells by friendly fire. As with hemostasis, it's important that the immune system only turn on when it's needed and turn off and stay off when it's not.

After determining that there is a microbial invasion going on, the second task of the immune system is to send out messages so that it can bring other forces to the field. This involves releasing chemicals that not only increase the blood flow to the site of infection, allowing immune cells and proteins to leak out of the blood through the capillaries, but attract them to the battlefield as well. This causes the area around the infection to swell up and become red -- what we call inflammation.

In addition to rallying the troops, the third task of the immune system is to provide information about the whereabouts and nature of the enemy to those in reserve. This is accomplished by some of the first responder immune cells snipping off pieces of the dead microbes and sending them to the forces in reserve so that they can better prepare for what's awaiting them.

Finally, once the weapons of the immune system have been brought to the site of infection, it's up to them to wound or kill the invading force to prevent the infection from spreading further. The immune cells and proteins involved have many different weapons at their disposal to accomplish this task.

As with most military operations, the body's immune system has regular and specialized forces. The regular forces make up what is called the innate (natural) immune system. It's the microbial defense system with which everyone is born and it is the first to encounter the enemy, reacting within minutes. But this system, on its own, is usually not able to protect the body from overwhelming infection. Many pathogenic microorganisms have the ability to remain invisible and resistant to its strategies, allowing them to proliferate and spread throughout the body.

The specialized forces are usually needed to bolster and improve the effects of the innate immune system. Together, they are called the adaptive (acquired) immune system. This system usually requires a few days to adjust to the idiosyncrasies of the invading microbes. But when it swings into action, it provides extra intelligence, firepower, and precision accuracy that usually allow it and the innate immune system to get the job done. In contrast to the innate immune system, which is present at birth, the adaptive immune system develops over time as the body is exposed to more and more different microbes in its environment.

Now that you have a general idea of how the immune system works, we will press on. Next time, we'll look at the first responders of the innate immune system and how they do their jobs. Comparing how our immune system works to a military exercise in which an enemy must be tracked down, identified, and destroyed is a very accurate analogy.


Evolutionary biologists usually point to the ability of microorganisms to develop resistance to the body's immune system and medical therapies through genetic modification as evidence that life came about by chance and the laws of nature alone. However, this assumes the presence of the hardware needed not only to survive, but also to reproduce. Once you have the system in place, it's obvious that life can change over time, which is all that the word evolution denotes. However, the ability of life to change over time doesn't necessarily mean, as evolutionary biologists suggest, that it came about by chance and the forces of nature alone. One need only consider what it takes for the body to stay alive to recognize that important truth.



Yet more on the climate change debate: Pros and Cons.

Pro & Con Arguments: "Is Human Activity Primarily Responsible for Global Climate Change?"

PRO Human Causation

Overwhelming scientific consensus says human activity is primarily responsible for global climate change. The 2010 Anderegg study found that 97-98% of climate researchers publishing most actively in their field agree that human activity is primarily responsible for global climate change. The study also found that the expertise of researchers unconvinced of human-caused climate change is "substantially below" that of researchers who agree that human activity is primarily responsible for climate change. [7] The 2013 Cook review of 11,944 peer-reviewed studies on climate change found that only 78 studies (0.7%) explicitly rejected the position that humans are responsible for global warming. [1] A separate review of 13,950 peer-reviewed studies on climate change found only 24 that rejected human-caused global warming. [5] A survey by the German scientists Bray and von Storch found that 83.5% of climate scientists believe human activity is causing "most of recent" global climate change. [172] A separate survey in 2011 found that 84% of earth, space, atmospheric, oceanic, and hydrological scientists surveyed said that human-induced global warming is occurring. [6]


Rising levels of human-produced gases released into the atmosphere create a greenhouse effect that traps heat and causes global warming. As sunlight hits the earth, some of the warmth is absorbed by greenhouse gases in the atmosphere such as carbon dioxide (CO2), methane (CH4), and nitrous oxide (N2O). These gases trap heat and cause the planet to warm through a process called the greenhouse effect. [8] Since 1751 about 337 billion metric tons of CO2 have been released into the atmosphere from the burning of fossil fuels and cement production, [9] increasing atmospheric CO2 from the pre-industrial level of about 280 ppm (parts per million), to a high of 400 ppm in 2013. [10] Methane, which is increasing in the atmosphere due to agriculture and fossil fuel production, traps 84 times as much heat as CO2 for the first 20 years it is in the atmosphere, [11] and is responsible for about one-fifth of global warming since 1750. [12] Nitrous oxide, primarily released through agricultural practices, traps 300 times as much heat as CO2. [13] Over the 20th century, as the concentrations of CO2, CH4, and N2O increased in the atmosphere, [13][14] the earth warmed by approximately 1.4°F. [99]


The rise in atmospheric CO2 over the last century was clearly caused by human activity, as it occurred at a rate much faster than natural climate changes could produce. Over the past 650,000 years, atmospheric CO2 levels did not rise above 300 ppm until the mid-20th century. [100] Atmospheric levels of CO2 have risen from about 317 ppm in 1958 to 400 ppm in 2013. [10] CO2 levels are estimated to reach 450 ppm by the year 2040. [15] According to the Scripps Institution of Oceanography, the "extreme speed at which carbon dioxide concentrations are increasing is unprecedented. An increase of 10 parts per million might have needed 1,000 years or more to come to pass during ancient climate change events." [17] Some climate models predict that by the end of the 21st century an additional 5°F-10°F of warming will occur. [16]
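As a back-of-the-envelope check on the arithmetic behind those figures (my own illustrative sketch, not part of the source), the two endpoints cited above -- roughly 317 ppm in 1958 and 400 ppm in 2013 -- imply an average rise of about 1.5 ppm per year, and a simple constant-rate extrapolation puts 450 ppm in the mid-2040s; the shorter timeline cited (2040) reflects the fact that the growth rate itself has been accelerating.

```python
# Illustrative back-of-the-envelope check using only the two CO2 values
# cited above (assumed figures: ~317 ppm in 1958, ~400 ppm in 2013).
start_year, start_ppm = 1958, 317.0
end_year, end_ppm = 2013, 400.0

# Average linear growth rate over the cited interval (ppm per year).
rate = (end_ppm - start_ppm) / (end_year - start_year)
print(f"Average rise: {rate:.2f} ppm per year")  # about 1.5 ppm/year

# Year at which 450 ppm would be reached if that average rate simply continued.
target_ppm = 450.0
year_450 = end_year + (target_ppm - end_ppm) / rate
print(f"450 ppm reached around {year_450:.0f} at a constant rate")  # about 2046
```

This is arithmetic only, not a climate model; actual projections such as the 450-ppm-by-2040 estimate cited above come from emission scenarios, not linear extrapolation.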


The specific type of CO2 that is increasing in earth's atmosphere can be directly connected to human activity. CO2 produced by burning fossil fuels such as oil and coal [18] can be differentiated in the atmosphere from natural CO2 due to its specific isotopic ratio. [101] According to the Intergovernmental Panel on Climate Change (IPCC), 20th century measurements of CO2 isotope ratios in the atmosphere confirm that rising CO2 levels are the result of human activity, not natural processes such as ocean outgassing, volcanic activity, or release from other "carbon sinks." [102] US greenhouse gas emissions from human activities in 2012 totaled 6.5 billion metric tons, [19] which is equivalent to about 78.3 billion shipping containers filled with greenhouse gases. [20]


Average temperatures on earth have increased at a rate far faster than can be explained by natural climate changes. A 2008 study compared data from tree rings, ice cores, and corals over the past millennium with recent temperature records. The study created the famous "hockey stick" graph, showing that the rise in earth's temperature over the preceding decade had occurred at a rate faster than any warming period over the last 1,700 years. [23] In 2012 scientists with the Berkeley Earth project found that the average temperature of the earth’s land increased 2.5°F over 250 years (1750-2000), with 1.5°F of that increase in the last 50 years. [21] Lead researcher Richard A. Muller, PhD, said "it appears likely that essentially all of this increase [in temperature] results from the human emission of greenhouse gases." [22] In 2013, a surface temperature study published in Science found that global warming over the past 100 years has proceeded at a rate faster than at any time in the past 11,300 years. [3] According to the IPCC’s 2014 Synthesis Report, human actions are "extremely likely" (95-100% confidence) to have been the main cause of 20th century global warming, and the surface temperature warming since the 1950s is "unprecedented over decades to millennia." [24]


Natural changes in the sun's activity cannot explain 20th century global warming. According to a Dec. 2013 study in Nature Geoscience, the sun has had only a "minor effect" on the Northern Hemisphere climate over the past 1,000 years, and global warming from human-produced greenhouse gases has been the primary cause of climate change since 1900. [26] Another 2013 study found that solar activity could not have contributed to more than 10% of the observed global warming over the 20th century. [27] Measurements in the upper atmosphere from 1979-2009 show the sun's energy has gone up and down in cycles, with no net increase. [28] According to a 2013 IPCC report, there is "high confidence" (8 out of 10 chance) that changes in the sun's radiation could not have caused the increase in the earth's surface temperature from 1986-2008. [29] Although warming is occurring in the lower atmosphere (troposphere), the upper atmosphere (stratosphere) is actually cooling. If the sun were driving global warming, there would be warming in the stratosphere also, not cooling. [103] 


Global warming caused by human-produced greenhouse gases is causing the Arctic ice cap to melt at an increasing rate. From 1953–2006, Arctic sea ice declined 7.8% per decade. Between 1979 and 2006, the decline was 9.1% each decade. [105] As of 2014, Arctic sea ice was being lost at a rate of 13.3% per decade. [163] As the Arctic ice cover continues to decrease, the amount of the sun’s heat reflected by the ice back into space also decreases. This positive-feedback loop amplifies global warming at a rate even faster than previous climate models had predicted. [30] Some studies predict the Arctic could become nearly ice free sometime between 2020-2060. [164]
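To make the cited per-decade loss rates concrete, here is a small compounding calculation (my own illustrative sketch, not drawn from the source): applying the most recent cited rate of 13.3% per decade as if it were constant shows how quickly repeated decadal declines erode the remaining ice cover.

```python
# Illustrative compounding of the cited 13.3%-per-decade Arctic sea-ice decline.
# Assumes, purely for arithmetic, that the rate stays constant; the positive
# feedback described above is one reason observed losses can outpace this.
decline_per_decade = 0.133
remaining = 1.0  # fraction of the starting sea-ice extent

for decade in range(1, 6):
    remaining *= 1.0 - decline_per_decade
    print(f"After {decade} decade(s): {remaining:.1%} of the original extent remains")
# After five decades, roughly 49% of the original extent would remain
# even at a constant rate.
```

The point is only to show what a per-decade percentage implies when compounded; it is not a projection of when the Arctic would become ice free.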


Sea levels are rising at an unprecedented rate due to global warming. As human-produced greenhouse gases warm the planet, sea levels are rising due to thermal expansion of warming ocean waters as well as melt water from receding glaciers and the polar ice cap. [165] According to the IPCC, there has been a "substantial" human contribution to the global mean sea-level rise since the 1970s, and there is "high confidence" (8 out of 10 chance) that the rate of sea-level rise over the last half century has accelerated faster than it has over the previous 2,000 years. [29] A 2006 study found that "significant acceleration" of sea-level rise occurred from 1870 to 2004. [106] Between 1961 and 2003 global sea levels rose 8 inches. [102] An Oct. 2014 study published in the Proceedings of the National Academy of Sciences concluded that the rate of sea level rise over the past century is unprecedented over the last 6,000 years. [32] [33] A separate Oct. 2014 study said that the global sea level is likely to rise 31 inches by 2100, with a worst case scenario rise of 6 feet. [34] Climate Central predicts that 147 to 216 million people live in areas that will be below sea level or regular flood areas by the end of the century if human-produced greenhouse gas emissions continue at their current rate. [35]


Ocean acidity levels are increasing at an unprecedented rate that can only be explained by human activity. As excess human-produced CO2 in the atmosphere is absorbed by the oceans, the acidity level of the water increases. Acidity levels in the oceans are 25-30% higher than prior to human fossil fuel use. [107] According to a 2014 US Government Accountability Office (GAO) report, oceans have absorbed about 30% of the CO2 emitted by humans over the past 200 years, and ocean acidity could rise approximately 100-200 percent above preindustrial levels by 2100. [36] According to a 2013 report from the World Meteorological Organization, the current acceleration in the rate of ocean acidification "appears unprecedented" over the last 300 million years. [37] High ocean acidity levels threaten marine species [16] and slow the growth of coral reefs. [38] According to a 2014 report by the Convention on Biological Diversity, "it is now nearly inevitable" that within 50-100 years continued human produced CO2 emissions will increase ocean acidity to levels that "will have widespread impacts, mostly deleterious, on marine organisms and ecosystems." [39]


Ocean temperatures are rising at an unprecedented rate due to global warming, and are causing additional climate changes. The IPCC stated in a 2013 report that due to human-caused global warming, it is "virtually certain" (99-100% probability) that the upper ocean warmed between 1971 and 2010. [29] An Oct. 2014 Nature Climate Change study said that the oceans are the "dominant reservoir of heat uptake in the climate system." [40] A separate Oct. 2014 study found that the oceans absorb more than 90% of the heat generated by human-caused global warming. [41] Since 1970 the upper ocean (above 700 meters) has been warming 24-55% faster than previous studies had predicted. [41] A May 2013 study published in Geophysical Research Letters found that between 1958-2009 the rates of warming in the lower ocean (below 700m) "appear to be unprecedented." [42] According to an Oct. 2013 study, the middle depths of the Pacific Ocean have warmed "15 times faster in the last 60 years than they did during apparent natural warming cycles in the previous 10,000." [43] Warmer ocean waters can harm coral reefs and impact many species including krill, which are vital to the marine food chain and which reproduce significantly less in warmer water. [166] Warming oceans also contribute to sea level rise due to thermal expansion, and warmer ocean waters can add to the intensity of storm systems. [167]


Glaciers are melting at unprecedented rates due to global warming, causing additional climate changes. About a quarter of the globe's glacial loss from 1851-2010, and approximately two thirds of glacial loss between 1991-2010, is attributable directly to global warming caused by human-produced greenhouse gases. [45] According to the National Snow and Ice Data Center, global warming from human-produced greenhouse gases is a primary cause of the "unprecedented" retreat of glaciers around the world since the early 20th century. [44] Since 1980 glaciers worldwide have lost nearly 40 feet (12 meters) in average thickness. [110] According to a 2013 IPCC report, "glaciers have continued to shrink almost worldwide" over the prior two decades, and there is "high confidence" (about an 8 out of 10 chance) that Northern Hemisphere spring snow continues to decrease. [29] If the glaciers forming the Greenland ice sheet were to melt entirely, global sea levels could increase by up to 20 feet. [168] Melting glaciers also change the climate of the surrounding region. With the loss of summer glacial melt water, the temperatures in rivers and lakes increase. According to the US Geological Survey, this disruption can include the "extinction of temperature sensitive aquatic species." [169]


Human-caused global warming is changing weather systems and making heat waves and droughts more intense and more frequent. The May 2014 National Climate Assessment report said human-caused climate changes, such as increased heat waves and drought, "are visible in every state." [16] A Sep. 2014 American Meteorological Society study found that human-caused climate change "greatly increased" (up to 10 times) the risk for extreme heat waves in 2013. [46] According to an Aug. 2012 study published in the Proceedings of the National Academy of Sciences, there is a "high degree of confidence" that the Texas and Oklahoma heat waves and drought of 2011, and heat waves and drought in Moscow in 2010, "were a consequence of global warming" and that "extreme anomalies" in weather are becoming more common as a direct consequence of human-caused climate change. [47] A 2015 study found that globally, 75% of extremely hot days are attributable to warming caused by human activity. [174]


Dramatic changes in precipitation, such as heavier storms and less snow, are another sign that humans are causing global climate change. As human-produced greenhouse gases heat the planet, increased humidity (water vapor in the atmosphere) results. Water vapor is itself a greenhouse gas. [112] In a process known as a positive feedback loop, more warming causes more humidity which causes even more warming. [113] Higher humidity levels also cause changes in precipitation. According to a 2013 report published in the Proceedings of the National Academy of Sciences, the recorded changes in precipitation over land and oceans "are unlikely to arise purely due to natural climate variability." [48] Higher temperatures from global warming are also causing some mountainous areas to receive rain rather than snow. According to researchers at the Scripps Institution of Oceanography, up to 60% of the changes in river flow, winter air temperature, and snow pack in the western United States (1950-1999) were human-induced. [111] Since 1991, heavy precipitation events have been 30% above the 1901-1960 average in the Northeast, Midwest, and upper Great Plains regions. [16] A 2015 study found that global warming caused by human actions has increased extreme precipitation events by 18% across the globe, and that if temperatures continue to rise an increase of 40% can be expected. [174]


Permafrost is melting at unprecedented rates due to global warming, causing further climate changes. According to a 2013 IPCC report there is "high confidence" (about an 8 out of 10 chance) that anthropogenic global warming is causing permafrost, a subsurface layer of frozen soil, to melt in high-latitude regions and in high-elevation regions. [49] As permafrost melts it releases methane, a greenhouse gas that absorbs 84 times more heat than CO2 for the first 20 years it is in the atmosphere, creating even more global warming in a positive feedback loop. [50][51] By the end of the 21st century, warming temperatures in the Arctic will cause a 30%-70% decline in permafrost. [52] According to a 2012 report, as human-caused global warming continues, Arctic air temperatures are expected to increase at twice the global rate, increasing the rate of permafrost melt, changing the local hydrology, and impacting critical habitat for native species and migratory birds. [53] According to the 2014 National Climate Assessment, some climate models suggest that near-surface permafrost will be "lost entirely" from large parts of Alaska by the end of the 21st century. [16]

CON Human Causation

More than one thousand scientists disagree that human activity is primarily responsible for global climate change. In 2010 Climate Depot released a report featuring more than 1,000 scientists, several of them former UN IPCC scientists, who disagreed that humans are primarily responsible for global climate change. [55] The Cook review [1] of 11,944 peer-reviewed studies found 66.4% of the studies had no stated position on anthropogenic global warming, and while 32.6% of the studies implied or stated that humans are contributing to climate change, only 65 papers (0.5%) explicitly stated "that humans are the primary cause of recent global warming." [54] A 2012 Purdue University survey found that 47% of climatologists challenge the idea that humans are primarily responsible for climate change and instead believe that climate change is caused by an equal combination of humans and the environment (37%), mostly by the environment (5%), or that there’s not enough information to say (5%). [173] In 2014 a group of 15 scientists dismissed the US National Climate Assessment as a "masterpiece of marketing," that was "grossly flawed," and called the NCA’s assertion of human-caused climate change "NOT true." [56]


Earth's climate has always warmed and cooled, and the 20th century rise in global temperature is within the bounds of natural temperature fluctuations over the past 3,000 years. Although the planet has warmed 1-1.4°F over the 20th century, it is within the +/- 5°F range of the past 3,000 years. [114] A 2003 study by researchers at the Harvard-Smithsonian Center for Astrophysics found that "many records reveal that the 20th century is probably not the warmest nor a uniquely extreme climatic period of the last millennium." [115] A 2005 study published in Nature found that "high temperatures - similar to those observed in the twentieth century before 1990 - occurred around AD 1000 to 1100" in the Northern Hemisphere. [116] A 2013 study published in Boreas found that summer temperatures during the Roman Empire and Medieval periods were "consistently higher" than temperatures during the 20th century. [59] According to a 2010 study in the Chinese Science Bulletin, the recent global warming period of the 20th century is the result of a natural 21-year temperature oscillation, and will give way to a "new cool period in the 2030s." [74]


Rising levels of atmospheric CO2 do not necessarily cause global warming, which contradicts the core thesis of human-caused climate change. Earth's climate record shows that warming has preceded, not followed, a rise in CO2. According to a 2003 study published in Science, measurements of ice core samples show that over the last four climatic cycles (past 240,000 years), periods of natural global warming preceded global increases in CO2. [117] In 2010 the Proceedings of the National Academy of Sciences published a study of the earth's climate 460-445 million years ago which found that an intense period of glaciation, not warming, occurred when CO2 levels were 5 times higher than they are today. [4] According to ecologist and former Director of Greenpeace International Patrick Moore, PhD, "there is some correlation, but little evidence, to support a direct causal relationship between CO2 and global temperature through the millennia." [60]


Human-produced CO2 is re-absorbed by oceans, forests, and other "carbon sinks," negating any climate changes. According to a 2011 study published in the Asia-Pacific Journal of Atmospheric Science, many climate models that predict additional global warming to occur from CO2 emissions "exaggerate positive feedbacks and even show positive feedbacks when actual feedbacks are negative." [75] About 50% of the CO2 released by the burning of fossil fuels and other human activities has already been re-absorbed by the earth’s carbon sinks. [118] From 2002-2011, 26% of human-caused CO2 emissions were absorbed specifically by the world’s oceans. [61] A 2010 study published in the Proceedings of the National Academy of Sciences found evidence that forests are increasing their growth rates in response to elevated levels of CO2, [62] which will in turn, lower atmospheric CO2 levels in a negative feedback. According to an Aug. 2012 study in Nature, the rate of global carbon uptake by the earth's carbon sinks, such as its forests and oceans, doubled from 1960-2010 and continues to increase. [64] 


CO2 is already saturated in earth’s atmosphere, and more CO2, manmade or natural, will have little impact on climate. As CO2 levels in the atmosphere rise, the amount of additional warming caused by the increased concentration becomes less and less pronounced. [65] According to Senate testimony by William Happer, PhD, Professor of Physics at Princeton University, "[a]dditional increments of CO2 will cause relatively less direct warming because we already have so much CO2 in the atmosphere that it has blocked most of the infrared radiation that it can. The technical jargon for this is that the CO2 absorption band is nearly 'saturated' at current CO2 levels." [66] According to the Heartland Institute's 2013 Nongovernmental International Panel on Climate Change (NIPCC) report, "it is likely rising atmospheric CO2 concentrations will have little impact on future climate." [67]


Global warming and cooling are primarily caused by fluctuations in the sun's heat (solar forcing), not by human activity. Over the past 10,000 years, solar minima (reduced sun spot activity) have been "accompanied by sharp climate changes." [68] Between 1900 and 2000 solar irradiance increased 0.19%, and correlated with the rise in US surface temperatures over the 20th century. [114] According to a 2007 study published in Energy & Environment, "variations in solar activity and not the burning of fossil fuels are the direct cause of the observed multiyear variations in climatic responses." [69] In a 2012 study by Willie Soon, PhD, Physicist at the Harvard-Smithsonian Center for Astrophysics, a strong correlation between solar radiation and temperatures in the Arctic over the past 130 years was identified. [70] According to a 2012 study published in the Journal of Atmospheric and Solar-Terrestrial Physics, "up to 70% of the observed post-1850 climate change and warming could be associated to multiple solar cycles." [71]


The rate of global warming has slowed over the last decade even though atmospheric CO2 continues to increase. The Intergovernmental Panel on Climate Change (IPCC) recognized a slowdown in global warming over the past 15 years in its 2013 report. [29] According to the Heartland Institute's 2013 NIPCC report, the earth "has not warmed significantly for the past 16 years despite an 8% increase in atmospheric CO2." [67] In Aug. 2014 a study in the Open Journal of Statistics analyzed surface temperature records and satellite measurements of the lower atmosphere and confirmed that this slowdown in global warming has occurred. [72] According to Emeritus Professor of Meteorology at the Massachusetts Institute of Technology Richard Lindzen, PhD, the IPCC's "excuse for the absence of warming over the past 17 years is that the heat is hiding in the deep ocean. However, this is simply an admission that the [climate] models fail to simulate the exchanges of heat between the surface layers and the deeper oceans..." [73] 


Predictions of accelerating human-caused climate change are based upon computerized climate models that are inadequate and incorrect. Climate models have been unable to simulate major known features of past climate such as the ice ages or the very warm climates of the Miocene, Eocene, and Cretaceous periods. If models cannot replicate past climate changes they should not be trusted to predict future climate changes. [58] A 2011 Asia-Pacific Journal of Atmospheric Science study using observational data rather than computer climate models concluded that "the models are exaggerating climate sensitivity" and overestimate how fast the earth will warm as CO2 levels increase. [75] Two other studies using observational data found that IPCC projections of future global warming are too high. [76] [97] In a 2014 article, climatologist and former NASA scientist Roy Spencer, PhD, concluded that 95% of climate models have "over-forecast the warming trend since 1979." [77] According to Emeritus Professor of Geography at the University of Winnipeg, Tim Ball, PhD, "IPCC computer climate models are the vehicles of deception… [T]hey create the results they are designed to produce." [78]


Sea levels have been steadily rising for thousands of years, and the increase has nothing to do with humans. A 2014 report by the Global Warming Policy Foundation found that a slow global sea level rise has been ongoing for the last 10,000 years. [79] When the earth began coming out of the Pleistocene Ice Age 18,000 years ago, sea levels were about 400 feet lower than they are today and have been steadily rising ever since. [60] According to Professor of Earth and Atmospheric Sciences at the Georgia Institute of Technology, Judith Curry, PhD, "it is clear that natural variability has dominated sea level rise during the 20th century, with changes in ocean heat content and changes in precipitation patterns." [80] Freeman Dyson, Emeritus Professor of Mathematical Physics and Astrophysics at the Institute for Advanced Study at Princeton University, has stated that there is "no evidence" that rising sea levels are due to anthropogenic climate change. [81]


The acidity levels of the oceans are within past natural levels, and the current rise in acidity is a natural fluctuation, not the result of human caused climate change. [120] The pH of average ocean surface water is 8.1 and has only decreased 0.1 since the beginning of the industrial revolution (neutral is pH 7, acid is below pH 7). [121] In 2010 Science published a study of ocean acidity levels over the past 15 million years, finding that the "samples record surface seawater pH values that are within the range observed in the oceans today." [82] Increased atmospheric CO2 absorbed by the oceans results in higher rates of photosynthesis and faster growth of ocean plants and phytoplankton, which increases pH levels keeping the water alkaline, not acidic. [60] According to a 2010 paper by the Science and Public Policy Institute, "our harmless emissions of trifling quantities of carbon dioxide cannot possibly acidify the oceans." [63]


Glaciers have been growing and receding for thousands of years due to natural causes, not human activity. The IPCC predicted that Himalayan glaciers would likely melt away by 2035, a prediction they disavowed in 2010. [83] In 2014 a study of 2,181 Himalayan glaciers from 2000-2011 showed that 86.6% of the glaciers were not receding. [84] According to a 2013 study of ice cores published in Nature Geoscience, the current melting of glaciers in Western Antarctica is due to "atmospheric circulation changes" that have "caused rapid warming over the West Antarctic Ice Sheet" and cannot be directly attributed to human caused climate change. [85] According to one of the study authors, "[i]f we could look back at this region of Antarctica in the 1940s and 1830s, we would find that the regional climate would look a lot like it does today, and I think we also would find the glaciers retreating much as they are today." [86] According to Christian Schlüchter, Professor of Geology at the University of Bern, 4,000 year old tree remains have been found beneath retreating glaciers in the Swiss Alps, indicating that they were previously glacier-free. According to Schlüchter, the current retreat of glaciers in the Alps began in the mid-19th century, before large amounts of human caused CO2 had entered the atmosphere. [87]


Deep ocean currents, not human activity, are a primary driver of natural climate warming and cooling cycles. Changes in ocean currents are primarily responsible for the melting Greenland ice sheet, Arctic sea ice, and Arctic permafrost. Over the 20th century there have been two Arctic warming periods with a cooling period (1940-1970) in between. According to a 2009 study in Geophysical Research Letters, natural shifts in the ocean currents are the major cause of these climate changes, not human-generated greenhouse gases. [124] According to William Gray, PhD, Emeritus Professor of Atmospheric Science at Colorado State University, most of the climate changes over the last century are natural and "due to multi-decadal and multi-century changes in deep global ocean currents." [122] Global cooling from 1940 to the 1970s, and warming from the 1970s to 2008, coincided with fluctuations in ocean currents and cloud cover driven by the Pacific Decadal Oscillation (PDO) - a naturally occurring rearrangement in atmospheric and oceanic circulation patterns. [123] According to a 2014 article by Don Easterbrook, PhD, Professor Emeritus of Geology at Western Washington University, the "PDO cool mode has replaced the warm mode in the Pacific Ocean, virtually assuring us of about 30 years of global cooling, perhaps much deeper than the global cooling from about 1945 to 1977." [88]



Increased hurricane activity and other extreme weather events are a result of natural weather patterns, not human-caused climate change. According to a 2013 report from the Tropical Meteorology Project at Colorado State University, the increase in human-produced CO2 over the past century has had "little or no significant effect" on global tropical cyclone activity. The report further states that specific hurricanes, including Sandy, Ivan, Katrina, Rita, Wilma, and Ike, were not a direct consequence of human-caused global warming. [89] Between 1995-2015 increased hurricane activity (including Katrina) was recorded, however, according to the NOAA, it was not the result of human-induced climate change; it was the result of cyclical tropical cyclone patterns, driven primarily by natural ocean currents. [125] Many types of recorded extreme weather events over the past half-century have actually become less frequent and less severe. [93] Professor of Earth and Atmospheric Sciences at the Georgia Institute of Technology, Judith Curry, PhD, states that she is "unconvinced by any of the arguments that I have seen that attributes a single extreme weather event, a cluster of extreme weather events, or statistics of extreme weather events" to human-caused climate change. [90] Richard Lindzen, PhD, Emeritus Professor of Meteorology at the Massachusetts Institute of Technology, also states that there is a lack of evidence connecting extreme weather events such as hurricanes, tornadoes, droughts, or floods, to human-caused global warming. [92]







On taking earth's temperature.

Thursday, 14 January 2016

Why debate Darwin?: Mr. Berlinski's two cents.

David Berlinski: Does Darwin Matter?
David Berlinski September 29, 2009 8:17 AM 

ENV: How do the scientific issues you write about affect the way we live? Why should the Darwin question matter to people who don't normally concern themselves with scientific theories?

DB: I think of the Darwinian debate in the way that Dickens thought of Jarndyce v Jarndyce in Bleak House. It is awfully easy to be sucked into it, and once suckered, awfully difficult to get out. I have seen it so often. A man wakes and, because he has read a book or scanned an essay, he is persuaded that he can make a contribution. He is eager to make it. He offers his opinion on the Internet and is gratified by the prospect of the congratulations that he is shortly to receive. No one pays the slightest attention. He then discovers that to be heard, it is necessary that he amplify his level of abuse. He does that, referring to the Discovery Institute as the Dishonesty Institute. Repeating the phrase as he moves his bowels affords him an unexpected pleasure. As his influence remains insignificant, his indignation mounts. In the morning, he scuttles to his computer to check his own postings; satisfied when he finds them, and beside himself when he fails. His appetite for conflict sharpens. He becomes determined to exaggerate every issue; and to magnify trivialities. Sooner or later, his Internet presence seems real, and his real life unreal. He ends in the state achieved by almost every Internet blogger: He commences to gibber repetitively. Glen Davidson, who posts to David Klinghoffer's blog, has recently entered the gibbering state.

It is all very sad. I have warned about the phenomenon many times.

Does Darwin matter? Yes, of course it matters. It matters a great deal. It matters whether the theory is true because for better or worse we value the truth and struggle to find it; but it would matter far more were we able to say once and for all that the theory is false. Darwinism involves a way of thought in biology, and were it to go, it would take a great many assumptions along with it. Just think of vitalism, for example. To say a word in its favor is at once to be accused of the cheapest kind of intellectual sentimentality. We know better and if we do not know better, they do. But hold on, please do. If by vitalism one means something like the 19th century idea of a vital fluid that informs living systems, then I am with them. That is so much sentimentality. But if by vitalism one means the thesis that living systems cannot be completely explained in terms of their physics or their chemistry -- what then? Something must explain the difference, no? And if it is not a fluid, as naïve 19th-century biologists sometimes thought, it does not follow that it is nothing.

Nothing in biology makes sense except in the light of evolution.

This remark is half right: Nothing in biology does make sense. It is for the biology of the future to start making sense of it. If that in the end involves religious ideas or even religious I, that's fine with me. Let's ask the questions first, and reject the wrong answers when we know that they are wrong.

Biology as tech / Biology as art II

Coming Next Month, Michael Denton and The Biology of the Baroque; See the Trailer Now!
David Klinghoffer January 13, 2016 2:36 PM 

My family and I were watching Ninotchka last night -- the sly story of a dour, emotionally repressed Soviet official who travels to Paris and reluctantly discovers the allure of beauty, luxury, and love. Greta Garbo plays the title character. It's full of great lines, and I was especially tickled by Ninotchka's pre-transformation dismissal of herself as, "Just what you see. A tiny cog in the great wheel of evolution."

The film made me think of our upcoming documentary, The Biology of the Baroque: The Mystery of Non-Adaptive Order, which you'll have the opportunity to enjoy when it premieres on YouTube next month. The stern, nearly robotic Ninotchka at first disclaims all interest in the lights of Paris or any of the city's other charms; she wants only to inspect its sewers and other infrastructure, "from a technical standpoint." In a very similar way, evolutionary thinking asks us to ignore life's superabundance of numinous order and baroque artistry.


The video is based on a novel and incisive argument from Discovery Institute biologist Michael Denton in his new book Evolution: Still a Theory in Crisis, to be published on January 26. You can see the trailer now:

Evolutionists have good reason for demanding that we avert our eyes from biology's delicate artfulness. None of that, after all, is explicable in light of the Darwinian theory that natural selection retains only what is useful from a "technical standpoint" of reproductive success. In the book and the video, directed by Center for Science & Culture associate director John West, Dr. Denton puts this quality of superfluous, luxurious "non-adaptive order" front and center.

Evolution: Still a Theory in Crisis follows in the tracks of Denton's groundbreaking work of thirty-plus years ago, Evolution: A Theory in Crisis. That earlier book inspired a rising generation of pioneers in the field of intelligent design, notably Michael Behe. The forthcoming book is no mere update, however -- it reveals powerful new evidence of design in nature and opens a fresh frontier for the science of ID.

Dr. Denton concedes that when he wrote his first book, he did not recognize the abundance of non-adaptive features in life -- a realization that he details with authority in the new book. Watch for The Biology of the Baroque in this space on February 12.

Wednesday, 13 January 2016

The minority report: Nothing in Darwinism makes sense apart from biology.

Why Do We Invoke Darwin?

By Philip Skell

Darwin's theory of evolution offers a sweeping explanation of the history of life, from the earliest microscopic organisms billions of years ago to all the plants and animals around us today. Much of the evidence that might have established the theory on an unshakable empirical foundation, however, remains lost in the distant past. For instance, Darwin hoped we would discover transitional precursors to the animal forms that appear abruptly in the Cambrian strata. Since then we have found many ancient fossils – even exquisitely preserved soft-bodied creatures – but none are credible ancestors to the Cambrian animals.

Despite this and other difficulties, the modern form of Darwin's theory has been raised to its present high status because it's said to be the cornerstone of modern experimental biology. But is that correct? "While the great majority of biologists would probably agree with Theodosius Dobzhansky's dictum that 'nothing in biology makes sense except in the light of evolution,' most can conduct their work quite happily without particular reference to evolutionary ideas," A.S. Wilkins, editor of the journal BioEssays, wrote in 2000. [1] "Evolution would appear to be the indispensable unifying idea and, at the same time, a highly superfluous one."

I would tend to agree. Certainly, my own research with antibiotics during World War II received no guidance from insights provided by Darwinian evolution. Nor did Alexander Fleming's discovery of bacterial inhibition by penicillin. I recently asked more than 70 eminent researchers if they would have done their work differently if they had thought Darwin's theory was wrong. The responses were all the same: No.

I also examined the outstanding biodiscoveries of the past century: the discovery of the double helix; the characterization of the ribosome; the mapping of genomes; research on medications and drug reactions; improvements in food production and sanitation; the development of new surgeries; and others. I even queried biologists working in areas where one would expect the Darwinian paradigm to have most benefited research, such as the emergence of resistance to antibiotics and pesticides. Here, as elsewhere, I found that Darwin's theory had provided no discernible guidance, but was brought in, after the breakthroughs, as an interesting narrative gloss.

In the peer-reviewed literature, the word "evolution" often occurs as a sort of coda to academic papers in experimental biology. Is the term integral or superfluous to the substance of these papers? To find out, I substituted for "evolution" some other word – "Buddhism," "Aztec cosmology," or even "creationism." I found that the substitution never touched the paper's core. This did not surprise me. From my conversations with leading researchers it had become clear that modern experimental biology gains its strength from the availability of new instruments and methodologies, not from an immersion in historical biology.

When I recently suggested this disconnect publicly, I was vigorously challenged. One person recalled my use of Wilkins and charged me with quote mining. The proof, supposedly, was in Wilkins's subsequent paragraph:

"Yet, the marginality of evolutionary biology may be changing. More and more issues in biology, from diverse questions about human nature to the vulnerability of ecosystems, are increasingly seen as reflecting evolutionary events. A spate of popular books on evolution testifies to the development. If we are to fully understand these matters, however, we need to understand the processes of evolution that, ultimately, underlie them."


In reality, however, this passage illustrates my point. The efforts mentioned there are not experimental biology; they are attempts to explain already authenticated phenomena in Darwinian terms, things like human nature. Further, Darwinian explanations for such things are often too supple: Natural selection makes humans self-centered and aggressive – except when it makes them altruistic and peaceable. Or natural selection produces virile men who eagerly spread their seed – except when it prefers men who are faithful protectors and providers. When an explanation is so supple that it can explain any behavior, it is difficult to test it experimentally, much less use it as a catalyst for scientific discovery.

Darwinian evolution – whatever its other virtues – does not provide a fruitful heuristic in experimental biology. This becomes especially clear when we compare it with a heuristic framework such as the atomic model, which opens up structural chemistry and leads to advances in the synthesis of a multitude of new molecules of practical benefit. None of this demonstrates that Darwinism is false. It does, however, mean that the claim that it is the cornerstone of modern experimental biology will be met with quiet skepticism from a growing number of scientists in fields where theories actually do serve as cornerstones for tangible breakthroughs.

Philip S. Skell is Emeritus Evan Pugh Professor at Pennsylvania State University, and a member of the National Academy of Sciences. His research has included work on reactive intermediates in chemistry, free-atom reactions, and reactions of free carbonium ions.

He can be contacted at tvk@psu.edu.