
Tuesday, 14 September 2021

Deism: a brief history.

Deism (/ˈdiːɪzəm/ DEE-iz-əm[1][2] or /ˈdeɪ.ɪzəm/ DAY-iz-əm; derived from Latin deus, meaning "god")[3] is the philosophical position and rationalistic theology[4] that rejects revelation as a source of divine knowledge, and asserts that empirical reason and observation of the natural world are exclusively logical, reliable, and sufficient to determine the existence of a Supreme Being as the creator of the universe.[3][4][5][6][7][8] Deism is also defined as the belief in the existence of God solely based on rational thought, without any reliance on revealed religions or religious authority.[3][4][5][6][7] Deism emphasizes the concept of natural theology, that is, God's existence is revealed through nature.[3][4][5][6][8]

Since the 17th century and during the Age of Enlightenment, especially in 18th-century England and France, various Western philosophers and theologians formulated a critical rejection of the religious texts belonging to the many institutionalized religions and began to appeal only to truths that they felt could be established by reason alone as the exclusive source of divine knowledge.[4][5][6][7] Such philosophers and theologians were called "Deists", and the philosophical/theological position that they advocated is called "Deism".[4][5][6][7] Deism as a distinct philosophical and intellectual movement declined towards the end of the 18th century.[4] Some of its tenets continued to live on as part of other intellectual movements, like Unitarianism, and it continues to have advocates today.[3]

Atheism: a brief history.

 Atheism, in the broadest sense, is an absence of belief in the existence of deities. Less broadly, atheism is a rejection of the belief that any deities exist. In an even narrower sense, atheism is specifically the position that there are no deities. Atheism is contrasted with theism, which in its most general form is the belief that at least one deity exists.


The etymological root for the word atheism originated before the 5th century BCE from the ancient Greek ἄθεος (atheos), meaning "without god(s)". In antiquity, it had multiple uses as a pejorative term applied to those thought to reject the gods worshiped by the larger society, those who were forsaken by the gods, or those who had no commitment to belief in the gods. The term denoted a social category created by orthodox religionists into which those who did not share their religious beliefs were placed. The actual term atheism emerged first in the 16th century. With the spread of freethought, skeptical inquiry, and subsequent increase in criticism of religion, application of the term narrowed in scope. The first individuals to identify themselves using the word atheist lived in the 18th century during the Age of Enlightenment. The French Revolution, noted for its "unprecedented atheism", witnessed the first significant political movement in history to advocate for the supremacy of human reason.

Arguments for atheism range from philosophical to social and historical approaches. Rationales for not believing in deities include the lack of evidence, the problem of evil, the argument from inconsistent revelations, the rejection of concepts that cannot be falsified, and the argument from nonbelief. Nonbelievers contend that atheism is a more parsimonious position than theism and that everyone is born without beliefs in deities; therefore, they argue that the burden of proof lies not on the atheist to disprove the existence of gods but on the theist to provide a rationale for theism. Although some atheists have adopted secular philosophies (e.g. secular humanism), there is no ideology or code of conduct to which all atheists adhere.

Since conceptions of atheism vary, accurate estimations of current numbers of atheists are difficult. According to global WIN/Gallup International studies, 13% of respondents were "convinced atheists" in 2012, 11% were "convinced atheists" in 2015, and in 2017, 9% were "convinced atheists". However, other researchers have advised caution with WIN/Gallup figures since other surveys which have used the same wording for decades and have a bigger sample size have consistently reached lower figures. An older survey by the British Broadcasting Corporation (BBC) in 2004 recorded atheists as comprising 8% of the world's population. Other older estimates have indicated that atheists comprise 2% of the world's population, while the irreligious add a further 12%. According to these polls, Europe and East Asia are the regions with the highest rates of atheism. In 2015, 61% of people in China reported that they were atheists. The figures for a 2010 Eurobarometer survey in the European Union (EU) reported that 20% of the EU population claimed not to believe in "any sort of spirit, God or life force", with France (40%) and Sweden (34%) representing the highest values.

Sikhism: a brief history.

Sikhism or Sikhi (Punjabi: ਸਿੱਖੀ Sikkhī, [ˈsɪkːʰiː], from ਸਿੱਖ, Sikh, 'disciple', 'seeker', or 'learner') is an Indian Dharmic religion that originated in the Punjab region of the Indian subcontinent around the end of the 15th century CE. Sikhism is one of the youngest of the major religions and the world's fifth-largest organized religion, with about 25–30 million Sikhs as of the early 21st century. However, according to rough estimates, there are around 120–150 million (12–15 crore) Sahajdhari or non-Khalsa Nanakpanthi Sikhs across the world who also believe in the 10 Sikh Gurus and the Guru Granth Sahib.


Sikhism developed from the spiritual teachings of Guru Nanak, the first Guru (1469–1539), and of the nine Sikh gurus who succeeded him. The tenth guru, Gobind Singh (1666–1708), named the Sikh scripture Guru Granth Sahib as his successor, bringing to a close the line of human gurus and establishing the scripture as the eleventh and last eternal living guru, a religious spiritual/life guide for Sikhs. Guru Nanak taught that living an "active, creative, and practical life" of "truthfulness, fidelity, self-control and purity" is above metaphysical truth, and that the ideal man "establishes union with God, knows His Will, and carries out that Will". Guru Hargobind, the sixth Sikh Guru (1606–1644), established the concept of mutual co-existence of the miri ('political'/'temporal') and piri ('spiritual') realms.

The Sikh scripture opens with the Mul Mantar (ਮੂਲ ਮੰਤਰ), the fundamental prayer about ik onkar (ੴ, 'One God'). The core beliefs of Sikhism, articulated in the Guru Granth Sahib, include faith and meditation on the name of the one creator; divine unity and equality of all humankind; engaging in seva ('selfless service'); striving for justice for the benefit and prosperity of all; and honest conduct and livelihood while living a householder's life. Following this standard, Sikhism rejects claims that any particular religious tradition has a monopoly on Absolute Truth.

Sikhism emphasizes simran (ਸਿਮਰਨ, meditation and remembrance of the teachings of Gurus), which can be expressed musically through kirtan, or internally through naam japna ('meditation on His name') as a means to feel God's presence. It teaches followers to transform the "Five Thieves" (i.e. lust, rage, greed, attachment, and ego).

The religion developed and evolved in times of religious persecution, gaining converts from both Hinduism and Islam. Mughal rulers of India tortured and executed two of the Sikh gurus—Guru Arjan (1563–1605) and Guru Tegh Bahadur (1621–1675)—after they refused to convert to Islam. The persecution of Sikhs triggered the founding of the Khalsa by Guru Gobind Singh in 1699 as an order to protect the freedom of conscience and religion, with members expressing the qualities of a Sant-Sipāhī ('saint-soldier').

The Nicaraguan revolution: a brief history.

 The Nicaraguan Revolution (Spanish: Revolución Nicaragüense or Revolución Popular Sandinista) encompassed the rising opposition to the Somoza dictatorship in the 1960s and 1970s, the campaign led by the Sandinista National Liberation Front (FSLN) to oust the dictatorship in 1978–79, the subsequent efforts of the FSLN to govern Nicaragua from 1979 to 1990, and the Contra War, which was waged between the FSLN-led government of Nicaragua and the United States-backed Contras from 1981–1990. The revolution marked a significant period in the history of Nicaragua and revealed the country as one of the major proxy war battlegrounds of the Cold War, attracting much international attention.


The initial overthrow of the Somoza regime in 1978–79 was a bloody affair, and the Contra War of the 1980s took the lives of tens of thousands of Nicaraguans and was the subject of fierce international debate. During the 1980s, both the FSLN (a leftist collection of political parties) and the Contras (a rightist collection of counter-revolutionary groups) received large amounts of aid from the Cold War superpowers (respectively, the Soviet Union and the United States).

The peace process started with the Sapoá Accords in 1988, and the Contra War ended after the signing of the Tela Accord in 1989 and the demobilization of the FSLN and Contra armies. A second election in 1990 resulted in a majority for anti-Sandinista parties, with the FSLN handing over power.

Monday, 13 September 2021

Modern-day animism?


The Miracle Worker: How Darwinism Dishonors the Enlightenment

Neil Thomas

Editor’s note: This article is excerpted from Taking Leave of Darwin: A Longtime Agnostic Discovers the Case for Design, by Neil Thomas, newly released by Discovery Institute Press.

Life on Earth has traditionally and for good reason been termed the mystery of mysteries, and there is much to ponder in the contention of 19th-century Harvard professor Louis Agassiz that life’s mysteries were no nearer to being solved after the publication of The Origin of Species than they had been before it. Needless to say, neither Darwin nor Wallace would have been minded to see things in that way, because both were responding to their own internalized challenges in the competition to find something no one else had been able to find in generations of evolutionary thinking: a cogent explanation for the diversification of life that made no appeal to a directly interventionist natural theology.1 Both men pressed on, acting as much in the spirit of conquistadors as discoverers. (Darwin famously rushed out the publication of Origin when he feared Wallace might pip him to the post.)

Their goal, albeit not of course explicitly acknowledged (in deference to that laudable Victorian code of gentlemanly reticence which has fallen into disuse in the last half century), was to become recognized as the Lyell of Biology. They thereby hoped to establish the prestige of a discipline which, they were determined, should slough off old-fashioned and discredited biblical notions and so place biology within the prestigious sphere of “pure” science. By analogy with Lyell’s geological work, which had rendered the biblical flood superfluous, the quest of Darwin and Wallace was to render the Christian God superfluous to the rolling out of the universe after the moment of Creation, or at least after the appearance of the first self-reproducing organism. But in order to advance their new paradigm, they were obliged to transfer agency to the process of natural selection which, unfortunately, contained within it an insurmountable problem, as Wallace later acknowledged.

Two Problems with Natural Selection

Actually there were two problems. The less fundamental but hardly trivial one was the lack of empirical evidence for the power of natural selection to generate new forms. Darwin appealed to the success of plant and animal breeders to fill the evidential gap, but even setting aside that artificial selection is purposive and therefore an odd stand-in for a mindless process, there is also what Bishop "Soapy Sam" Wilberforce underscored. Contrary to the common caricature of him as an unctuous and obscurantist buffoon, Wilberforce was an Oxford First in Mathematics with a keen interest in natural history and a good working knowledge of animal breeding methods. In his 1860 review of the Origin, he noted that domestic breeders never succeed in breeding a fundamentally new animal form, and what progress they do make always comes with trade-offs. "The bull-dog gains in strength and loses in swiftness; the greyhound gains in swiftness but loses in strength," wrote Wilberforce. "Even the English race-horse loses much which would enable it in the battle of life to compete with its rougher ancestor."2

The more fundamental problem is that the “agency” of natural selection invoked by Darwin and, less expansively, by Wallace,3 was held to operate unselectively, with no notion of purpose permitted to obtrude into the multiple revolutions of its biological lottery. It’s not unlike the bonkers situation of a car salesman marketing a car without an engine underneath its hood, the fellow assuring his customer that the car would nevertheless function perfectly well. 

Darwin wanted to have his cake and eat it too. Natural selection is a mindless process; Darwin was adamant about that. Yet he habitually repaired to purposive terminology in his descriptions of it, as when he limned natural selection’s construction of the eye: “We must suppose that there is a power always intently watching each slight accidental alteration of the transparent layers [of the eye]; and carefully selecting each alteration which, under varied circumstances, may in any way, or in any degree, tend to produce a distincter image.”4

One might legitimately ask how it is possible to “intently watch” and “carefully select” unintelligently. That is entirely discrepant with what Darwin elsewhere claimed for the process he invoked. The contradiction points to a more than trivial conceptual confusion, and I would surmise that the very phraseology Darwin uses reveals that he must have had some awareness of the illogicality of his own position, even if only at some barely conscious level of apprehension.

An Apartheid-Like Antipathy

In the course of my professional life in higher education, an area that proudly regards itself as inclusive, it has often struck me as disturbingly contrary to that ideal that some colleagues in science departments betray an almost apartheid-like antipathy to philosophers, with philosophy coming second only to theology in the demonological hierarchy of some of their number. My vague understanding of this antipathy has, however, become much clarified in the course of preparing this volume. For it is philosophers in particular who have typically been the ones responsible for calling out many “mad genius” ideas put forward by representatives of the scientific community — a task for which said philosophers have received few thanks, needless to say — and which doubtless explains something of the animus towards them.

The problem with “natural selection” for philosopher Antony Flew was that it no more resembles any kind of conscious selection procedure than “Bombay duck is a species of duck.”5 It has been described as a would-be materialistic although in reality miraculous explanation. As Le Fanu put it, “Darwin’s explanation was in its own way profoundly ‘metaphysical’ in that it attributed to natural selection powers that might reasonably be thought to be miraculous — that it should somehow fashion perfection from a blind, random process, and transform one class of animal into another.”6

Again, we come up against the difficulty of scientists having to impute creative powers to phenomena with no creative capacity, rather like the way Richard Dawkins anthropomorphizes genes as being “selfish” — apportioning perception and decision to inanimate entities quite incapable of any decision or action whatsoever, selfish or unselfish, as philosopher Mary Midgley and others have pointed out,7 some going so far as to accuse him of animism.

Rationalist Principles and the Darwinian Narrative

If the reigning materialist paradigm had even a tolerably convincing weight of evidence behind it, I would be the first to accept it. In fact, I would embrace it wholeheartedly and with a sense of relief, even closure, since it would provide an excellent fit with a prior educational formation which has habitually foregrounded rational, evidence-based criteria. However, it is those very rationalist principles which bid me reject the Darwinian narrative, in its original, neo-Darwinian, and extended manifestations. I find it the grandest historical irony that the most fervent defenders of Darwinism claim to be advancing the ideals of the European Enlightenment. My view is that they are in reality dishonoring the foundational principles of that admirable project by perpetuating a hypothesis without empirical foundation or even the slightest approximation to verisimilitude.

As philosopher Richard Spilsbury once noted, “The basic objection to neo-Darwinism is not that it is speculative, but that it confers miraculous powers on inappropriate agents. In essence, it is an attempt to supernaturalize nature, to endow unthinking processes with more-than-human powers.”8

The case might even be made that the Darwinian narrative can work only by implicitly disregarding the Enlightenment program through its appeal to ways of thought supposed to have died out countless centuries before Darwin was even born. By that I mean that to attribute creative potential to nature itself is a deeply archaic, animistic way of thinking which takes us back even to the paganism of the Homeric age.

In the imaginative works of those early eras, nature through its many deified incarnations is routinely credited with directive capability. Zeus, called the Thunderer by the poet Hesiod in his Theogony, was believed to be able, inter alia, to control the weather; Demeter, the fertility goddess, could exert an influence on the annual crop yield; Aeolus, Keeper of the Winds in the Odyssey, provides a gentle breeze to waft Odysseus back to Ithaca after his long travels. To the ancient Greeks and many peoples who preceded them, the gods were essentially personifications of different aspects of Nature itself. The pre-scientific mind imputed agency to Nature by way of the personification of Nature’s various aspects as individual divinities.

Darwin’s theory of natural selection, although it struck most at the time and even since as an intellectual innovation, appears in reality to be something of a throw-back to those earlier modes of thought. In what seems to be a confirmation of the “nothing new under the sun” adage, Darwin appears, wittingly or not, to have channeled the spirit of the older, polytheistic world by crediting Nature with an infinite number of transformative powers.

Notes

  1. For Wallace at least it would be going too far to say he was long intent on an anti-theistic model. In his 1856 article titled “On the Habits of the Orang-Utan of Borneo,” in Annals and Magazine of Natural History 2nd ser., vol. 17, no. 103, he is already hinting at the theistic direction he was contemplating and would eventually take. Historian Michael Flannery writes, “It is no exaggeration to see this 1856 essay, written in the wake of his Sarawak Law paper the year before and ahead of his famous Ternate letter, as an early creedal statement. It would mark the emergent tenets of his inchoate teleological worldview, which consisted of the following: a non-reductionist, holistic view of nature; an admission of inutility in the plant and animal kingdoms and this given as reasonable evidence of higher and even intelligent causation in nature; a special place for humankind in the appreciation of features beyond mere survival utility such as beauty of form, color, and majesty; and the allowance that all of this may be the intentional expression of a theistic presence or force.” Flannery, Nature’s Prophet: Alfred Russel Wallace and His Evolution from Natural Selection to Natural Theology (Tuscaloosa, AL: University of Alabama Press, 2018), 64.
  2. Samuel Wilberforce, “On the Origin of Species,” Quarterly Review (1860), 237–238.
  3. Regarding natural selection, historian Michael Flannery notes (in private correspondence) that a major point of difference that came to separate Wallace from Darwin was the question of artificial selection’s evidential import. In On the Origin of Species Darwin offered breeding examples as analogous to natural selection. Wallace, from his Ternate paper (1858) on, clearly distinguished between the two, and by implication limited the explanatory power of natural selection. We find this in his paper presented to the Anthropological Society of London in 1864, and it led to his open break with Darwin in 1869.
  4. Darwin, On the Origin of Species, 141–142.
  5. Antony Flew, Darwinian Evolution, 2nd ed. (London: Transaction Publishers, 1997), 25. Bombay duck is a gastronomic delicacy composed of dry, salted fish.
  6. Le Fanu, Why Us?, 107.
  7. On this point see now John Hands, Cosmosapiens: Human Evolution from the Origin of the Universe (London: Duckworth, 2015), 382.
  8. Richard Spilsbury, Providence Lost: A Critique of Darwinism (Oxford: Oxford University Press, 1974), 19.

On Darwinism's 'simple' beginning.


The First “Simple” Self-Replicator?

Although no one really claims that science can yet explain how the first living thing arose on Earth, the point of view of many scientists, including Darwin, has been: we can explain how the first simple living things evolved into advanced life forms, including humans, so explaining how life began is a relatively small problem that will surely be solved eventually. 

I do not believe scientists actually have any clue as to how the first living things evolved into intelligent, conscious human beings (see my video “Why Evolution Is Different”). However, to appreciate that the first step is not a small problem, you only have to realize that with all our advanced technology we are still not close to designing any type of self-replicating machine. That is still pure science fiction. So how could we imagine that such a machine could have arisen by pure chance?

A Self-Replicating Box

To understand why human-engineered self-replicating machines are so far beyond current human technology, let’s imagine trying to design something as “simple” as a self-replicating cardboard box. Let’s place an empty cardboard box (A) on the floor, and to the right of it let’s construct a box (B) with a box-building factory inside it. I’m not sure exactly what the new box would need to build an empty box, but I assume it would at least have to have some metal parts to cut and fold the cardboard and a motor with a battery to power these parts. In reality, to be really self-replicating like living things, it would have to go get its own cardboard, so maybe it would need wheels and an axe to cut down trees and a small sawmill to make cardboard out of wood. But let’s be generous and assume humans are still around to supply the cardboard. Well, of course box B is not a self-replicating machine, because it only produces an empty box A. 

So, to the right of this box, let’s build another box C which contains a fully automated factory that can produce box B’s. This is a much more complicated box, because this one must manufacture the metal parts for the machinery in box B and its motor and battery and assemble the parts into the factory inside B. In reality it needs to go mine some ore and smelt it to produce these metal parts, but again let’s be very generous and provide it all the metals and other raw materials it needs. 

But box C would still not be a self-replicating machine, because it only produces the much simpler box B. So back to work, now we need to build a box D to its right with a fully automated factory capable of building box C’s with their box B factories. Well, you get the idea, and one begins to wonder if it is even theoretically possible to build a truly self-replicating machine. When we add technology to such a machine to bring it closer to the goal of reproduction, we only move the goalposts, because now we have a more complicated machine to reproduce. Yet we see such machines all around us in the living world. 

Keep Adding Boxes

If we keep adding boxes to the right, each with a fully automated factory that can produce the box to its left, it seems to me that the boxes would grow exponentially in complexity. But maybe I am wrong. Maybe they could be designed to converge eventually to a self-replicating box Z, although I can’t imagine how.
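To make that exponential-growth intuition concrete, here is a minimal toy sketch in Python. It is only an illustration under an assumed cost model that is not in the article: suppose each new box must contain a complete factory for the box to its left, so its complexity is some constant multiple greater than 1 of the previous box's complexity, plus a fixed overhead. The specific numbers below are made up.

```python
def box_complexity(n_boxes: int, base: float = 1.0,
                   overhead_factor: float = 1.5, fixed_overhead: float = 1.0) -> list[float]:
    """Toy model (illustrative assumption, not from the article): each box's
    complexity is overhead_factor times the complexity of the box it must build,
    plus a fixed overhead for its own structure."""
    complexities = [base]  # box A: the empty cardboard box
    for _ in range(n_boxes - 1):
        complexities.append(overhead_factor * complexities[-1] + fixed_overhead)
    return complexities

if __name__ == "__main__":
    for label, value in zip("ABCDEFG", box_complexity(7)):
        print(f"box {label}: complexity ≈ {value:.1f}")
    # A self-replicating box Z would need c = 1.5 * c + 1, which has no positive
    # solution: under this toy recurrence the series grows geometrically and never
    # converges to a fixed point.
```

Whether real factories obey anything like this recurrence is, of course, exactly the open question the thought experiment raises.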

As I said in “A Summary of the Evidence for Intelligent Design,” if engineers someday do construct a box Z self-replicator, I’m sure it will not happen until long after I am gone, if it ever happens at all. And I’m sure it will not — to put it mildly — be simple. Nevertheless, I can confidently predict that if it ever happens, origin-of-life researchers will announce to the world that science has finally shown that life could have arisen through natural processes. In fact, it will have shown that it could have arisen only through design. 

Guy Fawkes: a brief history.

Guy Fawkes, also known as Guido Fawkes while fighting for the Spanish, was a member of a group of provincial English Catholics who was involved in the failed Gunpowder Plot of 1605. He was born and educated in York; his father died when Fawkes was eight years old, after which his mother married a recusant Catholic.


Fawkes converted to Catholicism and left for mainland Europe, where he fought for Catholic Spain in the Eighty Years' War against Protestant Dutch reformers in the Low Countries. He travelled to Spain to seek support for a Catholic rebellion in England without success. He later met Thomas Wintour, with whom he returned to England. Wintour introduced him to Robert Catesby, who planned to assassinate King James I and restore a Catholic monarch to the throne. The plotters leased an undercroft beneath the House of Lords; Fawkes was placed in charge of the gunpowder that they stockpiled there. The authorities were prompted by an anonymous letter to search Westminster Palace during the early hours of 5 November, and they found Fawkes guarding the explosives. He was questioned and tortured over the next few days and confessed to wanting to blow up the House of Lords.

Immediately before his execution on 31 January, Fawkes fell from the scaffold where he was to be hanged and broke his neck, thus avoiding the agony of being hanged, drawn and quartered. He became synonymous with the Gunpowder Plot, the failure of which has been commemorated in the UK as Guy Fawkes Night since 5 November 1605, when his effigy is traditionally burned on a bonfire, commonly accompanied by fireworks.

Nature's acrobats vs. Darwin.


Squirrel Acrobatics Amaze Scientists

Evolution News @DiscoveryCSC

The other day, walking down a tree-lined street, we were startled by a sudden crash and scramble at our feet. A squirrel had fallen from a tree! It quickly recovered and dashed away, back up the same tree trunk from which it had come. That never — or almost never — happens. Why not? Scientists at UC Berkeley have been investigating just that question.

The desire for a peanut is enough to make a squirrel give an Olympic performance. The scientists practically held up “9” and “10” score signs as they watched their guest squirrels run an obstacle course the researchers set up. The squirrels, which live in eucalyptus trees on the campus, even used creative moves when the course was made more difficult. This story shows that good-old empirical science still has the power to fascinate and build understanding, without a need for Darwinian storytelling.

How many movies show a protagonist running from danger, having to make a split-second decision whether to leap over a gap? The scene from Raiders of the Lost Ark showing Indiana Jones running from a rolling boulder, leaping over a chasm, and hanging by his fingernails comes to mind. For squirrels leaping from branch to branch in the trees, this is par for the course. Speaking of that, they also know parkour — the ability to bounce off a wall for extra oomph in a dangerous leap. Parkour is a popular urban sport highlighted in many YouTube videos, where a runner tries to leap from rooftop to rooftop through a rapid series of daring leaps without assistance. It is as dangerous as it looks, but some get really good at it, performing flips and twists in some of the riskier sequences. 

UC Berkeley biologists Nathaniel Hunt and Robert Full, with help from two colleagues in the psychology department, set up the outdoor experiments on campus. A quick video shows the setup and some of the squirrel performances in slow motion.

The results made the cover story in Science on August 6 with the title, “Acrobatic squirrels learn to leap and land on tree branches without falling.” It begins with alacrity:

Every day, there are acrobatic extravaganzas going on above our heads. Squirrels navigate remarkably complex and unpredictable environments as they leap from branch to branch, and mistakes can be fatal. These feats require a complex combination of evolved biomechanical adaptations and learned behaviors. Hunt et al. characterized the integration of these features in a series of experiments with free-living fox squirrels (see the Perspective by Adolph and Young). They found that the squirrels’ remarkable and consistent success was due to a combination of learned impulse generation when assessing the balance between distance and branch flexibility and the addition of innovative leaps and landings in the face of increasingly difficult challenges. [Emphasis added.]

Parkour Engineering Specifications

A novice is not going to nail a new parkour move on the first try. Squirrels, however, seem to be born with the ability to do it. A combination of traits is necessary: a flexible body, good senses, strength, agility, rapid decision-making ability, instinct, and the ability to learn. The UCB research team seems vague about the ratios of these specifications. Which traits are the most critical for success? Lack of any one of them could prove fatal. It looks like an irreducibly complex set of specs is required for parkour, no matter which animal does it. This is the case for monkeys, apes, lizards, cursorial birds, and the deer bouncing over the truck hood and trotting off.

In their Perspective article about the research, “Learning to Move in the Real World,” Adolph and Young bring up another specification: the ability to adjust quickly to internal changes. Animals gain weight; females become pregnant. The body is continuously changing mass during growth and development. This happens to human infants as well, as every parent knows:

During development, new affordances emerge as animals’ bodies, skills, and effective environments change. Human infants can grow up to 2 cm in a single day. One week, babies are crawlers; the next, they are walkers — yesterday, objects on the coffee table were out of sight and beyond reach; today, they are accessible. Thus, learning occurs in the context of development, and the flux of body growth and motor-skill acquisition ensures that infants do not learn fixed solutions. Indeed, static solutions would be maladaptive in a continually changing ecosystem. Instead, infants “learn to learn.” They learn to detect information for affordances at each moment to determine which actions are possible with their current body and skills in a given environment.

Think Fast!

In the experiments, the team gave the squirrels challenges that required calculating risks and rewards. To get to the nut, a squirrel had to negotiate a narrow flexible strip and then leap to a post. Squirrels instinctively knew whether the strip could support their weight and was stiff enough for the launch. If the strip was sufficiently stable, they would crawl out to the end and jump. If not, they would start their jump farther back with more energy. Sometimes they would “parkour” off the vertical wall to get to the post. If they overshot or undershot, they had a backup plan: they knew their claws could save them. They would grab the bar and flip over or under it and land on top like a star gymnast. Most of the time, they nailed the landing with all four feet fitting on the tight platform provided. The scientists were amazed at their quick calculations.

“They’re not always going to have their best performance — they just have to be good enough,” he said. “They have redundancy. So, if they miss, they don’t hit their center of mass right on the landing perch, they’re amazing at being able to grab onto it. They’ll swing underneath, they’ll swing over the top. They just don’t fall.”

Well, almost never. And that adds another specification: an excellent kinesthetic sense. The animal must know its body’s strengths and weaknesses, its position, and its current weight. If a mother squirrel is leaping while carrying a baby, it must calculate its resources without error each time.

An Engineering Perspective

The “acrobatic extravaganzas” going on all around us in the living world are easy to take for granted. An engineering perspective helps unpack the requirements. What must be true for this phenomenon to occur? Evolutionists defame the achievements of squirrels and humans with their dismissive statements that phenomena just evolved. The UCB scientists seem to recognize the requirements, but attribute them to evolution anyway:

Gap traversability depends on the complement of environmental properties with an animal’s locomotive capacities. The synergy between biomechanical energy management and learned information for launching and landing likely determines arboreal leaping and ultimately the path through the canopy. The role of fast and accurate leaping in driving the evolution of biomechanical capabilities, learning-based decision-making, and innovation promises to reveal the mechanisms and origins of arboreal agility.

In that last sentence, they admit they cannot account for the origin of these mechanisms. So how can they speak of “the evolution of biomechanical capabilities”? Their worldview forces them to imagine an unguided origin, because at some earlier time in their view animals did not have these biomechanical capabilities. How and when did they emerge? What good is one specification when the others are not yet present?

It’s the engineering perspective that intelligent design offers that brings light to these questions. A complex ability like leaping over a gap to a reward presupposes a set of specifications. Each spec is measurable: for a squirrel mass m, needing to clear a gap of distance d, launching from a platform with springiness x, an engineer can calculate the force needed and test it with robotics. More variables can be specified for in-flight correction (parkour moves) and claw strength for grasping and swinging. The specs are likely to get increasingly complicated when requirements for sensing and balance are considered. But this is science that is precise, measurable, and testable. It also has explanatory value: once the requirements are known, the set of causes necessary and sufficient to achieve them can be evaluated.
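As a rough illustration of how those specs could be quantified, here is a minimal back-of-the-envelope sketch in Python. It is not from the article or the Science paper: the 45-degree launch angle, the neglect of drag and branch compliance, and the example numbers are all simplifying assumptions.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def required_launch_speed(gap_m: float, angle_deg: float = 45.0) -> float:
    """Speed needed to clear a level gap d with a simple ballistic leap.
    Uses the range equation d = v^2 * sin(2*theta) / g, ignoring air drag
    and assuming takeoff and landing at the same height."""
    theta = math.radians(angle_deg)
    return math.sqrt(G * gap_m / math.sin(2.0 * theta))

def average_pushoff_force(mass_kg: float, speed_ms: float, pushoff_m: float) -> float:
    """Average force along the launch direction from the work-energy theorem:
    F * s = (1/2) * m * v^2. A compliant (springy) branch would absorb part of
    this work, raising the required force."""
    return mass_kg * speed_ms ** 2 / (2.0 * pushoff_m)

if __name__ == "__main__":
    m, d, s = 0.5, 1.0, 0.15  # illustrative values: 0.5 kg squirrel, 1 m gap, 15 cm push-off
    v = required_launch_speed(d)
    f = average_pushoff_force(m, v, s)
    print(f"launch speed ≈ {v:.2f} m/s, average push-off force ≈ {f:.1f} N")
```

Each additional requirement named above (in-flight correction, claw grip, sensing, balance) would add further terms to such a model, which is precisely the point about an interacting set of specifications.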

Lead author Nathaniel Hunt, who also works in the Department of Biomechanics at the University of Nebraska, should welcome mechanical engineers to his team. It’s not a long leap to a Department of Biomechanical Engineering. That, not evolution, is what would promise “to reveal the mechanisms and origins of arboreal agility.”

How is Jesus Christ the firstborn of creation?

Colossians 1:15 NASB: "He is the image of the invisible God, the firstborn of all creation."

Clearly this scripture is indicating that Christ is the first of Jehovah's creatures. No? Our dear friends in Christendom insist that we are misinterpreting the verse. Once again I am forced to point out their failure to be consistent with their interpretive logic.
If someone could point to a single scripture where the firstborn (prototokos) of a set is not a member of that set, I would at least have some grounds to reassess what seems like more special pleading from Christendom's apologists. But thus far, in my innumerable discussions on the subject, all I get is a lot of hand-waving and irrelevant references to scriptures like Psalm 89:27. David is of course a member of the set of anointed kings and in point of fact literally the first Judean king, i.e. the first of the line of kings leading to Jehovah's greatest messiah.
So whether the firstborn is first in rank or first in number, in the Holy Scriptures he is ALWAYS the first of the set. As tends to be the case, they generally have no problem applying sound interpretive logic when no cherished preconceptions are on the line, e.g. Revelation 1:5 NASB: "and from Jesus Christ, the faithful witness, the firstborn of the dead,..." Few would argue that this must mean that Jesus was never resurrected, even though Jesus is the one through whom Jehovah resurrects the rest of the dead (see 1 Corinthians 15:21). So once again our interpretive logic is consistent; that of our would-be instructors, not so much.