
Tuesday 14 September 2021

Deism: a brief history.

 Deism (/ˈdɪzəm/ DEE-iz-əm[1][2] or /ˈdeɪ.ɪzəm/ DAY-iz-əm; derived from Latin deus, meaning "god")[3] is the philosophical position and rationalistic theology[4] that rejects revelation as a source of divine knowledge, and asserts that empirical reason and observation of the natural world are exclusively logical, reliable, and sufficient to determine the existence of a Supreme Being as the creator of the universe.[3][4][5][6][7][8] Deism is also defined as the belief in the existence of God solely based on rational thought, without any reliance on revealed religions or religious authority.[3][4][5][6][7] Deism emphasizes the concept of natural theology, that is, God's existence is revealed through nature.[3][4][5][6][8]

Since the 17th century and during the Age of Enlightenment, especially in 18th-century England and France, various Western philosophers and theologians formulated a critical rejection of the religious texts belonging to the many institutionalized religions and began to appeal only to truths that they felt could be established by reason alone as the exclusive source of divine knowledge.[4][5][6][7] Such philosophers and theologians were called "Deists", and the philosophical/theological position that they advocated is called "Deism".[4][5][6][7] Deism as a distinct philosophical and intellectual movement declined towards the end of the 18th century.[4] Some of its tenets continued to live on as part of other intellectual movements, like Unitarianism, and it continues to have advocates today.[3]

Atheism: a brief history.

 Atheism, in the broadest sense, is an absence of belief in the existence of deities. Less broadly, atheism is a rejection of the belief that any deities exist. In an even narrower sense, atheism is specifically the position that there are no deities. Atheism is contrasted with theism, which in its most general form is the belief that at least one deity exists.


The etymological root for the word atheism originated before the 5th century BCE from the ancient Greek ἄθεος (atheos), meaning "without god(s)". In antiquity, it had multiple uses as a pejorative term applied to those thought to reject the gods worshiped by the larger society, those who were forsaken by the gods, or those who had no commitment to belief in the gods. The term denoted a social category created by orthodox religionists into which those who did not share their religious beliefs were placed. The actual term atheism emerged first in the 16th century. With the spread of freethought, skeptical inquiry, and subsequent increase in criticism of religion, application of the term narrowed in scope. The first individuals to identify themselves using the word atheist lived in the 18th century during the Age of Enlightenment. The French Revolution, noted for its "unprecedented atheism", witnessed the first significant political movement in history to advocate for the supremacy of human reason.

Arguments for atheism range from philosophical to social and historical approaches. Rationales for not believing in deities include the lack of evidence, the problem of evil, the argument from inconsistent revelations, the rejection of concepts that cannot be falsified, and the argument from nonbelief. Nonbelievers contend that atheism is a more parsimonious position than theism and that everyone is born without beliefs in deities; therefore, they argue that the burden of proof lies not on the atheist to disprove the existence of gods but on the theist to provide a rationale for theism. Although some atheists have adopted secular philosophies (e.g. secular humanism), there is no ideology or code of conduct to which all atheists adhere.

Since conceptions of atheism vary, accurate estimations of current numbers of atheists are difficult. According to global WIN/Gallup International studies, 13% of respondents were "convinced atheists" in 2012, 11% were "convinced atheists" in 2015, and in 2017, 9% were "convinced atheists". However, other researchers have advised caution with WIN/Gallup figures since other surveys which have used the same wording for decades and have a bigger sample size have consistently reached lower figures. An older survey by the British Broadcasting Corporation (BBC) in 2004 recorded atheists as comprising 8% of the world's population. Other older estimates have indicated that atheists comprise 2% of the world's population, while the irreligious add a further 12%. According to these polls, Europe and East Asia are the regions with the highest rates of atheism. In 2015, 61% of people in China reported that they were atheists. The figures for a 2010 Eurobarometer survey in the European Union (EU) reported that 20% of the EU population claimed not to believe in "any sort of spirit, God or life force", with France (40%) and Sweden (34%) representing the highest values.

Sikhism: a brief history.

 Sikhism or Sikhi (Punjabi: ਸਿੱਖੀ Sikkhī, [ˈsɪkːʰiː], from ਸਿੱਖ, Sikh, 'disciple', 'seeker', or 'learner') is an Indian Dharmic religion that originated in the Punjab region of the Indian subcontinent around the end of the 15th century CE. Sikhism is one of the youngest of the major religions and the world's fifth-largest organized religion, with about 25–30 million Sikhs as of the early 21st century. However, according to rough estimates, there are around 120–150 million (12–15 crore) Sahajdhari or non-Khalsa Nanakpanthi Sikhs across the world who also believe in the ten Sikh Gurus and the Guru Granth Sahib.


Sikhism developed from the spiritual teachings of Guru Nanak, the first Guru (1469–1539), and of the nine Sikh gurus who succeeded him. The tenth guru, Gobind Singh (1666–1708), named the Sikh scripture Guru Granth Sahib as his successor, bringing to a close the line of human gurus and establishing the scripture as the eleventh and last eternal living guru, a religious spiritual/life guide for Sikhs. Guru Nanak taught that living an "active, creative, and practical life" of "truthfulness, fidelity, self-control and purity" is above metaphysical truth, and that the ideal man "establishes union with God, knows His Will, and carries out that Will". Guru Hargobind, the sixth Sikh Guru (1606–1644), established the concept of mutual co-existence of the miri ('political'/'temporal') and piri ('spiritual') realms.

The Sikh scripture opens with the Mul Mantar (ਮੂਲ ਮੰਤਰ), fundamental prayer about ik onkar (ੴ, 'One God'). The core beliefs of Sikhism, articulated in the Guru Granth Sahib, include faith and meditation on the name of the one creator; divine unity and equality of all humankind; engaging in seva ('selfless service'); striving for justice for the benefit and prosperity of all; and honest conduct and livelihood while living a householder's life. Following this standard, Sikhism rejects claims that any particular religious tradition has a monopoly on Absolute Truth.

Sikhism emphasizes simran (ਸਿਮਰਨ, meditation and remembrance of the teachings of Gurus), which can be expressed musically through kirtan, or internally through naam japna ('meditation on His name') as a means to feel God's presence. It teaches followers to transform the "Five Thieves" (i.e. lust, rage, greed, attachment, and ego).

The religion developed and evolved in times of religious persecution, gaining converts from both Hinduism and Islam. Mughal rulers of India tortured and executed two of the Sikh gurus—Guru Arjan (1563–1605) and Guru Tegh Bahadur (1621–1675)—after they refused to convert to Islam. The persecution of Sikhs triggered the founding of the Khalsa by Guru Gobind Singh in 1699 as an order to protect the freedom of conscience and religion, with members expressing the qualities of a Sant-Sipāhī ('saint-soldier').

The Nicaraguan revolution: a brief history.

 The Nicaraguan Revolution (Spanish: Revolución Nicaragüense or Revolución Popular Sandinista) encompassed the rising opposition to the Somoza dictatorship in the 1960s and 1970s, the campaign led by the Sandinista National Liberation Front (FSLN) to oust the dictatorship in 1978–79, the subsequent efforts of the FSLN to govern Nicaragua from 1979 to 1990, and the Contra War, which was waged between the FSLN-led government of Nicaragua and the United States-backed Contras from 1981 to 1990. The revolution marked a significant period in the history of Nicaragua and revealed the country as one of the major proxy war battlegrounds of the Cold War, attracting much international attention.


The initial overthrow of the Somoza regime in 1978–79 was a bloody affair, and the Contra War of the 1980s took the lives of tens of thousands of Nicaraguans and was the subject of fierce international debate. During the 1980s, both the FSLN (a leftist collection of political parties) and the Contras (a rightist collection of counter-revolutionary groups) received large amounts of aid from the Cold War superpowers (respectively, the Soviet Union and the United States).

The peace process began with the Sapoá Accords in 1988, and the Contra War ended after the signing of the Tela Accord in 1989 and the demobilization of the FSLN and Contra armies. A second election, in 1990, saw a majority of anti-Sandinista parties elected, and the FSLN handed over power.

Monday 13 September 2021

Modern day animism?

 

The Miracle Worker: How Darwinism Dishonors the Enlightenment

Neil Thomas

Editor’s note: This article is excerpted from Taking Leave of Darwin: A Longtime Agnostic Discovers the Case for Design, by Neil Thomas, newly released by Discovery Institute Press.

Life on Earth has traditionally and for good reason been termed the mystery of mysteries, and there is much to ponder in the contention of 19th-century Harvard professor Louis Agassiz that life’s mysteries were no nearer to being solved after the publication of The Origin of Species than they had been before it. Needless to say, neither Darwin nor Wallace would have been minded to see things in that way, because both were responding to their own internalized challenges in the competition to find something no one else had been able to find in generations of evolutionary thinking: a cogent explanation for the diversification of life that made no appeal to a directly interventionist natural theology.1 Both men pressed on, acting as much in the spirit of conquistadors as discoverers. (Darwin famously rushed out the publication of Origin when he feared Wallace might pip him at the post.)

Their goal, albeit not of course explicitly acknowledged (in deference to that laudable Victorian code of gentlemanly reticence which has fallen into disuse in the last half century), was to become recognized as the Lyell of Biology. They thereby hoped to establish the prestige of a discipline which, they were determined, should slough off old-fashioned and discredited biblical notions and so place biology within the prestigious sphere of “pure” science. By analogy with Lyell’s geological work, which had rendered the biblical flood superfluous, the quest of Darwin and Wallace was to render the Christian God superfluous to the rolling out of the universe after the moment of Creation, or at least after the appearance of the first self-reproducing organism. But in order to advance their new paradigm, they were obliged to transfer agency to the process of natural selection which, unfortunately, contained within it an insurmountable problem, as Wallace later acknowledged.

Two Problems with Natural Selection

Actually there were two problems. The less fundamental but hardly trivial one was the lack of empirical evidence for the power of natural selection to generate new forms. Darwin appealed to the success of plant and animal breeders to fill the evidential gap, but even setting aside that artificial selection is purposive and therefore an odd stand-in for a mindless process, there is also the point that Bishop “Soapy Sam” Wilberforce underscored. Contrary to the common caricature of him as an unctuous and obscurantist buffoon, Wilberforce was an Oxford First in Mathematics with a keen interest in natural history and a good working knowledge of animal breeding methods. In his 1860 review of the Origin, he noted that domestic breeders never succeed in breeding a fundamentally new animal form, and what progress they do make always comes with trade-offs. “The bull-dog gains in strength and loses in swiftness; the greyhound gains in swiftness but loses in strength,” wrote Wilberforce. “Even the English race-horse loses much which would enable it in the battle of life to compete with its rougher ancestor.”2

The more fundamental problem is that the “agency” of natural selection invoked by Darwin and, less expansively, by Wallace,3 was held to operate unselectively, with no notion of purpose permitted to obtrude into the multiple revolutions of its biological lottery. It’s not unlike the bonkers situation of a car salesman marketing a car without an engine underneath its hood, the fellow assuring his customer that the car would nevertheless function perfectly well. 

Darwin wanted to have his cake and eat it too. Natural selection is a mindless process; Darwin was adamant about that. Yet he habitually repaired to purposive terminology in his descriptions of it, as when he limned natural selection’s construction of the eye: “We must suppose that there is a power always intently watching each slight accidental alteration of the transparent layers [of the eye]; and carefully selecting each alteration which, under varied circumstances, may in any way, or in any degree, tend to produce a distincter image.”4

One might legitimately ask how it is possible to “intently watch” and “carefully select” unintelligently. That is entirely discrepant with what Darwin elsewhere claimed for the process he invoked. The contradiction points to a more than trivial conceptual confusion, and I would surmise that the very phraseology Darwin uses reveals that he must have had some awareness of the illogicality of his own position, even if only at some barely conscious level of apprehension.

An Apartheid-Like Antipathy

In the course of my professional life in higher education, an area that proudly regards itself as inclusive, it has often struck me as disturbingly contrary to that ideal that some colleagues in science departments betray an almost apartheid-like antipathy to philosophers, with philosophy coming second only to theology in the demonological hierarchy of some of their number. My vague understanding of this antipathy has, however, become much clarified in the course of preparing this volume. For it is philosophers in particular who have typically been the ones responsible for calling out many “mad genius” ideas put forward by representatives of the scientific community — a task for which said philosophers have received few thanks, needless to say — which doubtless explains something of the animus towards them.

The problem with “natural selection” for philosopher Antony Flew was that it no more resembles any kind of conscious selection procedure than “Bombay duck is a species of duck.”5 It has been described as a would-be materialistic but in reality miraculous explanation. As Le Fanu put it, “Darwin’s explanation was in its own way profoundly ‘metaphysical’ in that it attributed to natural selection powers that might reasonably be thought to be miraculous — that it should somehow fashion perfection from a blind, random process, and transform one class of animal into another.”6

Again, we come up against the difficulty of scientists having to impute creative powers to phenomena with no creative capacity, rather like the way Richard Dawkins anthropomorphizes genes as being “selfish” — apportioning perception and decision to inanimate entities quite incapable of any decision or action whatsoever, selfish or unselfish, as philosopher Mary Midgley and others have pointed out,7 some going so far as to accuse him of animism.

Rationalist Principles and the Darwinian Narrative

If the reigning materialist paradigm had even a tolerably convincing weight of evidence behind it, I would be the first to accept it. In fact, I would embrace it wholeheartedly and with a sense of relief, even closure, since it would provide an excellent fit with a prior educational formation which has habitually foregrounded rational, evidence-based criteria. However, it is those very rationalist principles which bid me reject the Darwinian narrative, in its original, neo-Darwinian, and extended manifestations. I find it the grandest historical irony that the most fervent defenders of Darwinism claim to be advancing the ideals of the European Enlightenment. My view is that they are in reality dishonoring the foundational principles of that admirable project by perpetuating a hypothesis without empirical foundation or even the slightest approximation to verisimilitude.

As philosopher Richard Spilsbury once noted, “The basic objection to neo-Darwinism is not that it is speculative, but that it confers miraculous powers on inappropriate agents. In essence, it is an attempt to supernaturalize nature, to endow unthinking processes with more-than-human powers.”8

The case might even be made that the Darwinian narrative can work only by implicitly disregarding the Enlightenment program through its appeal to ways of thought supposed to have died out countless centuries before Darwin was even born. By that I mean that to attribute creative potential to nature itself is a deeply archaic, animistic way of thinking which takes us back even to the paganism of the Homeric age.

In the imaginative works of those early eras, nature through its many deified incarnations is routinely credited with directive capability. Zeus, called the Thunderer by the poet Hesiod in his Theogony, was believed to be able, inter alia, to control the weather; Demeter, the fertility goddess, could exert an influence on the annual crop yield; Aeolus, Keeper of the Winds in the Odyssey, provides a gentle breeze to waft Odysseus back to Ithaca after his long travels. To the ancient Greeks and many peoples who preceded them, the gods were essentially personifications of different aspects of Nature itself. The pre-scientific mind imputed agency to Nature by way of the personification of Nature’s various aspects as individual divinities.

Darwin’s theory of natural selection, although it struck most at the time and even since as an intellectual innovation, appears in reality to be something of a throw-back to those earlier modes of thought. In what seems to be a confirmation of the “nothing new under the sun” adage, Darwin appears, wittingly or not, to have channeled the spirit of the older, polytheistic world by crediting Nature with an infinite number of transformative powers.

Notes

  1. For Wallace at least it would be going too far to say he was long intent on an anti-theistic model. In his 1856 article titled “On the Habits of the Orang-Utan of Borneo,” in Annals and Magazine of Natural History 2nd ser., vol. 17, no. 103, he is already hinting at the theistic direction he was contemplating and would eventually take. Historian Michael Flannery writes, “It is no exaggeration to see this 1856 essay, written in the wake of his Sarawak Law paper the year before and ahead of his famous Ternate letter, as an early creedal statement. It would mark the emergent tenets of his inchoate teleological worldview, which consisted of the following: a non-reductionist, holistic view of nature; an admission of inutility in the plant and animal kingdoms and this given as reasonable evidence of higher and even intelligent causation in nature; a special place for humankind in the appreciation of features beyond mere survival utility such as beauty of form, color, and majesty; and the allowance that all of this may be the intentional expression of a theistic presence or force.” Flannery, Nature’s Prophet: Alfred Russel Wallace and His Evolution from Natural Selection to Natural Theology (Tuscaloosa, AL: University of Alabama Press, 2018), 64.
  2. Samuel Wilberforce, “On the Origin of Species,” Quarterly Review (1860), 237–238.
  3. Regarding natural selection, historian Michael Flannery notes (in private correspondence) that a major point of difference that came to separate Wallace from Darwin was the question of artificial selection’s evidential import. In On the Origin of Species Darwin offered breeding examples as analogous to natural selection. Wallace, from his Ternate paper (1858) on, clearly distinguished between the two, and by implication limited the explanatory power of natural selection. We find this in his paper presented to the Anthropological Society of London in 1864, and it led to his open break with Darwin in 1869.
  4. Darwin, On the Origin of Species, 141–142.
  5. Antony Flew, Darwinian Evolution, 2nd ed. (London: Transaction Publishers, 1997), 25. Bombay duck is a gastronomic delicacy composed of dry, salted fish.
  6. Le Fanu, Why Us?, 107.
  7. On this point see now John Hands, Cosmosapiens: Human Evolution from the Origin of the Universe (London: Duckworth, 2015), 382.
  8. Richard Spilsbury, Providence Lost: A Critique of Darwinism (Oxford: Oxford University Press, 1974), 19.

On Darwinism's 'simple' beginning.

 

The First “Simple” Self-Replicator?

Although no one really claims that science can yet explain how the first living thing arose on Earth, the point of view of many scientists, including Darwin, has been: we can explain how the first simple living things evolved into advanced life forms, including humans, so explaining how life began is a relatively small problem that will surely be solved eventually. 

I do not believe scientists actually have any clue as to how the first living things evolved into intelligent, conscious human beings (see my video “Why Evolution Is Different”). However, to appreciate that the first step is not a small problem you only have to realize that with all our advanced technology we are still not close to designing any type of self-replicating machine. That is still pure science fiction. So how could we imagine that such a machine could have arisen by pure chance?

A Self-Replicating Box

To understand why human-engineered self-replicating machines are so far beyond current human technology, let’s imagine trying to design something as “simple” as a self-replicating cardboard box. Let’s place an empty cardboard box (A) on the floor, and to the right of it let’s construct a box (B) with a box-building factory inside it. I’m not sure exactly what the new box would need to build an empty box, but I assume it would at least have to have some metal parts to cut and fold the cardboard and a motor with a battery to power these parts. In reality, to be really self-replicating like living things, it would have to go get its own cardboard, so maybe it would need wheels and an axe to cut down trees and a small sawmill to make cardboard out of wood. But let’s be generous and assume humans are still around to supply the cardboard. Well, of course box B is not a self-replicating machine, because it only produces an empty box A. 

So, to the right of this box, let’s build another box C which contains a fully automated factory that can produce box B’s. This is a much more complicated box, because this one must manufacture the metal parts for the machinery in box B and its motor and battery and assemble the parts into the factory inside B. In reality it needs to go mine some ore and smelt it to produce these metal parts, but again let’s be very generous and provide it all the metals and other raw materials it needs. 

But box C would still not be a self-replicating machine, because it only produces the much simpler box B. So back to work, now we need to build a box D to its right with a fully automated factory capable of building box C’s with their box B factories. Well, you get the idea, and one begins to wonder if it is even theoretically possible to build a truly self-replicating machine. When we add technology to such a machine to bring it closer to the goal of reproduction, we only move the goalposts, because now we have a more complicated machine to reproduce. Yet we see such machines all around us in the living world. 

Keep Adding Boxes

If we keep adding boxes to the right, each with a fully automated factory that can produce the box to its left, it seems to me that the boxes would grow exponentially in complexity. But maybe I am wrong. Maybe they could be designed to converge eventually to a self-replicating box Z, although I can’t imagine how.
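The regress above can be made concrete with a toy model. This is my own illustrative sketch, not part of the original argument: it simply assumes that a fully automated factory needs some fixed multiple of the parts of whatever it builds (the multiplier of 3 is arbitrary). Under that single assumption, each box in the chain is strictly more complex than the one it produces, so the part counts grow geometrically and never settle down to a self-replicating box Z:

```python
# Toy model of the box regress (purely illustrative assumption:
# a factory needs a fixed multiple of the parts of its product).
OVERHEAD = 3  # hypothetical multiplier; the exact value is not the point


def box_complexity(n, base_parts=10):
    """Part count of the n-th box; box 0 is the empty box A."""
    parts = base_parts
    for _ in range(n):
        parts *= OVERHEAD  # wrap the previous box in a factory that builds it
    return parts


# Boxes A..E under this assumption: 10, 30, 90, 270, 810 parts --
# geometric growth, with no sign of converging to a fixed point.
for n, name in enumerate("ABCDE"):
    print(f"box {name}: ~{box_complexity(n)} parts")
```

Of course, whether a cleverer design could escape this divergence is exactly the open question posed above; the sketch only shows that the naive "factory wrapped around a factory" approach does not converge.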

As I said in “A Summary of the Evidence for Intelligent Design,” if engineers someday do construct a box Z self-replicator, I’m sure it will not happen until long after I am gone, if it ever happens at all. And I’m sure it will not — to put it mildly — be simple. Nevertheless, I can confidently predict that if it ever happens, origin-of-life researchers will announce to the world that science has finally shown that life could have arisen through natural processes. In fact, it will have shown that it could have arisen only through design. 

Guy Fawkes: a brief history.

 Guy Fawkes, also known as Guido Fawkes while fighting for the Spanish, was a member of a group of provincial English Catholics who was involved in the failed Gunpowder Plot of 1605. He was born and educated in York; his father died when Fawkes was eight years old, after which his mother married a recusant Catholic.


Fawkes converted to Catholicism and left for mainland Europe, where he fought for Catholic Spain in the Eighty Years' War against Protestant Dutch reformers in the Low Countries. He travelled to Spain to seek support for a Catholic rebellion in England without success. He later met Thomas Wintour, with whom he returned to England. Wintour introduced him to Robert Catesby, who planned to assassinate King James I and restore a Catholic monarch to the throne. The plotters leased an undercroft beneath the House of Lords; Fawkes was placed in charge of the gunpowder that they stockpiled there. The authorities were prompted by an anonymous letter to search Westminster Palace during the early hours of 5 November, and they found Fawkes guarding the explosives. He was questioned and tortured over the next few days and confessed to wanting to blow up the House of Lords.

Immediately before his execution on 31 January, Fawkes fell from the scaffold where he was to be hanged and broke his neck, thus avoiding the agony of being hanged, drawn and quartered. He became synonymous with the Gunpowder Plot, the failure of which has been commemorated in the UK as Guy Fawkes Night since 5 November 1605; his effigy is traditionally burned on a bonfire, commonly accompanied by fireworks.