
Sunday 13 November 2022

On OBEs

What Really Happens During an Out-of-Body Experience? 

Medically reviewed by Nicole Washington, DO, MPH — By Crystal Raypole — Updated on July 22, 2022 

An out-of-body experience is often described as feeling like you’ve left your physical body. There are many potential causes, including several medical conditions and experiences. 

An out-of-body experience (OBE) is a sensation of your consciousness leaving your body. These episodes are often reported by people who’ve had a near-death experience. Some might also describe an OBE as a dissociative episode.


People typically experience their sense of self inside their physical body. You most likely view the world around you from this vantage point. But during an OBE, you may feel as if you’re outside yourself, looking at your body from another perspective.


What really goes on during an OBE? Does your consciousness actually leave your body? Experts aren’t totally sure, but they have a few hunches, which we’ll get into later.

What does an OBE feel like? 

It’s hard to nail down what an OBE feels like, exactly.


According to accounts from people who’ve experienced them, they generally involve:


a feeling of floating outside your body

an altered perception of the world, such as looking down from a height

the feeling that you’re looking down at yourself from above

a sense that what’s happening is very real

OBEs typically happen without warning and usually don’t last for very long.


If you have a neurological condition, such as epilepsy, you may be more likely to experience OBEs. They may also happen more frequently. But for many people, an OBE will happen very rarely, maybe only once in a lifetime, if at all.


Some estimates suggest around 5 percent of people have experienced the sensations associated with an OBE, though some suggest this number may be higher. 

Does anything happen physically? 

There’s some debate over whether the sensations and perceptions associated with OBEs happen physically or as a sort of hallucinatory experience.


A 2022 review tried to explore this by evaluating a variety of studies and case reports on consciousness, cognitive awareness, and recall in people who survived cardiac arrest.


The authors noted that some people report experiencing a separation from their body during resuscitation, and some even reported an awareness of events they wouldn’t have seen from their actual perspective.


In addition, one study included in the review noted that two participants reported having both visual and auditory experiences while in cardiac arrest. Only one was well enough to follow up, but he gave an accurate, detailed description of what took place for about three minutes of his resuscitation from cardiac arrest.


Still, there’s no scientific evidence to support the idea that a person’s consciousness can actually travel outside the body. 

Veridical perception 

Veridical perception is a controversial concept. It refers to the idea that you can leave your body during an OBE, allowing you to witness something that you may not have otherwise.


Some anecdotal reports of this phenomenon exist, with a few people even providing specific, accurate details about events that happened during surgical procedures or while they were clinically dead.


Many people use these stories as evidence to support the existence of life after death.


However, the idea of veridical perception is still limited to anecdotal claims, and there is no research available to support it.


A 2014 study investigating the validity of veridical perception in people who had survived cardiac arrest found that neither of the two individuals who reported awareness during resuscitation was able to identify specific items that were only viewable from above. 

What can cause them? 

No one’s sure about the exact causes of OBEs, but experts have identified several possible explanations. 

Stress or trauma 

A frightening, dangerous, or difficult situation can provoke a fear response, which might cause you to dissociate from the situation and feel as if you’re an onlooker. This may make you feel as though you are watching the events from somewhere outside your body.


According to 2017 research reviewing the experience of women in labor, OBEs during childbirth aren’t unusual.


The study didn’t specifically link OBEs to post-traumatic stress disorder, but the authors did point out that women who had OBEs had either gone through trauma during labor or another situation not related to childbirth.


This suggests that OBEs could occur as a way to cope with trauma, but more research is needed on this potential link. 

Medical conditions 

Experts have linked several medical and mental health conditions to OBEs, including:


epilepsy

migraine

cardiac arrest

brain injuries

depression

anxiety

Guillain-Barré syndrome

Dissociative disorders, particularly depersonalization-derealization disorder, can involve frequent feelings or episodes where you seem to be observing yourself from outside your body.


Sleep paralysis has also been noted as a possible cause of OBEs. It refers to a temporary state of waking paralysis that occurs during REM sleep and often involves hallucinations.


Research suggests that many people who have OBEs with a near-death experience also experience sleep paralysis.


In addition, a 2020 literature review suggests that sleep-wake disturbances may contribute to dissociative symptoms, which can include a feeling of leaving your body. 

Medication and drugs 

Some people report having an OBE while under the influence of anesthesia.


Other substances, including cannabis, ketamine, and hallucinogenic drugs such as LSD, can cause this sensation. 

Near-death experiences 

OBEs can occur during near-death experiences, often alongside other phenomena like flashbacks of previous memories or seeing a light at the end of a tunnel.


Though it’s not clear exactly why this happens, it’s believed to be caused by disruptions in certain areas of the brain involved with processing sensory information. A 2021 review suggests that these experiences may be more likely to occur during life-threatening situations, which can include:


cardiac arrest

traumatic injury

brain hemorrhage

drowning

suffocation 

Strong G-forces 

Pilots and astronauts sometimes experience OBEs when they encounter strong gravitational forces, or G-forces. These forces cause blood to pool in the lower body, which can lead to loss of consciousness and may induce an OBE.


Extreme G-forces can also cause spatial disorientation, peripheral vision loss, and disconnection between cognition and the ability to act. 


Paranormal 

Though not backed by research, some people believe that OBEs can occur when your soul or spirit leaves your body.


One form is known as “traveling clairvoyance,” which some mediums claim allows your soul to visit distant locations in order to gain information.


Others believe that certain meditative practices can help you reach a state of consciousness that transcends the body and mind, leading to an OBE.


Some people also experiment with astral projection, which is a spiritual practice that involves making an intentional effort to send your consciousness from your body toward a spiritual plane or dimension.


However, research has not been able to show that these practices cause OBEs. 

Other experiences 

OBEs might be induced, intentionally or accidentally, by:


brain stimulation

sleep deprivation

sensory deprivation

hypnosis or meditative trance

However, additional research is still needed to support this. 

Do out-of-body experiences pose any risks? 

Existing research hasn’t connected spontaneous OBEs to any serious health risks. In some cases, you might feel a bit dizzy or disoriented afterward.


However, OBEs and dissociation in general can cause lingering feelings of emotional distress.


You might feel confused over what happened or wonder if you have a brain issue or mental health condition. You might also not like the sensation of an OBE and worry about it happening again.


Some people also claim that it’s possible for your consciousness to remain trapped outside of your body following an OBE, but there’s no evidence to support this. 

Should I see a doctor? 

Simply having an OBE doesn’t necessarily mean you need to see a healthcare professional. You may have this experience once just before drifting off to sleep, for example, and never again. If you don’t have any other symptoms, you probably don’t have any reason for concern.


If you feel uneasy about what happened, even if you don’t have any physical or psychological conditions, there’s no harm in mentioning the experience to a doctor. They may be able to help by ruling out serious conditions or offering some reassurance.


It’s also a good idea to talk with a healthcare professional if you’re having any sleep issues, including insomnia or symptoms of sleep paralysis, such as hallucinations.




Saturday 12 November 2022

Darwinists have got circular argumentation down to a science?

Evolution’s Circular Web of Self-Referencing Literature 

Cornelius Hunter 

Evolutionists believe evolution is true. As justification, they cite previous studies. But those previous studies were done by other evolutionists who, yes, believe evolution is true. The studies do not confirm evolution — they interpret the evidence according to evolutionary theory, no matter how much the evidence contradicts the theory. So, citing those previous studies does little to justify the belief in evolution.


It is a circular web of self-referencing literature. The blind lead the blind. Here is an example. For years Joe Thornton has been claiming proteins evolved. See, for instance, “Simple mechanisms for the evolution of protein complexity,” from Protein Science.


As his starting point in the paper, Thornton cites several previous works, falsely claiming that they demonstrate evolution. One of his citations is a paper, “Protein folds, functions and evolution,” from 1999, when I was working on my doctorate in this area.


This 1999 paper is cited to support the claim in the Thornton paper that “During the last ~3.8 billion years, evolution has generated proteins with thousands of different folds.” But the 1999 study demonstrates no such thing — not even close. This is not controversial; there is no debate. This is simply a false citation. It is another example of the web of false, self-referencing literature.  

Another Citation 

Here is another citation in the Thornton paper: “Eye evolution and its functional basis,” by Dan Nilsson from 2013, in the journal Visual Neuroscience. This 2013 paper is cited to support the claim in the Thornton paper that the evolution of the vertebrate eye has been proven. But the 2013 Nilsson paper proves no such thing. Again, Nilsson takes evolution as his starting point. He presupposes evolution is true and works from there. Nowhere does Nilsson demonstrate that the evolution of the eye is likely or even could have occurred.


Nilsson has been doing this for years, going back to his 1994 paper, “A pessimistic estimate of the time required for an eye to evolve,” in Proceedings of the Royal Society B.

Not Whether, but How Fast 

That 1994 paper explicitly stated (in the first paragraph) that the question is no longer whether the eye evolved, but how fast it evolved. Nonetheless, the paper was heavily promoted (and mischaracterized) by evolution promoter Richard Dawkins. For years after that, the paper was falsely cited as proof that the eye evolved, no question about it. If you like videos, Nilsson reviews his work in a 2019 presentation.

Nilsson does very little original biology work. Instead, he offers evolutionary just-so stories. His work is something of a poster child for this false citation pseudoscience problem. The new Thornton paper is yet another example of how pervasive the problem is, and how vacuous is evolutionary science.


The formula goes like this: 1. Evolution is true. 2. Here’s how it must have happened. 3. Look, yet more proof of evolution.


This post is adapted from Dr. Hunter’s comments on Twitter.

 

 

The design filter can spot a dirty game?

Did Chess Ace Hans Niemann Cheat? A Design Detection Poser 

Evolution News @DiscoveryCSC 

On a new episode of ID the Future, mathematician William Dembski and host Eric Anderson explore whether design detection tools shed any light on the recent chess scandal involving world chess champion Magnus Carlsen and American grandmaster Hans Moke Niemann. Did Niemann cheat in a match where he beat Carlsen, as some have claimed? There is no smoking gun in the case, so how might one determine if cheating occurred? At first glance the problem might seem far removed from the design-detecting rules and tools Dembski laid out in his Cambridge University Press monograph The Design Inference. But actually there is some intriguing overlap. Is there a way to dig into the chess data and determine whether Niemann secretly used a computer chess engine to help him win the match? Tune in as Dembski and Anderson wrestle with the problem. Download the podcast or listen to it here. 
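For a sense of what "digging into the chess data" can mean in practice, one statistic often cited in chess cheating investigations is a player's engine match rate: the fraction of the player's moves that coincide with a strong engine's first choice. The sketch below is a minimal, hypothetical illustration of that statistic, not Dembski's design-inference method; it assumes the python-chess library is installed and a Stockfish binary is available on the PATH as "stockfish", and the function name is invented for this example.

```python
# Minimal sketch: how often do a player's moves match an engine's top choice?
# Assumptions: python-chess installed; Stockfish reachable as "stockfish".
import chess
import chess.engine
import chess.pgn


def engine_match_rate(pgn_path: str, player: str, depth: int = 15) -> float:
    """Return the fraction of `player`'s moves equal to the engine's best move."""
    matches = 0
    total = 0
    engine = chess.engine.SimpleEngine.popen_uci("stockfish")
    try:
        with open(pgn_path) as f:
            while True:
                game = chess.pgn.read_game(f)
                if game is None:
                    break
                # Assumes `player` appears as White or Black in every game.
                color = chess.WHITE if game.headers.get("White") == player else chess.BLACK
                board = game.board()
                for move in game.mainline_moves():
                    if board.turn == color:
                        info = engine.analyse(board, chess.engine.Limit(depth=depth))
                        if info["pv"][0] == move:  # engine's principal-variation move
                            matches += 1
                        total += 1
                    board.push(move)
    finally:
        engine.quit()
    return matches / total if total else 0.0
```

A raw match rate proves nothing by itself; serious analyses compare it against the rates of comparable players in comparable positions, which is exactly where Dembski-style reasoning about specification and small probabilities would come into play.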



 

1914: a marked year. II

Legacy of World War I  

BY HISTORY.COM EDITORS 

World War I Begins 

Convinced that Austria-Hungary was readying for war, the Serbian government ordered the Serbian army to mobilize and appealed to Russia for assistance. On July 28, Austria-Hungary declared war on Serbia, and the tenuous peace between Europe’s great powers quickly collapsed.


Within a week, Russia, Belgium, France, Great Britain and Serbia had lined up against Austria-Hungary and Germany, and World War I had begun. 

Legacy of World War I 

World War I brought about massive social upheaval, as millions of women entered the workforce to replace men who went to war and those who never came back. The first global war also helped to spread one of the world’s deadliest global pandemics, the Spanish flu epidemic of 1918, which killed an estimated 20 to 50 million people.


World War I has also been referred to as “the first modern war.” Many of the technologies now associated with military conflict—machine guns, tanks, aerial combat and radio communications—were introduced on a massive scale during World War I.


The severe effects that chemical weapons such as mustard gas and phosgene had on soldiers and civilians during World War I galvanized public and military attitudes against their continued use. The Geneva Protocol, signed in 1925, restricted the use of chemical and biological agents in warfare and remains in effect today.


Friday 11 November 2022

An exclusive category

Because my thirst for self-flagellation apparently knows no bounds, I've been looking at what Christendom's theologians call the threeness-oneness problem of the trinity, i.e., how can God be three and yet one? The usual fudge is to state that the term 'God', as applied to each of the three persons subsisting within the shared essence, is an adjective and not a count noun, but that the God in which they all simultaneously subsist and with whom/what(?) they are supposedly numerically identical is indeed a concrete reality. I reject this characterisation of the issue. 

The issue is one of identity, not primarily arithmetic, according to the scriptures. There is one God who is entitled to exclusive devotion. 

Deuteronomy 5:6, 7 ASV: "6 I am JEHOVAH thy God, who brought thee out of the land of Egypt, out of the house of bondage.


7 Thou shalt have no other gods before me." 

Psalm 83:18 ASV: "18 That they may know that thou alone, whose name is Jehovah, art the Most High over all the earth."

Thus there is a person who alone is entitled to our absolute devotion, to the exclusion of all others, i.e., anyone/anything not identical to said person. The issue, then, is who this person is. Once we have identified this person, all others would be excluded from the category of Most High God by definition. The scriptures make it clear that this one is both a God and the God; thus one cannot be identical to this one and not be a God, as Christendom's theologians senselessly claim about the members of their triune God. In scripture ONLY the God of Jesus Christ, i.e., JEHOVAH, is ever referred to as the God without qualification. And he is definitely a God.

Deuteronomy 5:9 ASV: "9 thou shalt not bow down thyself unto them, nor serve them; for I, JEHOVAH, thy God, am A jealous God," 

Thus the claim that the God and Father of Jesus Christ is not a God in his own right is falsified. Indeed, he is the only God entitled to our absolute devotion.


J. Robert Oppenheimer: a brief history.

J. Robert Oppenheimer[note 1] (/ˈɒpənˌhaɪmər/; April 22, 1904 – February 18, 1967) was an American theoretical physicist. A professor of physics at the University of California, Berkeley, Oppenheimer was the wartime head of the Los Alamos Laboratory and is often credited as the "father of the atomic bomb" for his role in the Manhattan Project – the World War II undertaking that developed the first nuclear weapons. Oppenheimer was among those who observed the Trinity test in New Mexico, where the first atomic bomb was successfully detonated on July 16, 1945. He later remarked that the explosion brought to mind words from the Bhagavad Gita: "Now I am become Death, the destroyer of worlds."[2][note 2] In August 1945, the weapons were used in the atomic bombings of Hiroshima and Nagasaki. 

After the war ended, Oppenheimer became chairman of the influential General Advisory Committee of the newly created United States Atomic Energy Commission. He used that position to lobby for international control of nuclear power to avert nuclear proliferation and a nuclear arms race with the Soviet Union. He opposed the development of the hydrogen bomb during a 1949–1950 governmental debate on the question and subsequently took stances on defense-related issues that provoked the ire of some factions in the U.S. government and military. During the Second Red Scare, those stances, together with past associations Oppenheimer had with people and organizations affiliated with the Communist Party, led to him suffering the revocation of his security clearance in a much-written-about hearing in 1954. Effectively stripped of his direct political influence, he continued to lecture, write, and work in physics. Nine years later, President John F. Kennedy awarded (and Lyndon B. Johnson presented) him with the Enrico Fermi Award as a gesture of political rehabilitation.


Oppenheimer's achievements in physics included the Born–Oppenheimer approximation for molecular wave functions, work on the theory of electrons and positrons, the Oppenheimer–Phillips process in nuclear fusion, and the first prediction of quantum tunneling. With his students he also made important contributions to the modern theory of neutron stars and black holes, as well as to quantum mechanics, quantum field theory, and the interactions of cosmic rays. As a teacher and promoter of science, he is remembered as a founding father of the American school of theoretical physics that gained world prominence in the 1930s. After World War II, he became director of the Institute for Advanced Study in Princeton, New Jersey. 

Childhood and education

J. Robert Oppenheimer was born in New York City on April 22, 1904,[note 1][7] to Ella (née Friedman), a painter, and Julius Seligmann Oppenheimer, a wealthy textile importer. Born in Hanau, Hesse-Nassau, Prussia, Germany, Julius came to the United States as a teenager in 1888 with few resources, no money, no baccalaureate studies, and no knowledge of the English language. He was hired by a textile company and within a decade was an executive there, eventually becoming wealthy.[8] The Oppenheimers were both secular Ashkenazi Jews; his father was German Jewish, and his mother, who was from New York, descended from a German Jewish family that had lived in the U.S. since the 1840s.[9] In 1912, the family moved to an apartment on the 11th floor of 155 Riverside Drive, near West 88th Street, Manhattan, an area known for luxurious mansions and townhouses.[7] Their art collection included works by Pablo Picasso and Édouard Vuillard, and at least three original paintings by Vincent van Gogh.[10] Robert had a younger brother, Frank, who also became a physicist.[11]


Oppenheimer was initially educated at Alcuin Preparatory School; in 1911, he entered the Ethical Culture Society School.[12] This had been founded by Felix Adler to promote a form of ethical training based on the Ethical Culture movement, whose motto was "Deed before Creed". His father had been a member of the Society for many years, serving on its board of trustees from 1907 to 1915.[13] Oppenheimer was a versatile scholar, interested in English and French literature, and particularly in mineralogy.[14] He completed the third and fourth grades in one year and skipped half of the eighth grade.[12] During his final year, he became interested in chemistry.[15] He entered Harvard College one year after graduation, at age 18, because he suffered an attack of colitis while prospecting in Joachimstal during a family summer vacation in Europe. To help him recover from the illness, his father enlisted the help of his English teacher Herbert Smith who took him to New Mexico, where Oppenheimer fell in love with horseback riding and the southwestern United States.[16] 

Oppenheimer majored in chemistry, but Harvard required science students to also study history, literature, and philosophy or mathematics. He compensated for his late start by taking six courses each term and was admitted to the undergraduate honor society Phi Beta Kappa. In his first year, he was admitted to graduate standing in physics on the basis of independent study, which meant he was not required to take the basic classes and could enroll instead in advanced ones. He was attracted to experimental physics by a course on thermodynamics that was taught by Percy Bridgman. He graduated summa cum laude in three years.[17] 

Studies in Europe 

In 1924, Oppenheimer was informed that he had been accepted into Christ's College, Cambridge. He wrote to Ernest Rutherford requesting permission to work at the Cavendish Laboratory. Bridgman provided Oppenheimer with a recommendation, which conceded that Oppenheimer's clumsiness in the laboratory made it apparent his forte was not experimental but rather theoretical physics. Rutherford was unimpressed, but Oppenheimer went to Cambridge in the hope of landing another offer.[18] He was ultimately accepted by J. J. Thomson on condition that he complete a basic laboratory course.[19] He developed an antagonistic relationship with his tutor, Patrick Blackett, who was only a few years his senior. While on vacation, as recalled by his friend Francis Fergusson, Oppenheimer once confessed that he had left an apple doused with noxious chemicals on Blackett's desk. While Fergusson's account is the only detailed version of this event, Oppenheimer's parents were alerted by the university authorities who considered placing him on probation, a fate prevented by his parents successfully lobbying the authorities.[20]


Oppenheimer was a tall, thin chain smoker,[21] who often neglected to eat during periods of intense thought and concentration. Many of his friends described him as having self-destructive tendencies. A disturbing event occurred when he took a vacation from his studies in Cambridge to meet up with Fergusson in Paris. Fergusson noticed that Oppenheimer was not well. To help distract him from his depression, Fergusson told Oppenheimer that he (Fergusson) was to marry his girlfriend Frances Keeley. Oppenheimer did not take the news well. He jumped on Fergusson and tried to strangle him. Although Fergusson easily fended off the attack, the episode convinced him of Oppenheimer's deep psychological troubles. Throughout his life, Oppenheimer was plagued by periods of depression,[22][23] and he once told his brother, "I need physics more than friends".[24]


In 1926, Oppenheimer left Cambridge for the University of Göttingen to study under Max Born. Göttingen was one of the world's leading centers for theoretical physics. Oppenheimer made friends who went on to great success, including Werner Heisenberg, Pascual Jordan, Wolfgang Pauli, Paul Dirac, Enrico Fermi and Edward Teller. He was known for being too enthusiastic in discussion, sometimes to the point of taking over seminar sessions.[25] This irritated some of Born's other students so much that Maria Goeppert presented Born with a petition signed by herself and others threatening a boycott of the class unless he made Oppenheimer quiet down. Born left it out on his desk where Oppenheimer could read it, and it was effective without a word being said.[26]


He obtained his Doctor of Philosophy degree in March 1927 at age 23, supervised by Born.[27] After the oral exam, James Franck, the professor administering, reportedly said, "I'm glad that's over. He was on the point of questioning me."[4] Oppenheimer published more than a dozen papers at Göttingen, including many important contributions to the new field of quantum mechanics. He and Born published a famous paper on the Born–Oppenheimer approximation, which separates nuclear motion from electronic motion in the mathematical treatment of molecules, allowing nuclear motion to be neglected to simplify calculations. It remains his most cited work.[28] 
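For readers who want the gist in symbols, a standard modern statement of the approximation (a textbook rendering, not a quotation from the 1927 paper) factorizes the molecular wave function so the electronic problem can be solved with the nuclei held fixed:

```latex
% Born–Oppenheimer factorization: electrons (coordinates r) are treated
% with the nuclei (coordinates R) clamped in place.
\Psi_{\mathrm{mol}}(\mathbf{r},\mathbf{R}) \approx \psi_{\mathrm{elec}}(\mathbf{r};\mathbf{R})\,\chi_{\mathrm{nuc}}(\mathbf{R})
```

Solving the electronic equation at each fixed nuclear geometry yields an energy that then acts as the potential surface on which the nuclei move, which is what makes most practical molecular calculations tractable.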

Early professional work 

Educational work

Oppenheimer was awarded a United States National Research Council fellowship to the California Institute of Technology (Caltech) in September 1927. Bridgman also wanted him at Harvard, so a compromise was reached whereby he split his fellowship for the 1927–28 academic year between Harvard in 1927 and Caltech in 1928.[29] At Caltech he struck up a close friendship with Linus Pauling, and they planned to mount a joint attack on the nature of the chemical bond, a field in which Pauling was a pioneer, with Oppenheimer supplying the mathematics and Pauling interpreting the results. Both the collaboration and their friendship ended when Pauling began to suspect Oppenheimer of becoming too close to his wife, Ava Helen Pauling. Once, when Pauling was at work, Oppenheimer had arrived at their home and invited Ava Helen to join him on a tryst in Mexico. Though she refused and reported the incident to her husband,[30] the invitation, and her apparent nonchalance about it, disquieted Pauling and he ended his relationship with Oppenheimer. Oppenheimer later invited him to become head of the Chemistry Division of the Manhattan Project, but Pauling refused, saying he was a pacifist.[31]


In the autumn of 1928, Oppenheimer visited Paul Ehrenfest's institute at the University of Leiden, the Netherlands, where he impressed by giving lectures in Dutch, despite having little experience with the language. There he was given the nickname of Opje,[32] later anglicized by his students as "Oppie".[33] From Leiden he continued on to the Swiss Federal Institute of Technology (ETH) in Zurich to work with Wolfgang Pauli on quantum mechanics and the continuous spectrum. Oppenheimer respected and liked Pauli and may have emulated his personal style as well as his critical approach to problems.[34] 

On returning to the United States, Oppenheimer accepted an associate professorship from the University of California, Berkeley, where Raymond T. Birge wanted him so badly that he expressed a willingness to share him with Caltech.[31]


Before he began his Berkeley professorship, Oppenheimer was diagnosed with a mild case of tuberculosis and spent some weeks with his brother Frank at a New Mexico ranch, which he leased and eventually purchased. When he heard the ranch was available for lease, he exclaimed, "Hot dog!", and later called it Perro Caliente, literally "hot dog" in Spanish.[35] Later he used to say that "physics and desert country" were his "two great loves".[36] He recovered from tuberculosis and returned to Berkeley, where he prospered as an advisor and collaborator to a generation of physicists who admired him for his intellectual virtuosity and broad interests. His students and colleagues saw him as mesmerizing: hypnotic in private interaction, but often frigid in more public settings. His associates fell into two camps: one that saw him as an aloof and impressive genius and aesthete, the other that saw him as a pretentious and insecure poseur.[37] His students almost always fell into the former category, adopting his walk, speech, and other mannerisms, and even his inclination for reading entire texts in their original languages.[38] Hans Bethe said of him: 

Probably the most important ingredient he brought to his teaching was his exquisite taste. He always knew what were the important problems, as shown by his choice of subjects. He truly lived with those problems, struggling for a solution, and he communicated his concern to the group. In its heyday, there were about eight or ten graduate students in his group and about six Post-doctoral Fellows. He met this group once a day in his office and discussed with one after another the status of the student's research problem. He was interested in everything, and in one afternoon they might discuss quantum electrodynamics, cosmic rays, electron pair production and nuclear physics.[39] 

He worked closely with Nobel Prize-winning experimental physicist Ernest O. Lawrence and his cyclotron pioneers, helping them understand the data their machines were producing at the Lawrence Berkeley National Laboratory.[40] In 1936, Berkeley promoted him to full professor at a salary of $3,300 a year (equivalent to $64,000 in 2021). In return he was asked to curtail his teaching at Caltech, so a compromise was reached whereby Berkeley released him for six weeks each year, enough to teach one term at Caltech.[41] 

Scientific work 

Oppenheimer did important research in theoretical astronomy (especially as related to general relativity and nuclear theory), nuclear physics, spectroscopy, and quantum field theory, including its extension into quantum electrodynamics. The formal mathematics of relativistic quantum mechanics also attracted his attention, although he doubted its validity. His work predicted many later finds, which include the neutron, meson and neutron star.[42]


Initially, his major interest was the theory of the continuous spectrum and his first published paper, in 1926, concerned the quantum theory of molecular band spectra. He developed a method to carry out calculations of its transition probabilities. He calculated the photoelectric effect for hydrogen and X-rays, obtaining the absorption coefficient at the K-edge. His calculations accorded with observations of the X-ray absorption of the sun, but not helium. Years later it was realized that the sun was largely composed of hydrogen and that his calculations were indeed correct.[43][44] 

Oppenheimer also made important contributions to the theory of cosmic ray showers and started work that eventually led to descriptions of quantum tunneling. In 1931, he co-wrote a paper on the "Relativistic Theory of the Photoelectric Effect" with his student Harvey Hall,[45] in which, based on empirical evidence, he correctly disputed Dirac's assertion that two of the energy levels of the hydrogen atom have the same energy. Subsequently, one of his doctoral students, Willis Lamb, determined that this was a consequence of what became known as the Lamb shift, for which Lamb was awarded the Nobel Prize in physics in 1955.[42]


With his first doctoral student, Melba Phillips, Oppenheimer worked on calculations of artificial radioactivity under bombardment by deuterons. When Ernest Lawrence and Edwin McMillan bombarded nuclei with deuterons they found the results agreed closely with the predictions of George Gamow, but when higher energies and heavier nuclei were involved, the results did not conform to the theory. In 1935, Oppenheimer and Phillips worked out a theory—now known as the Oppenheimer–Phillips process—to explain the results; this theory is still in use today.[46]
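Schematically, the Oppenheimer–Phillips process is a (d,p) stripping reaction: the loosely bound deuteron polarizes near the target nucleus, its neutron is captured, and its proton is turned back by the Coulomb barrier, so the reaction proceeds at lower bombarding energies than full deuteron capture would require. The notation below is a textbook rendering, not taken from the 1935 paper:

```latex
% Deuteron stripping: target nucleus X gains one neutron (A -> A+1, Z unchanged)
% while the proton escapes.
{}^{A}_{Z}\mathrm{X} + {}^{2}_{1}\mathrm{H} \longrightarrow {}^{A+1}_{Z}\mathrm{X} + {}^{1}_{1}\mathrm{H}
```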


As early as 1930, Oppenheimer wrote a paper that essentially predicted the existence of the positron. This was after a paper by Paul Dirac proposed that electrons could have both a positive charge and negative energy. Dirac's paper introduced an equation, known as the Dirac equation, which unified quantum mechanics, special relativity and the then-new concept of electron spin, to explain the Zeeman effect.[47] Oppenheimer, drawing on the body of experimental evidence, rejected the idea that the predicted positively charged electrons were protons. He argued that they would have to have the same mass as an electron, whereas experiments showed that protons were much heavier than electrons. Two years later, Carl David Anderson discovered the positron, for which he received the 1936 Nobel Prize in Physics.[48]


In the late 1930s, Oppenheimer became interested in astrophysics, most likely through his friendship with Richard Tolman, resulting in a series of papers. In the first of these, a 1938 paper co-written with Robert Serber entitled "On the Stability of Stellar Neutron Cores",[49] Oppenheimer explored the properties of white dwarfs. This was followed by a paper co-written with one of his students, George Volkoff, "On Massive Neutron Cores",[50] in which they demonstrated that there was a limit, the so-called Tolman–Oppenheimer–Volkoff limit, to the mass of stars beyond which they would not remain stable as neutron stars and would undergo gravitational collapse. Finally, in 1939, Oppenheimer and another of his students, Hartland Snyder, produced a paper "On Continued Gravitational Contraction",[51] which predicted the existence of what are today known as black holes. After the Born–Oppenheimer approximation paper, these papers remain his most cited, and were key factors in the rejuvenation of astrophysical research in the United States in the 1950s, mainly by John A. Wheeler.[52]
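For context, the limit comes from the general-relativistic equation of hydrostatic equilibrium now called the Tolman–Oppenheimer–Volkoff equation, shown here in its standard modern form rather than the 1939 notation:

```latex
% TOV equation: P(r) = pressure, rho(r) = mass-energy density,
% m(r) = mass within radius r. The relativistic correction factors
% reduce to Newtonian hydrostatic equilibrium as c -> infinity.
\frac{dP}{dr} = -\frac{G\left[\rho + P/c^{2}\right]\left[m(r) + 4\pi r^{3}P/c^{2}\right]}{r^{2}\left[1 - 2Gm(r)/(rc^{2})\right]}
```

For a given equation of state relating pressure and density there is a maximum mass above which no static solution exists; Oppenheimer and Volkoff's original free-neutron-gas estimate was roughly 0.7 solar masses, while modern estimates place the limit at roughly two to three solar masses.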


Oppenheimer's papers were considered difficult to understand even by the standards of the abstract topics he was expert in. He was fond of using elegant, if extremely complex, mathematical techniques to demonstrate physical principles, though he was sometimes criticized for making mathematical mistakes, presumably out of haste. "His physics was good", said his student Snyder, "but his arithmetic awful".[42]


After World War II, Oppenheimer published only five scientific papers, one of which was in biophysics, and none after 1950. Murray Gell-Mann, a later Nobelist who, as a visiting scientist, worked with him at the Institute for Advanced Study in 1951, offered this opinion: 

He didn't have Sitzfleisch, 'sitting flesh,' when you sit on a chair. As far as I know, he never wrote a long paper or did a long calculation, anything of that kind. He didn't have patience for that; his own work consisted of little aperçus, but quite brilliant ones. But he inspired other people to do things, and his influence was fantastic.[53] 

Oppenheimer's diverse interests sometimes interrupted his focus on science. He liked things that were difficult, and since much of the scientific work appeared easy for him, he developed an interest in the mystical and the cryptic. In 1933, he learned Sanskrit and met the Indologist Arthur W. Ryder at Berkeley. He eventually read the Bhagavad Gita and the Upanishads in the original Sanskrit, and deeply pondered over them. He later cited the Gita as one of the books that most shaped his philosophy of life.[54][55]


His close confidant and colleague, Nobel Prize winner Isidor Rabi, later gave his own interpretation: 

Oppenheimer was overeducated in those fields, which lie outside the scientific tradition, such as his interest in religion, in the Hindu religion in particular, which resulted in a feeling of mystery of the universe that surrounded him like a fog. He saw physics clearly, looking toward what had already been done, but at the border he tended to feel there was much more of the mysterious and novel than there actually was ... [he turned] away from the hard, crude methods of theoretical physics into a mystical realm of broad intuition.[56] 

In spite of this, observers such as Nobel Prize-winning physicist Luis Alvarez have suggested that if he had lived long enough to see his predictions substantiated by experiment, Oppenheimer might have won a Nobel Prize for his work on gravitational collapse, concerning neutron stars and black holes.[57][58] In retrospect, some physicists and historians consider this to be his most important contribution, though it was not taken up by other scientists in his own lifetime.[59] The physicist and historian Abraham Pais once asked Oppenheimer what he considered to be his most important scientific contributions; Oppenheimer cited his work on electrons and positrons, not his work on gravitational contraction.[60] Oppenheimer was nominated for the Nobel Prize for physics three times, in 1946, 1951 and 1967, but never won.[61][62] 

Los Alamos 

On October 9, 1941, two months before the United States entered World War II, President Franklin D. Roosevelt approved a crash program to develop an atomic bomb.[91] In May 1942, National Defense Research Committee Chairman James B. Conant, who had been one of Oppenheimer's lecturers at Harvard, invited Oppenheimer to take over work on fast neutron calculations, a task that Oppenheimer threw himself into with full vigor. He was given the title "Coordinator of Rapid Rupture", which specifically referred to the propagation of a fast neutron chain reaction in an atomic bomb. One of his first acts was to host a summer school for bomb theory at his building in Berkeley. The mix of European physicists and his own students—a group including Robert Serber, Emil Konopinski, Felix Bloch, Hans Bethe and Edward Teller—kept themselves busy by calculating what needed to be done, and in what order, to make the bomb.[92] 

In June 1942, the US Army established the Manhattan Project to handle its part in the atom bomb project and began the process of transferring responsibility from the Office of Scientific Research and Development to the military.[94] In September, Brigadier General Leslie R. Groves Jr. was appointed director of what became known as the Manhattan Project.[95] He selected Oppenheimer to head the project's secret weapons laboratory. This was a choice that surprised many because Oppenheimer had left-wing political views and no record as a leader of large projects. Groves was concerned by the fact that Oppenheimer did not have a Nobel Prize and might not have had the prestige to direct fellow scientists.[96] However, he was impressed by Oppenheimer's singular grasp of the practical aspects of designing and constructing an atomic bomb, and by the breadth of his knowledge. As a military engineer, Groves knew that this would be vital in an interdisciplinary project that would involve not just physics, but chemistry, metallurgy, ordnance and engineering. Groves also detected in Oppenheimer something that many others did not, an "overweening ambition" that Groves reckoned would supply the drive necessary to push the project to a successful conclusion. Isidor Rabi considered the appointment "a real stroke of genius on the part of General Groves, who was not generally considered to be a genius".[97]


Oppenheimer and Groves decided that for security and cohesion they needed a centralized, secret research laboratory in a remote location. Scouting for a site in late 1942, Oppenheimer was drawn to New Mexico, not far from his ranch. On November 16, 1942, Oppenheimer, Groves and others toured a prospective site. Oppenheimer feared that the high cliffs surrounding the site would make his people feel claustrophobic, while the engineers were concerned with the possibility of flooding. He then suggested and championed a site that he knew well: a flat mesa near Santa Fe, New Mexico, which was the site of a private boys' school called the Los Alamos Ranch School. The engineers were concerned about the poor access road and the water supply but otherwise felt that it was ideal.[98] The Los Alamos Laboratory was built on the site of the school, taking over some of its buildings, while many new buildings were erected in great haste. At the laboratory, Oppenheimer assembled a group of the top physicists of the time, which he referred to as the "luminaries".[99]


Los Alamos was initially supposed to be a military laboratory, and Oppenheimer and other researchers were to be commissioned into the Army. He went so far as to order himself a lieutenant colonel's uniform and take the Army physical test, which he failed. Army doctors considered him underweight at 128 pounds (58 kg), diagnosed his chronic cough as tuberculosis and were concerned about his chronic lumbosacral joint pain.[100] The plan to commission scientists fell through when Robert Bacher and Isidor Rabi balked at the idea. Conant, Groves, and Oppenheimer devised a compromise whereby the laboratory was operated by the University of California under contract to the War Department.[101] It soon turned out that Oppenheimer had hugely underestimated the magnitude of the project; Los Alamos grew from a few hundred people in 1943 to over 6,000 in 1945.[100]


Oppenheimer at first had difficulty with the organizational division of large groups, but rapidly learned the art of large-scale administration after he took up permanent residence on the mesa. He was noted for his mastery of all scientific aspects of the project and for his efforts to control the inevitable cultural conflicts between scientists and the military. He was an iconic figure to his fellow scientists, as much a symbol of what they were working toward as a scientific director. Victor Weisskopf put it thus: 

Oppenheimer directed these studies, theoretical and experimental, in the real sense of the words. Here his uncanny speed in grasping the main points of any subject was a decisive factor; he could acquaint himself with the essential details of every part of the work. He did not direct from the head office. He was intellectually and physically present at each decisive step. He was present in the laboratory or in the seminar rooms, when a new effect was measured, when a new idea was conceived. It was not that he contributed so many ideas or suggestions; he did so sometimes, but his main influence came from something else. It was his continuous and intense presence, which produced a sense of direct participation in all of us; it created that unique atmosphere of enthusiasm and challenge that pervaded the place throughout its time.[102] 

At this point in the war, there was considerable anxiety among the scientists that the Germans might be making faster progress on an atomic weapon than they were.[103][104] In a letter dated May 25, 1943, Oppenheimer responded to a proposal from Fermi to use radioactive materials to poison German food supplies. Oppenheimer asked Fermi whether he could produce enough strontium without letting too many in on the secret. Oppenheimer continued, "I think we should not attempt a plan unless we can poison food sufficient to kill a half a million men."[105] 

In 1943 development efforts were directed to a plutonium gun-type fission weapon called "Thin Man". Initial research on the properties of plutonium was done using cyclotron-generated plutonium-239, which was extremely pure but could only be created in tiny amounts. When Los Alamos received the first sample of plutonium from the X-10 Graphite Reactor in April 1944 a problem was discovered: reactor-bred plutonium had a higher concentration of plutonium-240, making it unsuitable for use in a gun-type weapon.[106] In July 1944, Oppenheimer abandoned the gun design in favor of an implosion-type weapon. Using chemical explosive lenses, a sub-critical sphere of fissile material could be squeezed into a smaller and denser form. The metal needed to travel only very short distances, so the critical mass would be assembled in much less time.[107] In August 1944 Oppenheimer implemented a sweeping reorganization of the Los Alamos laboratory to focus on implosion.[108] He concentrated the development efforts on the gun-type device, a simpler design that only had to work with uranium-235, in a single group, and this device became Little Boy in February 1945.[109] After a mammoth research effort, the more complex design of the implosion device, known as the "Christy gadget" after Robert Christy, another student of Oppenheimer's,[110] was finalized in a meeting in Oppenheimer's office on February 28, 1945.[111]


In May 1945 an Interim Committee was created to advise and report on wartime and postwar policies regarding the use of nuclear energy. The Interim Committee in turn established a scientific panel consisting of Arthur Compton, Fermi, Lawrence and Oppenheimer to advise it on scientific issues. In its presentation to the Interim Committee, the scientific panel offered its opinion not just on the likely physical effects of an atomic bomb, but on its likely military and political impact.[112] This included opinions on such sensitive issues as whether or not the Soviet Union should be advised of the weapon in advance of its use against Japan.[113] 

Trinity 

The joint work of the scientists at Los Alamos resulted in the world's first nuclear explosion, near Alamogordo, New Mexico, on July 16, 1945. Oppenheimer had given the site the codename "Trinity" in mid-1944 and said later that it was from one of John Donne's Holy Sonnets. According to the historian Gregg Herken, this naming could have been an allusion to Jean Tatlock, who had committed suicide a few months previously and had in the 1930s introduced Oppenheimer to Donne's work.[115]


Oppenheimer later recalled that, while witnessing the explosion, he thought of a verse from the Bhagavad Gita (XI,12):

divi sūrya-sahasrasya bhaved yugapad utthitā yadi bhāḥ sadṛṥī sā syād bhāsas tasya mahātmanaḥ[116] 

If the radiance of a thousand suns were to burst at once into the sky, that would be like the splendor of the mighty one ...[5][117] 

Years later he would explain that another verse had also entered his head at that time: namely, the famous verse: "kālo'smi lokakṣayakṛtpravṛddho lokānsamāhartumiha pravṛttaḥ" (XI,32),[118] which he translated as "I am become Death, the destroyer of worlds."[note 2]


In 1965, when he was persuaded to quote again for a television broadcast, he said: 

We knew the world would not be the same. A few people laughed, a few people cried. Most people were silent. I remembered the line from the Hindu scripture, the Bhagavad Gita; Vishnu is trying to persuade the Prince that he should do his duty and, to impress him, takes on his multi-armed form and says, 'Now I am become Death, the destroyer of worlds.' I suppose we all thought that, one way or another.[3] 

Among those present with Oppenheimer in the control bunker at the site were his brother Frank and Brigadier General Thomas Farrell. When Jeremy Bernstein asked Frank what Robert's first words after the test had been, the answer was "I guess it worked."[119] Farrell summarized Robert's reaction as follows: 

Dr. Oppenheimer, on whom had rested a very heavy burden, grew tenser as the last seconds ticked off. He scarcely breathed. He held on to a post to steady himself. For the last few seconds, he stared directly ahead and then when the announcer shouted "Now!" and there came this tremendous burst of light followed shortly thereafter by the deep growling roar of the explosion, his face relaxed into an expression of tremendous relief.[120] 

Physicist Isidor Rabi noticed Oppenheimer's disconcerting triumphalism: "I'll never forget his walk; I'll never forget the way he stepped out of the car ... his walk was like High Noon ... this kind of strut. He had done it."[121] At an assembly at Los Alamos on August 6 (the evening of the atomic bombing of Hiroshima), Oppenheimer took to the stage and clasped his hands together "like a prize-winning boxer" while the crowd cheered. He noted his regret that the weapon had not been available in time to use against Nazi Germany.[122] However, he and many of the project staff were very upset about the bombing of Nagasaki, as they did not feel the second bomb was necessary from a military point of view.[123] He traveled to Washington on August 17 to hand-deliver a letter to Secretary of War Henry L. Stimson expressing his revulsion and his wish to see nuclear weapons banned.[124] In October 1945 Oppenheimer was granted an interview with President Harry S. Truman. The meeting, however, went badly, after Oppenheimer remarked that he felt he had "blood on my hands". The remark infuriated Truman and put an end to the meeting. Truman later told his Undersecretary of State Dean Acheson, "I don't want to see that son-of-a-bitch in this office ever again."[125]


For his services as director of Los Alamos, Oppenheimer was awarded the Medal for Merit from President Harry S. Truman in 1946.[126] 

Final years and death 

The frontiers of science are separated now by long years of study, by specialized vocabularies, arts, techniques, and knowledge from the common heritage even of a most civilized society; and anyone working at the frontier of such science is in that sense a very long way from home, a long way too from the practical arts that were its matrix and origin, as indeed they were of what we today call art.


Robert Oppenheimer, "Prospects in the Arts and Sciences" in Man's Right to Knowledge[220] 

Starting in 1954, Oppenheimer lived for several months of the year on the island of Saint John in the U.S. Virgin Islands. In 1957, he purchased a 2-acre (0.81 ha) tract of land on Gibney Beach, where he built a spartan home on the beach.[221] He spent a considerable amount of time sailing with his daughter Toni and wife Kitty.[222]


Oppenheimer's first public appearance following the stripping of his security clearance was a lecture titled "Prospects in the Arts and Sciences" for the Columbia University Bicentennial radio show Man's Right to Knowledge, in which he outlined his philosophy and his thoughts on the role of science in the modern world.[223][224] He had been selected for the final episode of the lecture series two years prior to the security hearing, though the university remained adamant that he stay on even after the controversy.[225]


In February 1955, the president of the University of Washington, Henry Schmitz, abruptly cancelled an invitation to Oppenheimer to deliver a series of lectures there. Schmitz's decision caused an uproar among the students; 1,200 of them signed a petition protesting the decision, and Schmitz was burned in effigy. While they marched in protest, the state of Washington outlawed the Communist Party, and required all government employees to swear a loyalty oath. Edwin Albrecht Uehling, the chairman of the physics department and a colleague of Oppenheimer's from Berkeley, appealed to the university senate, and Schmitz's decision was overturned by a vote of 56-40. Oppenheimer stopped briefly in Seattle to change planes on a trip to Oregon, and was joined for coffee during his layover by several University of Washington faculty, but Oppenheimer never lectured there.[226][227] 

Oppenheimer was increasingly concerned about the potential danger that scientific inventions could pose to humanity. He joined with Albert Einstein, Bertrand Russell, Joseph Rotblat and other eminent scientists and academics to establish what would eventually, in 1960, become the World Academy of Art and Science. Significantly, after his public humiliation, he did not sign the major open protests against nuclear weapons of the 1950s, including the Russell–Einstein Manifesto of 1955, nor, though invited, did he attend the first Pugwash Conferences on Science and World Affairs in 1957.[228]


In his speeches and public writings, Oppenheimer continually stressed the difficulty of managing the power of knowledge in a world in which the freedom of science to exchange ideas was more and more hobbled by political concerns. Oppenheimer delivered the Reith Lectures on the BBC in 1953, which were subsequently published as Science and the Common Understanding.[229] In 1955 Oppenheimer published The Open Mind, a collection of eight lectures that he had given since 1946 on the subject of nuclear weapons and popular culture. Oppenheimer rejected the idea of nuclear gunboat diplomacy. "The purposes of this country in the field of foreign policy", he wrote, "cannot in any real or enduring way be achieved by coercion". In 1957 the philosophy and psychology departments at Harvard invited Oppenheimer to deliver the William James Lectures. An influential group of Harvard alumni led by Edwin Ginn that included Archibald Roosevelt protested against the decision.[230] Some 1,200 people packed into Sanders Theatre to hear Oppenheimer's six lectures, entitled "The Hope of Order".[228] Oppenheimer delivered the Whidden Lectures at McMaster University in 1962, and these were published in 1964 as The Flying Trapeze: Three Crises for Physicists.[231]


Deprived of political power, Oppenheimer continued to lecture, write and work on physics. He toured Europe and Japan, giving talks about the history of science, the role of science in society, and the nature of the universe.[232] In September 1957, France made him an Officer of the Legion of Honor,[233] and on May 3, 1962, he was elected a Foreign Member of the Royal Society in Britain.[234][235] At the urging of many of Oppenheimer's political friends who had ascended to power, President John F. Kennedy awarded Oppenheimer the Enrico Fermi Award in 1963 as a gesture of political rehabilitation. Edward Teller, the winner of the previous year's award, had also recommended Oppenheimer receive it, in the hope that it would heal the rift between them.[236] A little over a week after Kennedy's assassination, his successor, President Lyndon Johnson, presented Oppenheimer with the award, "for contributions to theoretical physics as a teacher and originator of ideas, and for leadership of the Los Alamos Laboratory and the atomic energy program during critical years".[237] Oppenheimer told Johnson: "I think it is just possible, Mr. President, that it has taken some charity and some courage for you to make this award today."[238]


The rehabilitation implied by the award was partly symbolic, as Oppenheimer still lacked a security clearance and could have no effect on official policy, but the award came with a $50,000 tax-free stipend, and its award outraged many prominent Republicans in Congress. The late President Kennedy's widow Jacqueline, still living in the White House, made it a point to meet with Oppenheimer to tell him how much her husband had wanted him to have the medal.[239] While still a senator in 1959, Kennedy had been instrumental in voting to narrowly deny Oppenheimer's enemy Lewis Strauss a coveted government position as Secretary of Commerce, effectively ending Strauss's political career. This was partly due to lobbying by the scientific community on behalf of Oppenheimer.[240] 

Oppenheimer was a chain smoker who was diagnosed with throat cancer in late 1965. After inconclusive surgery, he underwent unsuccessful radiation treatment and chemotherapy late in 1966.[241] He fell into a coma on February 15, 1967, and died at his home in Princeton, New Jersey, on February 18, aged 62. A memorial service was held a week later at Alexander Hall on the campus of Princeton University. The service was attended by 600 of his scientific, political and military associates that included Bethe, Groves, Kennan, Lilienthal, Rabi, Smyth and Wigner. His brother Frank and the rest of his family were also there, as was the historian Arthur M. Schlesinger, Jr., the novelist John O'Hara, and George Balanchine, the director of the New York City Ballet. Bethe, Kennan and Smyth gave brief eulogies.[242] Oppenheimer's body was cremated and his ashes were placed into an urn. His wife Kitty took the ashes to St. John and dropped the urn into the sea, within sight of the beach house.[243]


In October 1972, Kitty died aged 62 from an intestinal infection that was complicated by a pulmonary embolism. Oppenheimer's ranch in New Mexico was then inherited by their son Peter, and the beach property was inherited by their daughter Katherine "Toni" Oppenheimer Silber. Toni was refused security clearance for her chosen vocation as a United Nations translator after the FBI brought up the old charges against her father. In January 1977 (three months after the end of her second marriage), she committed suicide aged 32; her ex-husband found her hanging from a beam in her family beach house.[244] She left the property to "the people of St. John for a public park and recreation area".[245] The original house was built too close to the coast and succumbed to a hurricane. Today the Virgin Islands Government maintains a Community Center in the area.[246]

Alas for Darwinism: the fossil record's gonna fossil record. II

 Fossil Friday: The Complex Wing Folding of Earwigs 

Günter Bechly 

Today’s featured fossil, an earwig, is the paratype specimen of Cratoborellia gorbi, which I found and photographed in a German trader’s collection in July 2006. There I also discovered the holotype, which is deposited in the collection of the Stuttgart Natural History Museum and was described by my fellow student Fabian Haas (Haas 2007). The fossil belongs to the living earwig family Anisolabididae and is three-dimensionally preserved as iron oxide-hydroxide (goethite) in the Lower Cretaceous (115-million-year-old) laminated limestone of the Crato Formation of northeast Brazil. It is one of the very few fossil earwig specimens with spread hind wings, and it documents a pattern of wing folding very similar to that of its living relatives.


Lay people may hardly be aware that many earwigs do have wings and can fly, since they only rarely do so. Not only do they possess wings, but those wings also show very sophisticated adaptations in their construction. Just like beetles, earwigs have hardened forewings that serve as protective flaps (elytra), while the hind wings fold in a complex way beneath the forewings (they even use their pincers to assist in folding the wings). 

Another Example of Convergence 

This is another example of striking convergence in the animal kingdom. These convergent adaptations can be traced back to the earliest known putative stem earwigs (Protelytroptera) from the Permian period, about 299–252 million years ago (Haas & Kukalová-Peck 2001, Bethoux et al. 2016). Earwig wings not only fold like a fan in the longitudinal direction but also fold along a row of flexible patches in the transverse direction (Haas et al. 2000). This kind of natural origami is stunning, and it is beautifully illustrated in a YouTube video from ETH Zurich (below), where researchers copied this design principle for biomimetic technology that could be used for foldable solar sails in space. 
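To put rough numbers on that origami, here is a toy sketch of my own (it is not a calculation from Haas's papers, and the pleat and fold counts are illustrative assumptions) showing how the two folding directions multiply the packing ratio:

# Toy model of earwig-style wing packing -- illustrative numbers only.
# A longitudinal fan fold with n pleats stacks the wing n layers deep,
# shrinking its footprint n-fold; each transverse fold across the
# flexible patches then halves the remaining length.

def packing_ratio(fan_pleats: int, transverse_folds: int) -> int:
    """Unfolded wing area divided by folded footprint, for ideal flat folds."""
    return fan_pleats * (2 ** transverse_folds)

# E.g., five fan pleats plus one transverse fold pack the wing into one
# tenth of its spread area -- the order of magnitude often cited for earwigs.
print(packing_ratio(fan_pleats=5, transverse_folds=1))  # -> 10

The point of the sketch is simply that combining the two fold directions multiplies their effects, which is why this mechanism achieves a far tighter packing than a fan fold alone.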

This highly complex mode of wing folding is one of the many examples of engineering marvels in insects that strongly suggest intelligent design as a superior explanation to blind evolution. 

References 

Bethoux O, Llamosi A & Toussaint S 2016. Reinvestigation of Protelytron permianum (Insecta; Early Permian; USA) as an example for applying reflectance transformation imaging to insect imprint fossils. Fossil Record 20, 1–7. DOI: https://doi.org/10.5194/fr-20-1-2016.

Haas F 2007. Dermaptera: earwigs. Chapter 11.6, pp. 222–234 in: Martill DM, Bechly G & Loveridge RF (eds). The Crato Fossil Beds of Brazil. Cambridge University Press, Cambridge (UK), xvi+625 pp.

Haas F, Gorb SN & Wootton RJ 2000. Elastic joints in dermapteran hind wings: materials and wing folding. Arthropod Structure and Development 29(2), 137–146. DOI: https://doi.org/10.1016/S1467-8039(00)00025-6.

Haas F & Kukalová-Peck J 2001. Dermaptera hindwing structure and folding, new evidence for superordinal relationship within Neoptera (Insecta). European Journal of Entomology 98(4), 445–509. DOI: https://doi.org/10.14411/eje.2001.065.



Whither the bright line between artificial and natural causation?

More Unnatural Naturalism, and More Confusion from Naturalists 

David Coppedge 

Yesterday I commented on the conundrums created for evolutionists by engineering. Once you start looking, you’ll frequently see the problem that natural versus unnatural causes poses for naturalists. Writing in City Journal, for example, science reporter Nicholas Wade assumed that “natural” causes could be distinguished from “manipulated” actions in the case of the origin of SARS-CoV-2: 

Two hypotheses have long been on the table. One is that the virus jumped naturally from some animal host, as many epidemics have done in the past. The other is that it escaped from a lab in Wuhan, where researchers are known to have been genetically manipulating bat viruses in order to predict future epidemics. Both hypotheses are plausible but, so far, no direct evidence exists for either. 

News from CORDIS via Phys.org again illustrates the distinction between natural activities of humans and their intentional, purposeful designs. The article, “When did humans start using roads?”, says this: 

But when did humans actually begin to use roads? “The generic and honest answer is that it’s really hard to know,” says Kalayci. “First, we have to be very clear in our mind what we mean by ‘road’ — are we talking about an engineered road, or a simple dirt track that has naturally formed by people and/or animals constantly walking along the same line?”


In the case of the latter, one can argue, rather philosophically, that as soon as humans learnt to walk and began to traverse the world from their African homelands, roads began to form — in short, a road can be conceived as merely a line that humans continuously wander along.


But Kalayci informs us that it was probably the ancient Egyptians that purposely went out of their way to build the first paved roads, when they were busy building pyramids and other monuments, sometime between 2600 and 2200 BCE, during the Old Kingdom Period. “They essentially wanted a nice, easy, straight route between the monument site and quarry that allowed materials to be transported quickly and efficiently,” he explains. 

Hikers know that animals like bighorn sheep consistently re-use paths in their natural habitats. This quote, though, shows something different about humans. They “purposely” sometimes go “out of their way” to build monuments that are not essential to mere survival, and think about ways to move materials “quickly and efficiently.” They employ mathematics to build geometric objects for purposes that they believe transcend physical existence. 

“Natural” Organisms Are Oblivious to Human Design 

"If the art of ship-building were in the wood,” Aristotle recognized, “ships would exist by nature.” We humans know the intelligent causation, foresight, and intentionality required to build a floating craft able to carry cargo that left to its natural state would sink to the bottom of the sea. Flotsam can drift by nature, but something other than nature is required to design something capable of navigating a chosen course against natural wind and waves, using manufactured sails and oars. 


Ships can, however, sink “by nature” (e.g., due to storms, accidents, entropy). There are now “millions of shipwrecks in the world’s oceans, each providing a potentially new habitat for sea life,” states a news item at Frontiers in Science. The bacteria and fish that find habitats in shipwrecks don’t care. They treat them like other “natural” habitats. Only humans know or care. 

Wooden shipwrecks provide microbial habitats similar to naturally occurring geological seabed structures, reports a new study in Frontiers in Marine Science…. Microbes are at the base of ocean food chains, and this is among the first research to show the impact of human activities–like shipwrecks–on these environments.


“Microbial communities are important to be aware of and understand because they provide early and clear evidence of how human activities change life in the ocean,” said corresponding author Dr Leila Hamdan of the University of Southern Mississippi.


“Ocean scientists have known that natural hard habitats, some of which have been present for hundreds to thousands of years shape the biodiversity of life on the seafloor. This work is the first to show that built habitats (places or things made or modified by humans) impact the films of microbes (biofilms) coating these surfaces as well. These biofilms are ultimately what enable hard habitats to transform into islands of biodiversity.” 

Is Animal Engineering the Same as Human Engineering? 

To round out this discussion of natural versus unnatural causes, we need to investigate how reporters treat cases of animal engineering. For example, the journal Nature discussed “how bees achieve an engineering marvel: the honeycomb.” In a similar vein, news from Texas A&M tells about research “Determining how and why cells make decisions.” Isn’t decision-making a mental, purposeful activity? Isn’t engineering a honeycomb an example of intentional work for a purpose?


Well, yes and no. The answers can be elucidated with another question: is there a distinction between a software programmer and the program he or she designed? Honeybees and cells have a limited set of options that are programmed into their genomes. It could be considered “unnatural” for a honeybee to gather ingredients and build hexagons in which the queen’s eggs can be nourished. Rock and soil would never do that. The bee must apply directed work against entropy to pull it off. The cells in an embryo “make decisions” based on pre-programmed responses to signals. These can be considered “natural” activities in the same way a robot on a car assembly line is performing the “natural” function it was designed to do. 
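The programmer/program distinction can be made concrete in a few lines of code (a sketch of my own for illustration; the signal and response names are hypothetical, not taken from the studies cited). The program below "makes decisions," but only by looking up responses fixed in advance by its designer:

# A "deciding" program: it maps incoming signals to responses via a
# fixed, pre-programmed table. The program never chose this table --
# its programmer did, before the program ever ran.

RESPONSES = {
    "nutrient_gradient": "migrate",       # hypothetical signal -> response pairs
    "growth_factor": "divide",
    "damage_signal": "self_destruct",
}

def cell_decision(signal: str) -> str:
    """Return the pre-set response; a 'decision' only in a limited sense."""
    return RESPONSES.get(signal, "do_nothing")

print(cell_decision("growth_factor"))  # -> divide

On this analogy, the honeybee or embryonic cell corresponds to the lookup at run time, while any foresight resides in whatever fixed the table beforehand.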


Human beings, by contrast, have free will to think, decide, and design things that may have no survival function at all, such as art and literature. As C. S. Lewis said: 

The Naturalists have been engaged in thinking about Nature. They have not attended to the fact that they were thinking. The moment one attends to this it is obvious that one’s own thinking cannot be merely a natural event, and that therefore something other than Nature exists.  

We can decide to do something, or decide not to do it. We can choose between limitless options. Thoughts are what make human beings unnatural. Thoughts are what make us exceptional. 

 

Thursday 10 November 2022

Darwinism is not settled science?

 There Is No Settled “Theory of Evolution”

Cornelius Hunter 

What is evolution? The origin of species by natural selection, random causes, common descent, gradualism, etc. Right?


Wrong. Too often that is what is taught, but it is false. That’s according to evolutionists themselves. A typical example? See “The study of evolution is fracturing — and that may be a good thing,” by Lund University biologist Erik Svensson, writing at The Conversation.


Evolutionists themselves can forfeit natural selection, random causes, common descent, etc. How do I know? Because it is in the literature. 


So, what is evolution? In other words, what is core to the theory — and not forfeitable? It’s naturalism. Period. That is the only thing required of evolutionary theory. And naturalism is a religious requirement, not a scientific one.


Aside from naturalism, practically anything is fair game: Uncanny convergence, rapid divergence, lineage-specific biology, evolution of evolution, directed mutations, saltationism, unlikely simultaneous mutations, just-so stories, multiverses … the list goes on.


But this is where it gets interesting. Because if you have two theories, you don’t have one theory. In other words, you have a multitude of contradictory theories. And you have heated debates because nothing seems to fit the data. In science, that is not a good sign. But it is exactly what evolutionists have had — for over a century now.


There is no such thing as a settled theory of evolution. On that point, textbook orthodoxy is simply false. 


"Reputation management" and other euphemisms.

 Respectability is for sale. Here is a buyer’s guide. Names are omitted to protect the guilty from blushes and us from lawsuits 



PICTURE yourself as a big shot from an unpopular country—leader of an oil-rich bit of the Middle East, say, or a tycoon from a grungy bit of the former Communist world. You wish your family could shop, invest, socialise and study in the richest and nicest parts of the world (and flee there if needs be). But you don't deserve it and won't earn it: you will not stop torture, allow criticism, obey the law, or keep your fingers out of the public purse.


Luckily, respectability is on sale. You just have to know how to buy it. The place to start is London. Among its advantages are strict libel laws, which mean nosy journalists risk long, costly legal battles. And helpful banks, law firms, accountants and public relations people abound.


Laws on money-laundering have irritating requirements about scrutiny of new customers. This used to be merely an exercise in ticking boxes, but has got a bit tougher. Still, a well-connected and unscrupulous banker will be your best friend, for a fee. You cut him in on some lucrative transactions with your country or company. In return he will pilot you through the first stages, arming you with a lawyer (to scare rivals and critics) and an accountant (to keep your books opaque but legal). 

Next comes a virtuous circle of socialising and do-gooding. Start with the cash-strapped upper reaches of the cultural world: a big art gallery, an opera house, or something to do with young musicians. Donations there will get you known and liked. Or try funding a prize at UNESCO or some other international do-gooding outfit. Support causes involving war veterans or sick children. Sponsoring sport works too. But don't overdo it—the public is wiser than the glitterati, and will soon scent a crude attempt to buy popularity.


Send your children to posh English schools. Shower hospitality on their friends: they will be important one day. But invite the parents too: they are influential now. A discreet payment will tempt hard-up celebrities to come to your parties. Minor royals are an even bigger draw: British for choice, but continental will do. Even sensible people go weak at the knees at the thought of meeting a princeling, however charmless or dim-witted.


Many such titled folk like a lavish lifestyle but cannot earn or afford it. So offer a deal: you pay for their helicopters, hookers and hangers-on. In return, they bring you into their social circuit, and shower stardust on yours. You will need patience: the parties are dull and the guests vapid and greedy. Building your reputation as a charming and generous host may take a couple of years. But once people have met you socially they will find it hard to see you as a murderous monster or thieving thug. Useful props in this game are yachts, private jets, racehorses, ski chalets and mansions. 

Armed with social and cultural clout, you can approach money-hungry academia and think-tanks. A good combination is a Washington, DC, think-tank and a London-based university (Oxford and Cambridge, being richer, are also choosier about whom they take money from). The package deal should involve a centre (perhaps with a professorial chair) and a suitable title: it should include words like global, sustainable, strategic and ethical.


I stink, you think


On the subject of titles, expect an honorary doctorate for yourself and a PhD for your favourite young relative. This need not be an onerous undertaking. A lobbying firm can help with the research. Think-tanks' flimsier finances make them easy prey too—and they are more immediately influential than universities. Most of their experts are expected to raise all their own funds. A few million here or there is chicken feed for you but a career-saver for them and their programmes.


Sponsorship does not just make you look brainy and public spirited. It also skews the academic debate. If you are a pious Muslim, let it be known that a focus on uncontroversial subjects such as Islamic architecture, calligraphy and poetry will keep the money coming. Textual criticism of the mutually contradictory early versions of the Koran, by contrast, is a no-no. If you are from Russia, support cheerleaders for the “reset” in relations with America and pay for people to decry former Soviet satellites as irrelevant basket cases. If you are in oil or gas, pay for studies criticising the disruptive exercise of competition law on energy suppliers.


Then move on to the media. Generous advertising in the mainstream print dailies is a good way to make friends. Nobody will read the lavish supplements that trumpet your imaginary virtues and conceal your real flaws. But the newspaper's managers will be happy. It may be too much to expect them to get the journalists to tweak their coverage (though that can happen) but you will find it easier to put your point across. Sumptuous fact-finding trips are an easy way of making hacks' heads softer and hearts warmer. You can also hold conferences, with high fees for journalists who moderate sessions or sit on the panels. They will soon get the idea.


You are now in a position to approach politics. Most rich countries make it hard (or illegal) for foreigners to give money to politicians or parties. But you can oil the wheels. A non-executive directorship can be a mind-changing experience. Invite retired politicians and officials for lucrative speaking engagements and consultancy work: word will soon get around and the soon-to-retire will bear your interests in mind. Even better, set up an advisory council stuffed with influential foreigners. You need tell them nothing about what you do. Nor do you have to heed their advice.


Foreign respectability also makes you look good in the eyes of your own people. And it demoralises your critics, crushing their belief that Western media, politics, academia and public life are to be admired.


Your progress from villain to hero will not always go smoothly, especially if you have to start killing your opponents. But when the alarm is raised, your allies will rally to your defence. A tame academic can write an opinion piece; a newspaper grateful for your advertising will publish it. Your fans can always say that someone else is much worse and that you are at least a reforming, if not fully reformed, character. A few references to American robber-barons such as John Pierpont Morgan will bolster the case. So too will a gibe at less-than-perfect Western leaders such as Silvio Berlusconi. After all, nobody likes hypocrisy. 

Science is downstream from the design inference?

The Relevance of Intelligent Design to Science and Society: A Primer 
Evolution News @DiscoveryCSC

This past summer, the Italian Center for Intelligent Design held its public launch at a conference in Turin, Italy. Following that event, Discovery Institute Vice President John West was interviewed by veteran Italian journalist and human rights activist Marco Respinti. The interview is being published this month in the Italian-language magazine Il Timone. Evolution News is pleased to publish the original English-language version of the interview, which discusses the history, impact, and relevance of the idea of intelligent design.

Dr. West is Managing Director of Discovery Institute’s Center for Science & Culture and author of the book Darwin Day in America: How Our Politics and Culture Have Been Dehumanized in the Name of Science. He is also editor of The Magician’s Twin: C.S. Lewis on Science, Scientism, and Society. 
            
RESPINTI: What is the “intelligent design” (ID) hypothesis? 
    WEST: Intelligent design is the idea that nature manifests clear evidence of purpose, planning, and foresight. In other words, nature reflects the brilliance of a master artist, not the haphazard results of an unguided process. 

RESPINTI: How does Darwinian evolution differ from intelligent design?
    WEST: Darwinian evolution sees nature — including human beings — as accidental byproducts of unintelligent matter and energy. According to Darwinism, “man is the result of a purposeless and natural process that did not have him in mind,” to quote the words of evolutionary biologist George Gaylord Simpson. In other words, nature is the result of an unguided process, not the creative activity of a master designer.  

RESPINTI: Why do debates over Darwinism and intelligent design matter to society? 
     WEST: In the Darwinian view, nature was created by blind unguided forces rather than a wise Creator, and humans are merely animals who are the unintended result of a process of “survival of the fittest.” Over the past century, this bleak view of nature and humanity has encouraged many abuses, including the denial of God’s existence, “scientific” justifications of racism, and efforts to breed humans like cattle through the so-called science of eugenics. The Darwinian view has promoted despair in many people, including young people, by portraying human life as an accident with no intrinsic dignity and no higher purpose.

By contrast, the intelligent design view upholds human beings as inherently valuable. Our lives have meaning and worth because we are the intentional result of a supreme artist and Creator. Humans are a masterpiece, not something cobbled together by an unguided process. In the words of former Pope Benedict, “We are not some casual and meaningless product of evolution. Each of us is the result of a thought of God. Each of us is willed, each of us is loved, each of us is necessary.” 

RESPINTI: What are the origins of the intelligent design idea? 
   WEST: Intelligent design is one of the foundational ideas in the history of human civilization. It has deep roots in the Jewish and Christian traditions as well as among non-Christian thinkers. In the Jewish tradition, both the Psalms and the Book of Wisdom speak of how nature reveals evidence of its Creator. In the words of Wisdom 13:5, “from the greatness and beauty of created things comes a corresponding perception of their Creator.” In the Christian tradition, Jesus, St. Paul, and the fathers of the church likewise argued that nature provides evidence of God’s wisdom, foresight, and artistry. For example, Theophilus, Bishop of Antioch in the second century AD, argued that God “is beheld and perceived through His… works,” which for him included the regularities of nature seen in astronomy, the plant world, animals, and ecosystems. 

Among non-Christian thinkers, we find a similar idea that nature displays evidence of purpose and foresight in Greek philosophers such as Plato, Roman thinkers such as Cicero, and medieval Islamic thinkers such as Al-Ghazali.  

RESPINTI: Did intelligent design play any role in the historical development of science? 
    WEST: Definitely. The idea of intelligent design provided a foundation for modern natural science. Because early scientists thought nature was the product of intelligent design, they expected nature to be orderly, purposeful, governed by laws rather than chaos, and understandable through human reason. These scientists’ belief in intelligent design spurred them to research the natural world.  

RESPINTI: That was in the past. What about today? Does intelligent design still play a role in science? 
WEST: Yes! Even today, scientific investigation proceeds because scientists assume for the sake of their research that the natural features they are studying are orderly and exist to fulfill a specific purpose. This is the essence of much scientific investigation — we treat things as designed so we can understand them. The reality is that intelligent design is a guiding assumption for scientific research even for scientists who claim not to believe in it. 

RESPINTI: What light do recent scientific discoveries shed on whether nature was intelligently designed? 
   WEST: The more we investigate nature, the more we see layer after layer of purpose and planning throughout nature. The laws of physics and chemistry are exquisitely fine-tuned to make life possible. Inside each of our cells, there exist sophisticated “molecular machines” that make human technologies appear primitive. At the foundation of life, we find DNA, which functions as a code directing many aspects of an organism’s development, just like computer software. Codes and information systems are hallmarks of mind — of intelligent design. Based on what we now know, it is very hard to conceive of the operations of nature without viewing them as products of intelligent design. It is little wonder that a Nobel Prize-winning physicist from Cambridge University recently declared that “intelligent design is valid science.” 

RESPINTI: What do you say to those who claim that Darwinian evolution has refuted the idea of intelligent design? 
    WEST: The evidence shows otherwise. First, Darwinism assumes that a universe fine-tuned for life already exists. It also assumes that the first self-replicating organisms already exist. So Darwinism can’t refute the evidence of design at the level of the universe or in the origin of the first life. It assumes those very things! Now Darwinism does claim that unguided processes can produce everything else. But we have a lot of data from experiments in bacteria that show just how little change unguided evolution can accomplish. Darwinian processes can produce small variations, but the major changes in the history of life — such as the origin of new body plans in animals — seem beyond the power of unguided evolution. Random mutations in DNA are supposed to drive Darwinian evolution, but we have learned that such mutations are usually either harmful or neutral to organisms. Mutations aren’t capable of producing major new biological features. Biochemist Michael Behe, molecular biologist Douglas Axe, and many other scientists have shown this. 

RESPINTI: Why, then, do so many scientists continue to embrace Darwinian evolution? 
     WEST: I think it is primarily due to culture, not science. The distinguished Italian geneticist Giuseppe Sermonti once called Darwinism “the ‘politically correct’ of science.” I think he was right. Many people continue to embrace Darwinism because it is fashionable. Others support it because they think it provides a scientific justification to reject God. 
    
RESPINTI: Where can you find scientists who support intelligent design? 
   WEST: Scientists and scientific groups that support intelligent design can now be found throughout Europe, in South America, in the Middle East, Asia, and Africa. In Italy, there is the Italian Center for Intelligent Design, which just held its public launch in June at a conference in Torino, where I had the privilege to speak.

In the United States, there is Discovery Institute’s Center for Science & Culture. The Institute is a non-profit organization founded in 1991, and its Center for Science & Culture was started in 1996 by historian of science Stephen Meyer and myself. The Center serves as a hub for the growing international network of scientists and scholars who think there is evidence of intelligent design in nature. The Center funds scientific research, sponsors educational programs, and produces books and educational videos related to intelligent design.  

Finally, representation for the most underserved community of all.

TONY DELUCA WINS REELECTION ...

Despite Death Last Month 

Longtime Pennsylvania state representative Anthony "Tony" DeLuca won in an Election Day landslide -- which has to sting for his opponent, because DeLuca's no longer living.


The late state rep. received 85% of the vote in Tuesday's midterm election -- despite dying in October after a battle with lymphoma.


The timing of DeLuca's death reportedly made it too late to pick a different Democrat candidate, or to reprint updated ballots.

The Pennsylvania House Democratic campaign committee addressed the issue online Tuesday evening -- saying a special election will come soon to fix the error -- but also thanked supporters, presumably for voting him in posthumously.


Many online are pointing to a lack of voter awareness as the reason DeLuca beat Green Party candidate Queonia Livingston ... with others speculating voters simply didn't want to vote for Livingston. 


The Rubicon: a brief history.

Rubicon 


By The Editors of Encyclopaedia Britannica 

Rubicon, Latin Rubico, or Rubicon, small stream that separated Cisalpine Gaul from Italy in the era of the Roman Republic. The movement of Julius Caesar’s forces over the Rubicon into Italy in 49 BC violated the law (the Lex Cornelia Majestatis) that forbade a general to lead an army out of the province to which he was assigned. His act thus amounted to a declaration of war against the Roman Senate and resulted in the three-year civil war that left Caesar ruler of the Roman world. “Crossing the Rubicon” became a popular phrase describing a step that definitely commits a person to a given course of action.


The modern Rubicone (formerly Fiumicino) River is officially identified with the Rubicon that Caesar crossed, but the Pisciatello River to the north and the Uso to the south have also been suggested. 

Wednesday 9 November 2022

Darwinism's conundrum: the objective distinction between natural and artificial selection.

 What’s “Natural”? Engineering Creates a Conundrum for Evolutionists

David Coppedge 

Are humans natural? Writing in PNAS, Juha Merilä comments on “Human-induced evolution of salmon by means of unnatural selection.” According to his bio page at Research Portal, Dr. Merilä is Professor in the Organismal and Evolutionary Biology Research Programme in the Ecological Genetics Research Unit of the University of Helsinki. He begins: 

By modifying environmental conditions, human activities are generating novel selection pressures, which pose challenges to wildlife. When faced with novel selection pressures, organismal populations can respond to this through evolutionary adaptation, modifying their phenotype through plastic changes, or evading these new pressures by migrating to more beneficial environments. Otherwise, they will face loss of fitness and eventually, even extirpation. Although alteration of natural environments by humans has been long recognized as a potential source of novel and strong selection pressures, demonstrating human-induced evolution has proven to be challenging. 

The conundrum is obvious here. He refers to “unnatural selection” and “alteration of natural environments by humans” that demonstrate “human-induced evolution” — but why should this be any different from what beavers do to their environments? Are beavers natural? Are their dams natural? Many animals disrupt their environments; are those cases natural?


I do not know Dr. Merilä’s stance on human origins, but it is a safe bet that he (as an evolutionary biologist) denies intelligent design, and believes humans evolved from other natural animals. If so, it is strange to call anything humans do “unnatural.” There seems to be a subconscious recognition in his writing that humans are exceptional, and culpable for damage they cause to “natural” environments. Otherwise, “human-induced evolution” is plain old evolution by natural — not unnatural — selection. 

The Human Impact on Salmon 

In his commentary, Merilä is reflecting on work by Jensen et al. in PNAS that illustrated the impact humans have had on “natural” Atlantic salmon populations. At the end of his article, Merilä waffles on the “natural” designation and confers the term “agency” on what salmon did in response — a word that overlaps with intelligent design. One could say that beavers are the “agents” of “beaver-induced evolutionary change,” but they had no choice in the matter since the drive to build dams is instinctive, built into their genetic nature. In the case of the salmon, the agency could have been indirect, but Merilä seems to suggest that humans could have, and should have, left the natural salmon alone.  

Whatever the selective agent behind the observed body size decline, the study by Jensen et al. provides compelling evidence that the size change is genetically based, driven by natural selection, and clearly associated with human interference with their environment. While concerns about undesirable consequences of human-induced genetic changes in natural fish populations were raised in 1950s, a lot of the early research on this topic failed to provide evidence that observed phenotypic shifts have a genetic basis and hence, represent evolutionary changes. With this in mind, one of the major contributions of the work of Jensen et al. is in providing hard evidence for human-induced evolutionary change. 

The confusion remains. Are humans acting naturally when they become “selective agents”? Are humans thus culpable for the “undesirable consequences” of genetic changes to “natural” fish? Who decides what is “undesirable”? Evolutionists believe that many unsavory consequences of evolutionary change, such as extinctions, occurred throughout natural history before humans emerged. The label “undesirable” is an ethically loaded word unique to humans. A “natural” beach stranding of whales might be undesirable for the whales, but highly “desirable” for bacteria.  

Still More Confusion 

The confusion is evident also in the Jensen paper. The last paragraph says, 

Our study provides unequivocal evidence of unintentional human-induced evolution in a natural population, in which the species managed to adapt to the altered environment. River Eira once harbored some of the largest salmon in the world but has now evolved into an ordinary salmon population. This successful adaptation comes at a cost of reduced life-history diversity, and, potentially, reduced population stability and resilience to further environmental change. 

“Unintentional” is an interesting word here. Design advocates identify intentionality as a discriminator between chance and design. If humans had intended to make the fish evolve, and it cost the salmon some of their adaptability and resilience, would that have been unethical? If not, then whatever happened was “natural” and hardly worth worrying about.


We can tell, however, that Merilä had more on his mind, because he praised the Jensen team for documenting human-induced evolutionary change which had raised “concerns about undesirable consequences” back in the 1950s. Now there is hard evidence. The concerns were valid. Humans are guilty as charged. I’ll have more to say on this tomorrow.



File under"Well said." LXXXV

 "He who knows only his own side of the case knows little of that" 

          John Stuart Mill