
Monday 28 November 2022

On the tyranny of the expertocracy, past and present.

Eugenics Movement Presents Remarkable Historical Parallels with “Gender-Affirming Care”

David Klinghoffer 

Wesley Smith and Jay Richards had a great conversation for the Humanize podcast on “What Every Parent Should Know About Gender Ideology and Gender-Affirming Care.” Identifying a remarkable historical echo, Dr. Richards says something I hadn’t thought about. Today’s strange trans ideology with its cruel medical interventions, including surgical mutilation, to affirm subjective gender identity bears a strong resemblance to the eugenics movement. The latter is now recognized as a malevolent and abusive force; but like evolution-based pseudoscientific racism, it was hailed in its day as the best and most responsible science, cheered on by the mainstream media, public school teachers, and the government. All that is true of our contemporary transgender ideology. 


There’s more. Endorsed by prestige academic opinion, eugenics focused on surgical sterilization for the “unfit.” Similarly endorsed by prestige opinion, transgender ideology welcomes the surgical removal of genitalia, and even provides “eunuch” as a new possible trans identity. In the case of eugenics, sterilization was coerced, not a matter of personal preference. But as Richards also observes, pushing trans theory on vulnerable young children, molding their brains before they’ve reached the age of consent, is hardly giving them a free choice in how they think of gender. In a final parallel, it was religious people who were foremost in opposing the eugenicists and the pseudoscientific racists. John West makes this clear in his documentary Human Zoos (see it below). Today as well, many traditional religious perspectives resist the advances of trans activism. 


 

On the measurement problem.

Measurement problem

In quantum mechanics, the measurement problem is the problem of how, or whether, wave function collapse occurs. The inability to observe such a collapse directly has given rise to different interpretations of quantum mechanics and poses a key set of questions that each interpretation must answer.


The wave function in quantum mechanics evolves deterministically according to the Schrödinger equation as a linear superposition of different states. However, actual measurements always find the physical system in a definite state. Any future evolution of the wave function is based on the state the system was discovered to be in when the measurement was made, meaning that the measurement "did something" to the system that is not obviously a consequence of Schrödinger evolution. The measurement problem is that of describing what that "something" is: how a superposition of many possible values becomes a single measured value.
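As a minimal sketch of the tension, the deterministic evolution and the definite measurement outcome can be written side by side (standard notation; the particular observable and the basis labels here are only illustrative):

$$ i\hbar\,\frac{\partial}{\partial t}\,|\psi(t)\rangle = \hat{H}\,|\psi(t)\rangle, \qquad |\psi(t)\rangle = \sum_n c_n(t)\,|a_n\rangle $$

where the |a_n⟩ are eigenstates of the measured observable. The Schrödinger equation evolves all the coefficients c_n(t) linearly, yet a measurement yields exactly one eigenvalue a_n, with probability |c_n(t)|² by the Born rule; nothing in the linear evolution itself singles out that one term.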


To express matters differently (paraphrasing Steven Weinberg),[1][2] the Schrödinger wave equation determines the wave function at any later time from its value at an earlier time. If observers and their measuring apparatus are themselves described by a deterministic wave function, why can we not predict precise results for measurements, but only probabilities? As a general question: How can one establish a correspondence between quantum reality and classical reality?[3] 

Schrödinger's cat 

A thought experiment often used to illustrate the measurement problem is the "paradox" of Schrödinger's cat. A mechanism is arranged to kill a cat if a quantum event, such as the decay of a radioactive atom, occurs. Thus the fate of a large-scale object, the cat, is entangled with the fate of a quantum object, the atom. Prior to observation, according to the Schrödinger equation and numerous particle experiments, the atom is in a quantum superposition, a linear combination of decayed and undecayed states that evolves with time. Therefore the cat should also be in a superposition, a linear combination of states that can be characterized as an "alive cat" and states that can be characterized as a "dead cat". Each of these possibilities is associated with a specific nonzero probability amplitude. However, a single, particular observation of the cat does not find a superposition: it always finds either a living cat or a dead cat. After the measurement the cat is definitively alive or dead. The question is: How are the probabilities converted into an actual, well-defined classical outcome? 
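In symbols, the entangled atom–cat state before observation can be sketched as follows (the labels are illustrative shorthand for the relevant macroscopic states):

$$ |\Psi\rangle = \alpha\,|\text{undecayed}\rangle\,|\text{alive}\rangle + \beta\,|\text{decayed}\rangle\,|\text{dead}\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1 $$

The Born rule assigns probability |α|² to finding a living cat and |β|² to finding a dead one, but it does not say how, or at what point, a single observed outcome is selected from the superposition.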

Interpretations 

The views often grouped together as the Copenhagen interpretation are the oldest and, collectively, probably still the most widely held attitude about quantum mechanics.[4][5] N. David Mermin coined the phrase "Shut up and calculate!" to summarize Copenhagen-type views, a saying often misattributed to Richard Feynman and which Mermin later found insufficiently nuanced.[6][7]


Generally, views in the Copenhagen tradition posit something in the act of observation which results in the collapse of the wave function. This concept, though often attributed to Niels Bohr, was due to Werner Heisenberg, whose later writings obscured many disagreements he and Bohr had had during their collaboration and that the two never resolved.[8][9] In these schools of thought, wave functions may be regarded as statistical information about a quantum system, and wave function collapse is the updating of that information in response to new data.[10][11] Exactly how to understand this process remains a topic of dispute.[12]


Bohr offered an interpretation that is independent of a subjective observer, or measurement, or collapse; instead, an "irreversible" or effectively irreversible process causes the decay of quantum coherence, which imparts the classical behavior of "observation" or "measurement".[13][14][15][16] 

Hugh Everett's many-worlds interpretation attempts to solve the problem by suggesting that there is only one wave function, the superposition of the entire universe, and it never collapses—so there is no measurement problem. Instead, the act of measurement is simply an interaction between quantum entities, e.g. observer, measuring instrument, electron/positron etc., which entangle to form a single larger entity, for instance living cat/happy scientist. Everett also attempted to demonstrate how the probabilistic nature of quantum mechanics would appear in measurements, a work later extended by Bryce DeWitt. However, proponents of the Everettian program have not yet reached a consensus regarding the correct way to justify the use of the Born rule to calculate probabilities.[17][18]
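A schematic way to write Everett's picture of measurement as entanglement (with "ready" and "sees ↑/↓" standing in, purely illustratively, for states of the observer and apparatus):

$$ \bigl(\alpha\,|{\uparrow}\rangle + \beta\,|{\downarrow}\rangle\bigr)\,|\text{ready}\rangle \;\longrightarrow\; \alpha\,|{\uparrow}\rangle\,|\text{sees }{\uparrow}\rangle + \beta\,|{\downarrow}\rangle\,|\text{sees }{\downarrow}\rangle $$

No term is removed; on the Everettian reading each branch contains an observer who records a definite result, and the open question noted above is why the weights |α|² and |β|² should play the role of probabilities.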


De Broglie–Bohm theory tries to solve the measurement problem very differently: the information describing the system contains not only the wave function, but also supplementary data (a trajectory) giving the position of the particle(s). The role of the wave function is to generate the velocity field for the particles. These velocities are such that the probability distribution for the particle remains consistent with the predictions of orthodox quantum mechanics. According to de Broglie–Bohm theory, interaction with the environment during a measurement procedure separates the wave packets in configuration space, which is where apparent wave function collapse comes from, even though there is no actual collapse.[19] 
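For a single spinless particle, the standard guidance equation (not spelled out in the text above) makes "the wave function generates the velocity field" concrete; writing the wave function in polar form ψ = R e^{iS/ħ}:

$$ \frac{d\mathbf{x}}{dt} = \frac{\hbar}{m}\,\operatorname{Im}\!\left(\frac{\nabla\psi}{\psi}\right)\bigg|_{\mathbf{x}=\mathbf{x}(t)} = \frac{\nabla S}{m} $$

If the initial particle positions are distributed according to |ψ|², this flow keeps them so distributed at all later times (equivariance), which is why the theory reproduces the standard quantum statistics.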

A fourth approach is given by objective-collapse models. In such models, the Schrödinger equation is modified and obtains nonlinear terms. These nonlinear modifications are of stochastic nature and lead to a behaviour that for microscopic quantum objects, e.g. electrons or atoms, is unmeasurably close to that given by the usual Schrödinger equation. For macroscopic objects, however, the nonlinear modification becomes important and induces the collapse of the wave function. Objective-collapse models are effective theories. The stochastic modification is thought to stem from some external non-quantum field, but the nature of this field is unknown. One possible candidate is the gravitational interaction as in the models of Diósi and Penrose. The main difference of objective-collapse models compared to the other approaches is that they make falsifiable predictions that differ from standard quantum mechanics. Experiments are already getting close to the parameter regime where these predictions can be tested.[20]

The Ghirardi–Rimini–Weber (GRW) theory proposes that wave function collapse happens spontaneously as part of the dynamics. Particles have a non-zero probability of undergoing a "hit", or spontaneous collapse of the wave function, on the order of once every hundred million years.[21] Though collapse is extremely rare, the sheer number of particles in a measurement system means that the probability of a collapse occurring somewhere in the system is high. Since the entire measurement system is entangled (by quantum entanglement), the collapse of a single particle initiates the collapse of the entire measurement apparatus. Because the GRW theory makes different predictions from orthodox quantum mechanics in some conditions, it is not an interpretation of quantum mechanics in a strict sense. 
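A rough back-of-the-envelope calculation shows why such rare single-particle hits still collapse macroscopic superpositions quickly (the hit rate used here, about one per particle per 10^8 years, is just the order of magnitude quoted above, and the particle count is an illustrative figure for a macroscopic apparatus):

$$ \lambda \approx \frac{1}{10^{8}\ \text{yr}} \approx \frac{1}{3\times10^{15}\ \text{s}} \approx 3\times10^{-16}\ \text{s}^{-1}, \qquad N\lambda \approx 10^{23}\times 3\times10^{-16}\ \text{s}^{-1} \approx 3\times10^{7}\ \text{s}^{-1} $$

So for an apparatus containing on the order of 10^23 entangled particles, some particle is hit within a small fraction of a microsecond, and that single hit localizes the whole entangled system.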

The role of decoherence 

Erich Joos and Heinz-Dieter Zeh claim that the phenomenon of quantum decoherence, which was put on firm ground in the 1980s, resolves the problem.[22] The idea is that the environment causes the classical appearance of macroscopic objects. Zeh further claims that decoherence makes it possible to identify the fuzzy boundary between the quantum microworld and the world where classical intuition is applicable.[23][24] Quantum decoherence becomes an important part of some modern updates of the Copenhagen interpretation based on consistent histories.[25][26] Quantum decoherence does not describe the actual collapse of the wave function, but it does explain the conversion of quantum probabilities (which exhibit interference effects) into ordinary classical probabilities. See, for example, Zurek,[3] Zeh[23] and Schlosshauer.[27]
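A minimal sketch of what decoherence does and does not do, using the cat state from earlier entangled with environment states |E_alive⟩ and |E_dead⟩ (illustrative labels): tracing out the environment gives the reduced density matrix

$$ \rho_{\text{cat}} = |\alpha|^2\,|\text{alive}\rangle\langle\text{alive}| + |\beta|^2\,|\text{dead}\rangle\langle\text{dead}| + \alpha\beta^{*}\langle E_{\text{dead}}|E_{\text{alive}}\rangle\,|\text{alive}\rangle\langle\text{dead}| + \text{h.c.} $$

Because distinct environment states rapidly become nearly orthogonal, ⟨E_dead|E_alive⟩ → 0, the interference (off-diagonal) terms are suppressed and only the ordinary classical probabilities |α|² and |β|² remain; but the resulting mixture still contains both outcomes, which is why decoherence by itself does not select a single result.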


The present situation is slowly clarifying, as described in a 2006 article by Schlosshauer:[28]


Several decoherence-unrelated proposals have been put forward in the past to elucidate the meaning of probabilities and arrive at the Born rule ... It is fair to say that no decisive conclusion appears to have been reached as to the success of these derivations. ... 

As it is well known, [many papers by Bohr insist upon] the fundamental role of classical concepts. The experimental evidence for superpositions of macroscopically distinct states on increasingly large length scales counters such a dictum. Superpositions appear to be novel and individually existing states, often without any classical counterparts. Only the physical interactions between systems then determine a particular decomposition into classical states from the view of each particular system. Thus classical concepts are to be understood as locally emergent in a relative-state sense and should no longer claim a fundamental role in the physical theory. 

Further reading: R. Buniy, S. Hsu and A. Zee, "On the origin of probability in quantum mechanics" (2006)