
Wednesday 3 January 2024

The Ministry of Truth’s public enemy No. 1, David Berlinski, holds court.

 

Yet more on why "junk DNA" is junk no more.

 Casey Luskin On Junk DNA’s “Kuhnian Paradigm Shift”


Prevailing scientific assumptions often die hard, especially when they fit so neatly into an evolutionary view of the development of life on Earth. On a new episode of ID the Future, Dr. Casey Luskin gives an update on the paradigm shift around the concept of “junk DNA.”

Luskin explains that intelligent design theorists have long argued against the idea that non-protein-coding DNA is useless evolutionary junk, instead predicting that it serves important biological functions. Year after year for over a decade, new evidence has emerged revealing such functions and vindicating ID scientists. Luskin summarizes several recent papers that have found specific functions for non-coding DNA, such as regulating gene expression, controlling development, and influencing epigenetic processes. He then reports on the latest new evidence: the function of short tandem repeats (STRs), previously considered “junk DNA.” Luskin also discusses the work of molecular biologist John Mattick, who has written recently about the shift in thinking about “junk DNA.” Luskin suggests a new way of looking at non-protein-coding regions of DNA and concludes that, far from junk, these “highly compact information suites” are essential and serve a variety of important functions in the genome. Download the podcast or listen to it here.

Is technology more predictive of biology than physics?

 Paper Digest: Standard Engineering Principles as a Predictive Framework for Biology


In 2017, professor of engineering Gregory T. Reeves and engineer Curtis E. Hrischuk published an open-access paper in the Journal of Bioinformatics, Computational and Systems Biology titled “The Cell Embodies Standard Engineering Principles.” They explained how the cell fulfills different sets of standard engineering principles (SEPs). The paper builds on Reeves and Hrischuk’s earlier publication surveying engineering models for systems biology. Once more, the authors argue that engineering concepts can serve as a predictive and successful framework for biology.

Human designing and building have resulted in lists of standard engineering principles that must be followed to produce efficient, robust systems. These principles have been refined through countless engineering projects, and Reeves and Hrischuk demonstrate that these same SEPs are at work in biology. They are therefore useful to biologists as an expectation framework for anticipating how cellular systems behave:

The presence of engineering principles within the cell implies that SEPs can be used as starting point to formulate hypotheses about how a cell operates and behaves. In other words, we should pragmatically approach the cell as an engineered system and use that point of view to predict (hypothesize) the expected behavior of biological systems. We call this approach the Engineering Principle Expectation (EPE).

Several Categories of SEPs 

In the paper, several categories of SEPs are examined: general engineering principles (GEPs), hardware/software co-design principles (CDEPs), and robotic engineering principles (REPs). For each category, the authors give specific examples of how the cell conforms to the set of SEPs. The authors also develop a non-exhaustive list of SEPs for chemical process control engineering (CPCEP), since no such list was available.

The comparison between cellular systems and engineered systems has strong implications for intelligent design. The reality that cells abide by the same engineering principles discovered in human design is highly significant. This finding is much better predicted by the hypothesis that biological systems were intelligently designed than by the alternative theory that a blind neo-Darwinian process gave rise to living systems.

For the category of general engineering expectations, the authors go over three principles in the main text. GEP1 states that the “development of engineered objects follows a plan in accordance with quantitative requirements.” The authors point out that the development of molecular machinery requires careful orchestration, including but not limited to decision-making, gene expression, protein synthesis, post-translational modification, the assembly of multicomponent complexes, and life-cycle processes like cell division. Thus, cells embody GEP1.

GEP2 states that “requirements are ranked according to cost effectiveness, and the development plan, which has an incremental structure, emphasizes the higher-ranked requirements.” This principle describes hierarchy, which results from top-down design where components are constructed and resources expended in accordance with higher system goals. As a biological example, the authors note the prioritization of ATP in the cell.

GEP3 states that “standards are used where available and applicable with every departure from applicable standards explicitly justified.” Biological examples include conserved features shared by all types of cells, such as the genetic code, ATP, and a near-universal central metabolism. Amino acids, nucleic acids, and some lipids might all be thought of as cellular standards from which deviations rarely occur.

Hardware/Software Co-Design Principles

Next the authors discuss hardware/software co-design principles, starting with CDEP1. This is the principle of “partitioning the function to be implemented into small interacting pieces.” In the cell, regulatory networks can be decomposed into autonomously acting modules that cooperate to accomplish a function. Even the basics of cellular physiology, where unique macromolecular structures such as chromosomes, membranes, and ribosomes exist, imply a partitioning of function into small interacting pieces. Thus, cells abound with examples of autonomous players carrying out a specific role toward a greater purpose.

CDEP2 is the principle of “allocating those partitions to microprocessors or other hardware units, where the function may be implemented directly in hardware or in software running on a microprocessor.” This principle underlies the benefit of having a separate processor for each function. In computer systems, manufacturing constraints make this impossible, but the authors point out that the cell realizes the ideal: each protein or complex operates independently as a unique unit of hardware.
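To make CDEP1 concrete, here is a minimal Python sketch of one function partitioned into small, autonomously acting modules. The scenario and names (Sensor, Controller, Effector, a nutrient-sensing pathway) are illustrative assumptions loosely patterned on bacterial gene regulation, not examples from the paper:

```python
# A toy sketch of CDEP1-style partitioning (illustrative only; the module
# names and the nutrient-sensing scenario are assumptions, not examples
# from Reeves and Hrischuk). One function, "respond to nutrient scarcity,"
# is split into three small autonomous modules that cooperate.

from dataclasses import dataclass

@dataclass
class Sensor:
    """Autonomous module: reports whether a nutrient is scarce."""
    threshold: float

    def scarce(self, level: float) -> bool:
        return level < self.threshold

class Controller:
    """Autonomous module: turns the sensor's reading into a decision."""
    def decide(self, is_scarce: bool) -> str:
        return "induce" if is_scarce else "repress"

class Effector:
    """Autonomous module: carries out the decision on a target gene."""
    def act(self, command: str, gene: str) -> str:
        return f"{command} transcription of {gene}"

def run_pathway(nutrient_level: float) -> str:
    # Each module does one small job; their cooperation is the function.
    sensor = Sensor(threshold=0.5)
    controller = Controller()
    effector = Effector()
    return effector.act(controller.decide(sensor.scarce(nutrient_level)),
                        "biosynthesis operon")

print(run_pathway(0.2))  # induce transcription of biosynthesis operon
print(run_pathway(0.9))  # repress transcription of biosynthesis operon
```

The point of the partitioning is the one the authors draw: each piece can be understood, tested, and replaced on its own, while the overall function emerges from their cooperation.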

Reeves and Hrischuk then describe REPs and CPCEPs. While going over each of those is beyond the scope of this article, the takeaway is that SEPs provide logic for understanding biological systems. By familiarizing themselves with these principles, biologists can enhance their research methodologies and improve their ability to predict and validate their experiments.

The Engineering Principle Expectation

Reeves and Hrischuk say that any complex system must adhere to SEPs; if it does not, the outcome is catastrophic. Biological systems, which are more complex than any engineered system today, are no exception. When looking at a biological system, one should expect engineering characteristics. This is the Engineering Principle Expectation: a predictive model that can be applied to a biological system whose mechanistic details are not yet understood. Reeves and Hrischuk argue that it is crucial to apply engineering principles to understand and analyze biological systems. By doing so, researchers can gain insight into underlying mechanisms and predict the behavior of these systems. Considering engineering principles can also help in designing effective interventions or therapies for complex biological problems.
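As a rough illustration of how the EPE might guide a prediction, consider the Python sketch below. An engineer who sees a variable held at a set point expects negative feedback, so the EPE suggests hypothesizing a feedback loop and testing whether the system recovers from a perturbation. The model (negative autoregulation of a protein) and all parameter values are invented for illustration and are not drawn from the paper:

```python
# A minimal sketch of an EPE-style prediction (model and parameters are
# illustrative assumptions, not from Reeves and Hrischuk): if a protein
# level is feedback-regulated, it should return to its set point after
# a perturbation, just as an engineered control loop would.

def simulate(steps=2000, dt=0.01, k_max=10.0, K=1.0, gamma=1.0,
             perturb_at=1000):
    x = 0.0           # protein concentration (arbitrary units)
    trace = []
    for t in range(steps):
        if t == perturb_at:
            x *= 0.2  # perturbation: sudden 80% loss of the protein
        # Hill-type negative autoregulation: synthesis falls as x rises,
        # while degradation removes x at rate gamma.
        dx = k_max * K / (K + x) - gamma * x
        x += dx * dt
        trace.append(x)
    return trace

trace = simulate()
# The EPE-style expectation: a feedback-controlled variable recovers its
# pre-perturbation steady state. Compare the two values:
print(f"steady state before perturbation: {trace[999]:.3f}")
print(f"value after recovery:             {trace[-1]:.3f}")
```

In this toy run both printed values agree, which is the behavior the engineering expectation predicts; a failed prediction would instead point the researcher toward whatever control structure the hypothesis missed.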

Theistic Darwinism is not an oxymoron? II

 Could Finely Tuned Initial Conditions Create Biological Organisms?


Is the arrangement of mass energy at the beginning of all things sufficient to account for the origin of life, the diversification of life, our capacity for abstract thought, volition, spiritual communion, and more? At present, there seems to be very little reason to answer in the affirmative.

However, theologian Rope Kojonen, in an attempt to wed design and evolution, allows for this interpretation in his recent book, The Compatibility of Evolution and Design. My colleagues and I reviewed the book in the journal Religions and have been discussing it further in a series here. The laws and preconditions of nature are at the heart of Kojonen’s model. They are his proposed mechanisms of design, the linchpin of his project. Yesterday, we looked at the first of three interpretations of how Kojonen’s model would actually work. Today we will look at the second:

The laws of nature simply transmitted biologically relevant information sufficient to produce all biological complexity and diversity, including new proteins, protein machines, and the like. This biologically relevant information was “built in” to the mass-energy configuration at the Big Bang. The laws of nature did not create anything but rather were the media (or “carriers”) through which biologically relevant information was eventually expressed and instantiated in everything from proteins to bacterial flagella to human beings.

A Helpful Analogy 

Laws have the capability of transmitting information in some situations, but they lack the ability to generate biological information of the kind found in DNA and proteins, as we’ve already discussed. Philosopher of science Stephen Meyer develops this point with a helpful analogy in Return of the God Hypothesis:

[I]magine that a group of small radio-controlled helicopters hovers in tight formation over the Rose Bowl in Pasadena, California. From below, the helicopters appear to be spelling a message: “Go USC.” At halftime, with the field cleared, each helicopter releases either a maroon or gold paint ball, one of the two University of Southern California colors. Gravity takes over and the paint balls fall to the earth, splattering paint on the field after they hit the turf. Now on the field below, a somewhat messier but still legible message appears. It also spells “Go USC.”

Did the law of gravity, or the force described by the law, produce this information? Clearly, it did not. The information that appeared on the field already existed in the arrangement of the helicopters above the stadium in “the initial conditions.” Gravitational forces played no role in causing the information on the field to self-organize. Gravity merely transmitted preexisting information from the helicopter formation to the field below.

The information in the message was encoded in the original position of the helicopters. The laws of nature (and gravity in particular) were merely the “carrier” of this previously created information. The second interpretation of Kojonen’s view agrees with this perspective of the laws of nature but then supposes that all the information necessary for the origin of life, diversification of life, and accounting for human cognition was present in the initial conditions (positioning of matter and energy). This concept is comparable to playing pool, where a single strike of the cue ball can knock all the balls into their respective holes due to their positions on the table. This mechanism is highly implausible when applied to life because the initial conditions that would have had to be established to create such a system are extreme.
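The helicopter analogy is easy to mimic in code. In the Python toy below (an illustration of the argument, not anything from Meyer’s book), a message is written into initial positions, and a deterministic rule standing in for gravity maps every point straight down. The rule transmits the message but contributes nothing to its content:

```python
# A toy version of the helicopter analogy (an illustration of the argument
# above, not code from any source). The "law" is a deterministic map that
# drops each point straight down; the message exists only because it was
# written into the initial conditions.

helicopters = {(x, 0): ch for x, ch in enumerate("GO USC")}  # initial conditions

def drop(positions):
    # The "law of gravity": every point maps straight down to y = -10.
    return {(x, -10): ch for (x, _), ch in positions.items()}

field = drop(helicopters)
message_above = "".join(ch for _, ch in sorted(helicopters.items()))
message_below = "".join(ch for _, ch in sorted(field.items()))
print(message_above, "->", message_below)  # GO USC -> GO USC
# The map is invertible on these points: the information on the "field"
# is exactly the information that was in the initial arrangement.
```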

Six Objections from Meyer

Additional problems plague this “initial conditions” idea. In Return of the God Hypothesis, Meyer summarizes six objections.

[G]iven the facts of molecular biology, the axioms of information theory, the laws of thermodynamics, the high-energy state of the early universe, the reality of unpredictable quantum fluctuations, and what we know about the time that elapsed between the origin of the universe and the first life on earth, explanations of the origin of life that deny the need for new information after the beginning of the universe clearly lack scientific plausibility.

Let’s explore this a bit more. To understand the absurdity of proposing that initial conditions could, without additional intervention, account for the facts of molecular biology, consider again the pool analogy. Stacking thermodynamically unfavorable events into the initial conditions would be like supposing that after the cue ball strikes, three balls drop immediately, three more drop ten minutes later, and eventually every ball finds a pocket. This scenario is scientifically implausible because, in our experience, the laws of nature work consistently through time, and only agents that can work outside the system are able to cause new events to occur. Thus, once the initial causal event is spent, processes that must overcome thermodynamic barriers do not spontaneously occur.

An Unknown Force in History

While the laws of nature can transmit information, the way they transmit it is consistent and constant. If the initial conditions could do something thermodynamically unfavorable after time has elapsed from an initial agent’s action, this would certainly be different from what we observe today. The laws would require the ability to select specific outcomes — i.e., to assemble specific molecules into these outcomes at specific points in time. This would require a process model running in the background and invoking the right actions at the right times — an unknown force that only seems to work at specific times in history. 

Tomorrow, we will look at the final possible interpretation of Kojonen’s model for the laws of nature.