
Thursday, 23 July 2015

On higher education VIII

Why "Work Smart, Not Hard" is the Worst Advice in the World

Work smarter, not harder? Don't tell that to Dirty Jobs host Mike Rowe, who has met some of the hardest-working people in America. In fact, he argues, that mantra is the opposite of the attitude we need to beat this lousy economy.

Photo credit: Sam Jones/Trunk Archive
When I was 17 my high school guidance counselor tried to talk me into going on to earn a four-year degree. I had nothing against college, but the universities that Mr. Dunbar recommended were expensive, and I had no idea what I wanted to study. I thought a community college made more sense, but Mr. Dunbar said a two-year school was "beneath my potential." He pointed to a poster hanging behind his desk: On one side of the poster was a beaten-down, depressed-looking blue-collar worker; on the other side was an optimistic college graduate with his eyes on the horizon. Underneath, the text read: Work Smart NOT Hard.

"Mike, look at these two guys," Mr. Dunbar said. "Which one do you want to be?" I had to read the caption twice. Work Smart NOT Hard?

Back then universities were promoting themselves aggressively, and propaganda like this was all over the place. Did it work? Well, it worked for colleges, that's for sure. Enrollments soared. But at the same time, trade schools faltered. Vocational classes began to vanish from high schools. Apprenticeship programs and community colleges became examples of "alternative education," vocational consolation prizes for those who weren't "college material."

Today student loans eclipse $1 trillion. There's high unemployment among recent college graduates, and most graduates with jobs are not even working in their field of study. And we have a skills gap. At last count, 3 million jobs are currently available that either no one can do, or no one seems to want. How crazy is that?

I think often about the people I met on Dirty Jobs. Most of them were tradesmen. Many were entrepreneurs and innovators. Some were millionaires. People are always surprised to hear that, because we no longer equate dirt with success. But we should.

I remember Bob Combs, a modest pig farmer who fabricated from scratch a massive contraption in his backyard that changed the face of modern recycling in Las Vegas by using the casino food-waste stream to feed his animals. He was offered $75 million for his operation and turned it down. He's a tradesman.

Then there was Matt Freund, a dairy farmer in Connecticut who thought his cows' manure might be more valuable than their milk, and who built an ingenious machine that makes biodegradable flowerpots out of cow crap. He now sells millions of CowPots all over the world. He's a tradesman.

Mostly, I remember hundreds of men and women who loved their jobs and worked their butts off: welders, mechanics, electricians, plumbers. I've met them in every state, and seen firsthand a pride of workmanship that simply doesn't exist in most "cleaner" industries. And I've wondered, why aren't they on a poster? Why aren't we encouraging the benefits of working smart AND hard?

The skills gap is bad news for the economy, but it also presents an opportunity. Last month I ran into a woman named MaryKaye Cashman, who runs a Caterpillar dealership in Las Vegas, and she told me they had more than 20 openings for heavy-equipment technicians. That's kind of astonishing. A heavy-equipment technician with real-world experience can earn upward of six figures. And the training program is free! But still the positions go unfilled? In a state with 9.6 percent unemployment? What's going on?


Poster image courtesy of MRWH


Here's a theory: What if "Work Smart NOT Hard" is not just a platitude on a poster? What if it's something we actually believe? I know it's a cliché, but clichés are repeated every day by millions of people. Is it possible that a whole generation has taken the worst advice in the world?

Look again at the image on the poster above, which I reproduced just the way I remember it. Those stereotypes are still with us. We're still lending billions of dollars we don't have to kids who can't pay it back in order to educate them for jobs that no longer exist. We still have 3 million jobs we can't fill. Maybe it's the legacy of a society that would rather work smart than hard.

Last month I launched an online campaign called Lessons From the Dirt. It's a modest attempt to get people talking about the skilled trades in a more balanced way. If you're not opposed to a little tasteful vandalism, check out my updated version of Mr. Dunbar's poster on lessonsfromthedirt.com. The image might amuse you, but the caption is no joke—Work Smart AND Hard.

I don't know if changing one little word in one stupid slogan will reinvigorate the skilled trades. I just think it's time for a new cliché. My own trade—such as it is—started with an "alternative education," purchased for a reasonable price at a two-year school. I suspect a lot of others could benefit from a similar road. So get a poster and hang it high. And if you see Mr. Dunbar, tell him I turned out okay.

Mathematics vs. Darwinism

A Mathematician's View of Evolution



In 1996, Lehigh University biochemist Michael Behe published a book entitled Darwin's Black Box (Free Press), whose central theme is that every living cell is loaded with features and biochemical processes that are "irreducibly complex" -- that is, they require the existence of numerous complex components, each essential for function. These features and processes cannot be explained by gradual Darwinian improvements, because until all the components are in place, the assemblages are completely useless, and thus provide no selective advantage.
Behe spends over a hundred pages describing some of these irreducibly complex biochemical systems in detail, then summarizes the results of an exhaustive search of the biochemical literature for Darwinian explanations. He concludes that while biochemistry texts often pay lip-service to the idea that natural selection of random mutations can explain everything in the cell, such claims are pure "bluster," because "there is no publication in the scientific literature that describes how molecular evolution of any real, complex, biochemical system either did occur or even might have occurred."
When Dr. Behe was at the University of Texas at El Paso in May of 1997 to give an invited talk, I told him that I thought he would find more support for his ideas in mathematics, physics, and computer science departments than in his own field. I know a good many mathematicians, physicists, and computer scientists who, like me, are appalled that Darwin's explanation for the development of life is so widely accepted in the life sciences. Few of them ever speak out or write on this issue, though -- perhaps because they feel the question is simply out of their domain. However, I believe there are two central arguments against Darwinism, and both seem to be most readily appreciated by those in the more mathematical sciences.
Little by Little
First, the cornerstone of Darwinism is the idea that major (complex) improvements can be built up through many minor improvements; that the new organs and new systems of organs which gave rise to new orders, classes, and phyla developed gradually, through many very minor improvements.
We should note that the fossil record does not support this idea. For example, Harvard paleontologist George Gaylord Simpson ("The History of Life," in Volume I of Evolution after Darwin, University of Chicago, 1960) writes:
It is a feature of the known fossil record that most taxa appear abruptly. They are not, as a rule, led up to by a sequence of almost imperceptibly changing forerunners such as Darwin believed should be usual in evolution...This phenomenon becomes more universal and more intense as the hierarchy of categories is ascended. Gaps among known species are sporadic and often small. Gaps among known orders, classes, and phyla are systematic and almost always large. These peculiarities of the record pose one of the most important theoretical problems in the whole history of life: Is the sudden appearance of higher categories a phenomenon of evolution or of the record only, due to sampling bias and other inadequacies?
An April 1982 article in Life Magazine, excerpted from Francis Hitching's book, The Neck of the Giraffe: Where Darwin Went Wrong, contains the following report:
When you look for links between major groups of animals, they simply aren't there..."Instead of finding the gradual unfolding of life," writes David M. Raup, a curator of Chicago's Field Museum of Natural History, "what geologists of Darwin's time and geologists of the present day actually find is a highly uneven or jerky record; that is, species appear in the fossil sequence very suddenly, show little or no change during their existence, then abruptly disappear." These are not negligible gaps. They are periods, in all the major evolutionary transitions, when immense physiological changes had to take place.
Even among biologists, the idea that new organs, and thus higher categories, could develop gradually through tiny improvements has often been challenged.1 How could the "survival of the fittest" guide the development of new organs through their initial useless stages, during which they obviously present no selective advantage? (This is often referred to as the "problem of novelties".) Or guide the development of entire new systems, such as nervous, circulatory, digestive, respiratory, and reproductive systems, which would require the simultaneous development of several new interdependent organs, none of which is useful, or provides any selective advantage, by itself?
French biologist Jean Rostand, for example, wrote (A Biologist's View, Wm. Heinemann Ltd., 1956):
It does not seem strictly impossible that mutations should have introduced into the animal kingdom the differences which exist between one species and the next...hence it is very tempting to lay also at their door the differences between classes, families and orders, and, in short, the whole of evolution. But it is obvious that such an extrapolation involves the gratuitous attribution to the mutations of the past of a magnitude and power of innovation much greater than is shown by those of today.
Behe's book is primarily a challenge to this cornerstone of Darwinism at the microscopic level. Although we may not be familiar with the complex biochemical systems discussed in this book, I believe mathematicians are well qualified to appreciate the general ideas involved. And although an analogy is only an analogy, perhaps the best way to understand Behe's argument is by comparing the development of the genetic code of life with the development of a computer program.
Suppose an engineer attempts to design a structural analysis computer program, writing it in a machine language that is totally unknown to him. He simply types out random characters at his keyboard, and periodically runs tests on the program to recognize and select out chance improvements when they occur. The improvements are permanently incorporated into the program while the other changes are discarded. If our engineer continues this process of random changes and testing for a long enough time, could he eventually develop a sophisticated structural analysis program? (Of course, when intelligent humans decide what constitutes an "improvement," this is really artificial selection, so the analogy is far too generous.)
If a billion engineers were to type at the rate of one random character per second, there is virtually no chance that any one of them would, given the 4.5 billion year age of the Earth to work on it, accidentally duplicate a given 20-character improvement. Thus our engineer cannot count on making any major improvements through chance alone. But could he not perhaps make progress through the accumulation of very small improvements?
The Darwinist would presumably say yes, but to anyone who has had minimal programming experience this idea is equally implausible. Major improvements to a computer program often require the addition or modification of hundreds of interdependent lines, no one of which makes any sense, or results in any improvement, when added by itself. Even the smallest improvements usually require adding several new lines. It is conceivable that a programmer unable to look ahead more than five or six characters at a time might be able to make some very slight improvements to a computer program, but it is inconceivable that he could design anything sophisticated without the ability to plan far ahead and to guide his changes toward that plan.
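The "virtually no chance" claim above is easy to check with back-of-the-envelope arithmetic. The sketch below is mine, not the author's exact calculation: it assumes a keyboard alphabet of 100 distinct characters and, generously, treats every single keystroke as starting a fresh 20-character attempt at the target string.

```python
# Back-of-the-envelope check of the "billion engineers" estimate.
# Assumed figures (not from the text): 100 typeable characters,
# ~3.15e7 seconds per year.

ALPHABET = 100            # distinct characters an engineer might type
TARGET_LEN = 20           # length of the specific improvement sought
ENGINEERS = 1_000_000_000
YEARS = 4.5e9             # age of the Earth
SECONDS_PER_YEAR = 3.15e7

# Every keystroke opens a new 20-character "window" that could match.
windows = ENGINEERS * YEARS * SECONDS_PER_YEAR   # ~1.4e26 attempts
p_per_window = ALPHABET ** -TARGET_LEN           # 100^-20 = 1e-40

expected_matches = windows * p_per_window
print(f"{expected_matches:.2e}")  # on the order of 1e-14: effectively zero
```

Even with these generous assumptions, the expected number of successes over the entire history of the Earth is about fourteen orders of magnitude below one, which is the point of the paragraph above.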
If archeologists of some future society were to unearth the many versions of my PDE solver, PDE2D, which I have produced over the last 20 years, they would certainly note a steady increase in complexity over time, and they would see many obvious similarities between each new version and the previous one. In the beginning it was only able to solve a single linear, steady-state, 2D equation in a polygonal region. Since then, PDE2D has developed many new abilities: it now solves nonlinear, time-dependent, and eigenvalue problems, as well as systems of simultaneous equations, and it now handles general curved 2D regions. Over the years, many new types of graphical output capabilities have evolved, and in 1991 it developed an interactive preprocessor, and more recently PDE2D has adapted to 3D and 1D problems.
An archeologist attempting to explain the evolution of this computer program in terms of many tiny improvements might be puzzled to find that each of these major advances (new classes or phyla?) appeared suddenly in new versions; for example, the ability to solve 3D problems first appeared in version 4.0. Less major improvements (new families or orders?) appeared suddenly in new subversions, for example, the ability to solve 3D problems with periodic boundary conditions first appeared in version 5.6. In fact, the record of PDE2D's development would be similar to the fossil record, with large gaps where major new features appeared, and smaller gaps where minor ones appeared.2
That is because the multitude of intermediate programs between versions or subversions which the archeologist might expect to find never existed, because -- for example -- none of the changes I made for edition 4.0 made any sense, or provided PDE2D any advantage whatever in solving 3D problems (or anything else) until hundreds of lines had been added.
Whether at the microscopic or macroscopic level, major, complex, evolutionary advances, involving new features (as opposed to minor, quantitative changes such as an increase in the length of the giraffe's neck, or the darkening of the wings of a moth, which clearly could occur gradually) also involve the addition of many interrelated and interdependent pieces. These complex advances, like those made to computer programs, are not always "irreducibly complex" -- sometimes there are intermediate useful stages. But just as major improvements to a computer program cannot be made five or six characters at a time, certainly no major evolutionary advance is reducible to a chain of tiny improvements, each small enough to be bridged by a single random mutation.
Just Add Sunshine?
The second point is very simple, but also seems to be appreciated only by more mathematically oriented people. It is that to attribute the development of life on Earth to natural selection is to assign to it -- and to it alone, of all known natural "forces" -- the ability to violate the second law of thermodynamics and to cause order to arise from disorder. It is often argued that since the Earth is not a closed system -- it receives energy from the Sun, for example -- the second law is not applicable in this case. It is true that order can increase locally, if the local increase is compensated by a decrease elsewhere, i.e., an open system can be taken to a less probable state by importing order from outside.
For example, we could transport a truckload of encyclopedias and computers to the moon, thereby increasing the order on the moon, without violating the second law. But the second law of thermodynamics -- at least the underlying principle behind this law -- simply says that natural forces do not cause extremely improbable things to happen,3 and it is absurd to argue that because the Earth receives energy from the Sun, this principle was not violated here when the original rearrangement of atoms into encyclopedias and computers occurred.
The biologist studies the details of natural history, and when he looks at the similarities between two species of butterflies, he is understandably reluctant to attribute the small differences to the supernatural. But the mathematician or physicist is likely to take the broader view. I imagine visiting the Earth when it was young and returning now to find highways with automobiles on them, airports with jet airplanes, and tall buildings full of complicated equipment, such as televisions, telephones, and computers. Then I imagine the construction of a gigantic computer model which starts with the initial conditions on Earth 4 billion years ago and tries to simulate the effects that the four known forces of physics (the gravitational, electromagnetic and strong and weak nuclear forces) would have on every atom and every subatomic particle on our planet (perhaps using random number generators to model quantum uncertainties!).
If we ran such a simulation out to the present day, would it predict that the basic forces of Nature would reorganize the basic particles of Nature into libraries full of encyclopedias, science texts, and novels, nuclear power plants, aircraft carriers with supersonic jets parked on deck, and computers connected to laser printers, CRTs, and keyboards? If we graphically displayed the positions of the atoms at the end of the simulation, would we find that cars and trucks had formed, or that supercomputers had arisen? Certainly we would not, and I do not believe that adding sunlight to the model would help much. Clearly something extremely improbable has happened here on our planet, with the origin and development of life, and especially with the development of human consciousness and creativity.
References:
(1) See this New York Times article, for example.
(2) See this ENV article for more on the similarities between the evolution of life and the evolution of human technology.
(3) An unfortunate choice of words. I should have said, the underlying principle behind the second law is that natural forces do not do macroscopically describable things that are extremely improbable from the microscopic point of view. See "Entropy and Evolution," Granville Sewell, Bio-Complexity, 2013, for a more complete treatment of this point.