
The Scientistic Delusion


By Kevin D. Williamson
Thursday, May 3, 2018

Sigmund Freud’s reputation has never been lower. The scholar Frederick Crews and the rest of the so-called Freud Bashers have reduced his intellectual position to almost nothing among those who bother about such things. His theories were unscientific, his methods unsound, his evidence at least partially falsified, his ethics monstrous. He mutilated female patients by ordering dangerous and unnecessary surgeries based on pure quackery, e.g., removing part of a woman’s nose in order to treat pain from what was almost certainly an ovarian cyst. Freud thought that the patient bled as a result of sexual frustration. The more obvious explanation is that he was a butcher, and she was (as the case evidence suggests) a hemophiliac. Crews, who set out his findings in a 2017 book, Freud: The Making of an Illusion, speaks for many current Freud scholars in his conclusion that there “is literally nothing to be said, scientifically or therapeutically, to the advantage of the entire Freudian system or any of its component dogmas.”

But dogmas die hard. Crews himself is a former Freudian, and Freud’s few remaining defenders have suggested that his campaign against the Viennese showman represents—inevitably—a kind of academic Oedipus complex, the desire of the student-son to supplant his teacher-father. Crews, who is a pleasingly curmudgeonly writer, answered some of these claims in the Times Higher Education Supplement in 1995, writing:

In rendering their diagnoses-at-a-distance, my critics appear to have been guided by a principle that struck them as too obviously warranted to bear articulating—namely, that “Freud bashing” is itself a sign of mental dysfunction. They simply knew, after all, that Freud, despite some occasional missteps and out-of-date assumptions, had made fundamental discoveries and permanently revolutionized our conception of the mind. . . . Freud proved once and for all that unconscious beliefs and emotions play a large role in our behavior; that the human mind is at once capable of the clearest distinctions and the most devious twists; and that mental illness stems in large part from an imbalance within the human being between real and ideal, between our rational and irrational selves, and between what we want to do and what we have to do.

These and similar formulations were noteworthy for their high quotient of generality and vagueness, approaching, in freedom from determinate content, the perfect vacuum achieved by the historian and Freud apologist Peter Gay, who has characterized Freud’s “central idea” as the proposition that “every human is continuously, inextricably, involved with others . . .” It is hard to dispute any of these statements about “humans,” but it is also hard to see why they couldn’t be credited as easily to Shakespeare, Dostoevsky, or Nietzsche—if not indeed to Jesus or St. Paul—as to Freud. Was it really Freud who first disclosed such commonplaces? Or, rather, has the vast cultural sway of Freud’s system caused us to lose focus on his more specific, highly idiosyncratic, assertions, to presume that a number of them must have been scientifically established by now, and to transform him retrospectively into the very personification of “human” complexity and depth?

Of course it is not the case that Freud “proved” that our waking lives are dominated by unconscious beliefs. It has not been shown to any scientific standard that the unconscious mind of Freudian thought even exists. His model of the mind—id, ego, superego—was metaphorical, a literary device rather than a meaningful explanation of actual mental mechanics. As many critics have pointed out, the main attraction of Freudian theories of the unconscious mind for quacks is that they cannot be scientifically tested. Writing in Psychology Today last July, David B. Feldman relayed an illustrative anecdote:

I once observed a lecture by a psychoanalyst who endorsed this classical view of the unconscious mind. Over the course of an hour, he explained that almost everyone harbors unconscious resentment toward their parents. When one of the students asserted that he personally didn’t harbor any such unconscious negative feeling toward his parents, the psychoanalyst replied, “See, that proves it’s unconscious!”

Those with a friendlier view of Freud argue that he should be evaluated not as a scientist but as a kind of cultural critic, a wide-roving mind that trafficked in the aesthetic and the mythological. Freud, as many critics have noted, was a fan of Sherlock Holmes and other literary detectives (he was his own Watson), and his case studies depict him as a Sherlockian figure, carefully uncovering the clues that are invisible to less-focused minds, those who see but do not observe. In places, Freud’s case studies read like short stories. And he himself at times hinted that what he was engaged in was more mythology than medicine: “One day I discovered to my amazement that the popular view grounded in superstition, and not the medical one, comes nearer to the truth about dreams,” he wrote in Dream Psychology: Psychoanalysis for Beginners (1920). And that, really, is what remains of the Freudian edifice: superstition, a kind of secular astrology for people too enlightened to be taken in by that sort of thing.

But Freud’s reputation was not staked on any claim to his being an aesthete or an interpreter of literature: He saw himself as a man of science, as heir to Copernicus and Charles Darwin. In Emil du Bois-Reymond’s famous formulation (often attributed, wrongly, to Freud), Copernicus took mankind down a notch by showing that the Earth was not the center of the universe, and Darwin took man down another notch by establishing that he is just another great ape with no special claim to having been specially created in the image of the Divine. Freud’s admirers taught for many years that the master had continued their work, taking man down a peg within the confines of his own mind, exposing him as a grubby, sex-addled bag of appetites hardly in control of his own life.

The sneering, vehement certitude with which Freudianism was preached until the day before yesterday is startling to revisit—and yet entirely familiar in tone to any reader of, say, Paul Krugman or Ezra Klein. Consider the introduction to the 1920 American edition of Dream Psychology by the long-forgotten Freudian André Tridon, who published well-regarded books about “gland personalities,” linking certain personality types and personality disorders to variations in the endocrine system in a model quite similar to the medieval account of “humors” and their effect on mood and health. Dream analysis, as Tridon puts it, was “the key to Freud’s works and to all modern psychology.” But it never quite caught on, and Tridon chalked up the resistance to the familiar factors: ignorance, suspicion of science, intellectual jealousy, an unwillingness to “sift data,” “laziness and indifference” to scientific achievement, an aversion to sexuality, “which puritanical hypocrisy has always tried to minimize, if not to ignore entirely.” Tridon was a true believer. “Besides those who sneer at dream study because they have never looked into the subject,” he wrote:

there are those who do not dare to face the facts revealed by dream study. Dreams tell us many an unpleasant biological truth about ourselves and only very free minds can thrive on such a diet. Self-deception is a plant which withers fast in the pellucid atmosphere of a dream investigation. The weakling and the neurotic attached to his neurosis are not anxious to turn such a powerful searchlight upon the dark corners of their psychology.

Tridon hit all the familiar notes: Freud did not argue that dreams were the key to understanding the human mind; he proved it. He insisted that what Freud was engaged in was science, full stop. “He did not start out with a preconceived bias, hoping to find evidence which might support his views,” Tridon wrote. “He looked at facts a thousand times until they began to tell him something.”

Freudian thought has gone from “established science” to obvious poppycock in a remarkably short period of time. There ought to be a lesson in that for the American news media. But the American news media are remarkably resistant to learning.

If there is a two-word phrase that should be excised from American journalism, it is “study proves.” A selection from Vox, probably America’s leading practitioner of the “study proves!” mode of rhetoric:

“We’ve been totally wrong about Hillary Clinton’s young voter problem, and a big new study proves it.”

“Harvard Business School study proves the humblebrag is a useless waste of time.”

“Yes, there is an echo chamber on Facebook, and this study proves it.”

Sometimes, Vox’s editors like a variation: “Study: Trump fans are much angrier about housing assistance when they see an image of a black man.” Over at the Afro, this claim became: “New Study Proves That Trump Supporters Are Racist, Infuriated by Black People. . . . In a study reported by Vox, Trump supporters can be set off just by looking at a picture of a black person, and just seeing one can significantly change how they feel about a policy. This was discovered thanks to a new study by political scientists at Colgate and the University of Minnesota, which shed light on some of Trump supporters’ most racist behaviors.”

If you are paying attention, you can observe the evolution of a half-truth in the wild.

But the penchant is hardly unique to Vox.

Audubon: “Study Proves Outdoor Science Education Improves Test Scores.” Study proves the thing we care about is valuable.

Quartz: “A huge new Stanford and Harvard study proves that U.S. inequality isn’t just about class.” Study disproves something nobody really thinks.

Huffington Post: “Study Proves This Is How Women Prefer To Orgasm.” Studies show that people click on these types of stories.

A study making the journalistic rounds for the past couple of years “proves” that the climate-change policies favored by progressives would pay for themselves (oh, how we love it when the things we want pay for themselves!) because the economic losses they would impose would be offset by savings in health care as cleaner sources of energy and a decline in industrial activity relieve millions of people around the world of respiratory stresses. The study—or, more precisely, the journalistic presentation of the study—assumes things that are literally unknowable.

The “study proves” model of argument is popular on both ends of the political spectrum, though it is leaned on more heavily by progressives, many of whom take seriously the delusional notion that they are beyond ideology, that they are simply relying on disinterested experts to guide them in the pursuit of “what works.”

Barack Obama says he told Raúl Castro in Cuba not to worry too much about choosing between socialism and capitalism, but to approach both like a buffet and “just choose from what works.” And, as it turns out—surprise!—everywhere progressives look a “study proves” that they should be doing whatever it is they already wanted to do.

Studies—and those holy facts and fact-checkers we’re always hearing about—are reliably subordinated to the social and political ethic of the party citing them. Take Vox’s cofounder, Ezra Klein, who writes with precisely the same faintly ridiculous certitude with which André Tridon presented the scientific facts of Freudian psychology. Klein’s hectoring, sneering, “just-the-facts” school of rhetoric is best exemplified by his indefensible claim during the 2009 debate over the grievously misnamed Affordable Care Act that Connecticut senator Joseph Lieberman was “willing to cause the deaths of hundreds of thousands of people in order to settle an old electoral score.” Klein had a study to back him—something from the Urban Institute. It didn’t exactly say what he was saying it said, and it certainly did not say that, if Congress failed to pass a specific piece of health-insurance legislation, tens of thousands of people would die. Nonetheless: Study proves you have to support my policy preferences or you’re a mass murderer!

Studies have a way of ceasing to be studies once they are taken up by politicians-in-print like Ezra Klein. They become dueling implements. Mary Branham of the Council of State Governments: “Evidence Shows Raising Minimum Wage Hasn’t Cost Jobs” vs. Max Ehrenfreund of the Washington Post: “ ‘Very Credible’ New Study on Seattle’s $15 Minimum Wage Has Bad News for Liberals” vs. Arindrajit Dube of the New York Times: “Minimum Wage and Job Loss: One Alarming Seattle Study Is Not the Last Word.” Much of this is predictable partisan pabulum. The study that confirms my priors is science. The study that challenges my preferences is . . . just one study. Our friends among the global-warming alarmists, embarrassed by the fact that every time Al Gore shows up to give a speech it turns out to be the coldest March day in 30 years, are forever lecturing us that weather doesn’t tell us anything useful about climate—except when it’s hot in the summer, or there’s a drought in California, or there’s a hurricane in Florida.

***

“Science Says You Have To Support Our Politics or Thousands/Millions/Billions of People Will Die!” is a well-established genre of American political discourse, one practiced most famously by Paul Ehrlich. Fifty years ago this month, he published, together with his wife, Anne, The Population Bomb, which predicted that “hundreds of millions of people will starve to death” in the 1970s. The decade was full of disasters—Nixon, disco, leisure suits—but none of the disasters was one that the Ehrlichs had predicted.

But the end is always near. Science says so, at least if you ask a scientist like Paul Ehrlich, who trained as a biologist but whose actual profession was that of rhetorically incontinent anti-capitalist. In 1980, Ehrlich made a bet with the economist Julian Simon, who had predicted that “the cost of non-government-controlled raw materials (including grain and oil) will not rise in the long run.” The bet was over whether the inflation-adjusted prices of copper, chromium, nickel, tin, and tungsten (Ehrlich’s choices) would rise over the following ten years. Ehrlich was wrong on every count.

Commodity prices fluctuate, of course, and there were 10-year periods in the postwar era in which Ehrlich would have won the bet. But Simon was on more solid footing: Real commodity prices have been on a downward trend since the 1930s, and for solid economic reasons. When the price of a commodity rises, people invest more in producing it or in seeking substitutes. In 1876, for example, the average yield of an American wheat field was 10.9 bushels per acre; in 2016, it was 52.7 bushels per acre. Durum wheat yields went from 3.8 bushels per acre in 1954 to 44 bushels per acre in 2009. Rising oil and gas prices made it economical to combine hydraulic fracturing with horizontal drilling to extract more petroleum from deposits once considered depleted.
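Those yield figures are not a single leap but steady compounding, which is the quiet mechanism Simon was betting on. A back-of-the-envelope check, sketched here in Python using only the numbers quoted above:

```python
# A rough check on the yield figures cited above. The formula is the
# standard compound annual growth rate (CAGR); the start and end values
# are the ones quoted in the text.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

# Average U.S. wheat yield, bushels per acre.
print(f"All wheat, 1876-2016: {cagr(10.9, 52.7, 140):.2%} per year")  # ~1.13%

# Durum wheat yield, bushels per acre.
print(f"Durum wheat, 1954-2009: {cagr(3.8, 44.0, 55):.2%} per year")  # ~4.55%
```

A shade over 1 percent a year sounds like nothing, but compounded for 140 years it nearly quintuples a field’s output.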

Ehrlich and his apologists insist that The Population Bomb was fundamentally correct—and fundamentally scientific—and that the dire predictions have only been delayed by unforeseeable developments such as the work of Norman Borlaug, whose “Green Revolution” brought high-yield varieties of grain crops to Mexico and then, notably, to India, where wheat production increased by 45 percent in a single year. Nobody could have predicted that, they say.

And that is the point.

The scientific project that goes under the broad heading of “complexity” considers the behavior of certain natural and social systems, which are not—this is key—subject to forecast. Some systems are chaotic and hence their behavior is “impossible, even in principle, to predict in the long term,” as computer scientist Melanie Mitchell puts it. The textbook examples of chaotic systems are markets and weather patterns. Complexity is an interesting subject, even when simplified for a mass audience. Natalie Wolchover’s recent Wired report on how scientists are using machine learning to predict the behavior of the Kuramoto-Sivashinsky equation, an “archetypal chaotic system,” is fascinating. Complexity theory connects modern science with classical-liberal political economy, particularly with the information problem made famous by F. A. Hayek and Ludwig von Mises, who argued that the nature of information in a complex society makes rational economic calculation under central planning an epistemic impossibility. In their analysis, socialism isn’t a bad idea but an impossibility.
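What “impossible, even in principle, to predict in the long term” means is easy to demonstrate in a toy system. The sketch below is not the Kuramoto-Sivashinsky equation of Wolchover’s report but the textbook logistic map, a standard classroom example of deterministic chaos; it iterates two starting points that differ by one part in ten billion and watches them part company:

```python
# A minimal illustration of deterministic chaos: the logistic map
# x -> r * x * (1 - x) at r = 4, a standard textbook chaotic system.
# Two trajectories start one part in 10^10 apart and diverge anyway.

r = 4.0
x_a = 0.2
x_b = 0.2 + 1e-10

for step in range(1, 51):
    x_a = r * x_a * (1 - x_a)
    x_b = r * x_b * (1 - x_b)
    if step % 10 == 0:
        print(f"step {step:2d}: x_a = {x_a:.6f}  x_b = {x_b:.6f}  "
              f"gap = {abs(x_a - x_b):.1e}")

# The gap roughly doubles each step (a positive Lyapunov exponent);
# after about 40 iterations the two runs are effectively unrelated,
# though the rule is simple, deterministic, and fully known.
```

No measurement of the starting point is ever fine enough: whatever precision you buy, the divergence eats through it on a fixed schedule, which is why long-range forecasts of such systems fail in principle and not merely in practice.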

Socialism, like Freudianism, came disguised as science and borrowed the prestige of science. The philosopher Pierre-Joseph Proudhon insisted that socialism was nothing more or less than “scientific government.” Neither socialism nor Freudianism has survived contact with reality quite intact, but both endure because they offer an identical promise: predictable progress. Freud believed that we could simply apply the scientific method (which he conflated with the method of Sherlock Holmes) to the cases of unhappy people with disordered lives and disturbed minds, illuminating the dreadful chthonic passions that were running and ruining their lives. The socialists believed that we could apply the scientific method to the entire productive activity of society, eliminating waste and “destructive competition,” that great bane of the late 19th century.

The scientistic delusion—the pretense of knowledge, Hayek called it—promises us that there is a way forward, that it is discoverable, and that it may be revealed to us by applying familiar, widely understood principles. The alternative—that minds and markets are beyond management—is for many too terrible to contemplate. The world beyond science is not only religion but also art and literature, which have been in notable if predictable decline as our increasingly timid culture defers ever more desperately to white-coated figures of authority, demanding that they provide lab-tested, peer-reviewed, eternal answers to life’s every question.

Science, broadly defined, may inform our politics. It will not liberate us from politics. Nor will it liberate us from making difficult choices. And while the physical sciences have earned their prestige, the scientific consensus of any given moment may prove unreliable. Sometimes, what all the best people know to be true turns out to be a bizarre and embarrassing fantasy cooked up by an Austrian strange-o with a gift for self-promotion.

It pays to be cautious. You know it in your id.
