By Kevin D. Williamson
Monday, September 30, 2024
There’s
almost a surprising bit of good news in there if you squint: As hard as it is
to believe, Eric Adams is the first sitting mayor of New York City to be indicted
on federal charges. I’d have thought he’d be the fifth or the sixth or
something.
New
York, our greatest city, is also our exemplary city, both at its best and at
its worst. For 20 glorious years, New York showed the lesser American cities
what a real urban renaissance looked like, with an economic boom and a
complementary cultural revival. A generation of young people whose parents had
prioritized getting the hell out of Brooklyn or Upper Manhattan moved into the
city and turned around neighborhoods that once had been bywords for urban
dysfunction. And then, for most of the past decade or so, New York showed the
rest of the country how easy it is to piss away all those hard-won gains. Say
what you will about the pathetic figure Rudy Giuliani has become or the
weird-rich-dude obsessions of Mike Bloomberg, for a couple of decades there you
could ride the subway after midnight without feeling the need for Kevlar or
some kind of psychiatric SWAT team.
Mayor
Adams is accused of (blah blah blah,
innocent until proven guilty, yadda yadda yadda) accepting favors
from the Turkish government, gifts and benefits amounting to bribes for
favorable treatment of businesses connected to the regime of Turkish strongman
Recep Tayyip Erdoğan and the Turkish government itself. The allegations include
a straw-donor scheme for laundering foreign money into Adams’ mayoral campaign.
If the indictment is accurate, the whole thing was conducted with hilarious
incompetence and pettiness.
Longtime
observers of American municipal politics will not be entirely shocked by the
suggestion that a mayor applied some political pressure on behalf of a campaign
donor. That happens all the time, and it is rarely prosecuted as a crime. A
bribery charge requires a quid pro quo, and it is the pro that is the
problem even when the quid and the quo are caught on tape or
matters of public record: You can easily prove that the donation was made and
that the favor was done, but proving that the favor was done in exchange for
the donation is difficult unless somebody was dumb enough to put it in writing
or to allow himself to be recorded describing the bribe. The case against Adams
is a little easier to make, because those foreign donations would be illegal in
the first place and would require an illegal coverup.
Another
way of saying that: What Adams is accused of is, in no small part, merely a
more extreme version of business as usual in American municipal government.
There are things that can be achieved as readily through noncriminal means as
through criminal ones, and whether a given municipal official chooses criminal
or noncriminal means often is simply a pragmatic calculation. If you are a
clique of corrupt education bureaucrats who want to keep teachers from getting
docked for poor performance, you can either cheat on the standardized tests like they did in
Atlanta
and elsewhere (which is fraud) or you can just gut performance-based
compensation programs like they did in New
York
(which is politics, not crime). The mutually enriching revolving-door
relationship between elected office, public-sector unions, and politically
connected businesses in states such as California is gross and immoral, but
most of it isn’t anything you could charge as a crime.
There’s
a lot of obvious corruption in politics at all levels, but not all of it is
criminal: You can take petty bribes like Bob Menendez did, or you can set up a
foundation like the Clintons did and use it to pay your private-jet
bills,
an especially enticing proposition if you can count on a political ally such as
Barack Obama to quash any
investigation into possible financial wrongdoing. If Adams is in
trouble for doing illegally that which he might have done legally, his real
crime isn’t corruption—it is stupidity.
But
there’s a lot of that stupidity to go around: In municipal governance writ
large (local government per se, police departments, public schools, and
quasi-public agencies) there have been dozens and dozens of corruption charges
and convictions over the past several decades. Adams may be the first sitting New York
mayor indicted on federal charges, but over the years the feds have convicted mayors from
Newark (more than one of them), Hoboken, Secaucus, Jersey City, Camden,
Atlantic City, Gloucester, Passaic—and that’s just New Jersey!—to say nothing
of Miami Beach, Compton, Providence, Bridgeport, all the way down to White
Castle, Louisiana. You could fill a cell block with the Chicago aldermen who
have gone down on federal charges (at least 26 of them), and set up another one
for the Philadelphia judges and city councilmen, the Clark County (meaning Las
Vegas) county commissioners, the Boston school-committee members, etc. If past is prologue, we can
expect four out of 10 Illinois governors to go to the pokey.
I
prefer a policy of subsidiarity—of distributing both the responsibility for
mitigating social problems and the resources to work on those problems
to the most local and intimately connected institutions: We begin with family,
friends, and civil society, and then move on to government, beginning with
government at the local level. Subsidiarity is the right policy most of the
time (but not usually for national defense or international relations, which is
why our constitutional architecture reserves these to the national government),
but the fact that local government is close enough to keep an eye on doesn't
keep it clean or smart. There is plenty of incompetence at the local level—tyranny, too.
There
are real American cities—lookin’ at you, San Bernardino—that have long been
governed by the most useless, weedy
thickets of human vegetation you could imagine. New York doesn’t
have to be one—it chooses to be.
Conflating
the Sacramental and the Sentimental
I
spent a fair bit of a recent episode of The Remnant
shouting at Jonah Goldberg, meaning shouting at the dashboard of my truck, and
wanting very much to throttle Sam Harris.
Harris
seems like a nice enough guy. He once sent me a gift basket out of the blue; I
was never quite sure what for. I don’t want to drop bombs on him … but.
I
don’t care that Harris is an atheist. Some of my best friends are atheists
(Hey, Charlie!) and many are somewhere on the atheism spectrum: atheist,
agnostic, Episcopalian. I do care that Harris sometimes talks as though his head
were entirely full of mush and insists on speaking as though he were
ignorant—though he isn't—of so many of the basic issues of religion, a subject
about which he speaks and writes a great deal.
I’ll
limit myself to one example here. Harris argues that science offers a better
way of doing things than religion does because science can self-correct,
whereas religion—in this case, he’s talking about Christianity—cannot do the
same, because its practitioners regard its founding documents as inerrant by
definition, so these cannot be changed in light of new facts. He offers as an
example the trials and executions of supposed witches, which is, in fact, a
terrific example in that it illustrates precisely the opposite of the point
that Harris is trying to make. Like most atheists in the Western world, Harris
has an essentially Anglo-Protestant sensibility (Christopher Hitchens could
have been a very happy Anglican) and, as such, he connects witch trials such as
the ones we had in New England … some centuries ago … to the Bible. But, as
anybody familiar with the history of magic in the Anglo-Saxon world would be
happy to tell you—and as Harris seems to know without understanding the
significance of the fact—the folk characters we call “witches” have almost
nothing to do with the “sorcerers” and “diviners” and such of the Bible, though
of course biblical injunctions against these were cited in witchcraft statutes,
notably in Alfred the Great’s lawbook. We don’t have very much in the way of
relevant records of the pre-Christian era in the British Isles (and not
especially good records of Britain pre-Norman Conquest, for that matter), but it
is very likely that our Puritans’ Anglo-Saxon forebears had witches on the
brain a thousand years or more before the birth of Christ. Belief in witches
and witchy characters is widespread across many cultures, as Harris himself
notes.
The
focus on Scripture is an admirable part of many Protestant traditions, and one
that my fellow Catholics would do well to learn from, but the Bible did not
create Christianity—it was quite the other way around, and whole generations of
Christians came and went before there was any such thing as the Bible as we
know it today. As Christianity spread, it absorbed certain flavors from the
terroir in which it was planted. British culture did not get its belief in
witches and witchcraft from British Christianity—British Christianity got its
beliefs about witchcraft from British culture, from pre-Christian, pagan
sources from which Christian culture has drawn so much (like Christmas trees
and Easter eggs). The oldest surviving charms from the British Isles aren’t
directed at seeking the aid of Satan—they petition Woden, because they were
invented by people who had never heard of Christianity, and their original
authors (I mean the authors of the underlying source material) probably lived
long before Jesus did. Harris concedes that belief in magic and witchcraft is
something close to a cultural universal but does not consider how this fact
makes an irrelevancy of his claim—an overblown claim—about the role of
foundational documents, and attitudes toward those documents, in the public
lives of religiously informed cultures.
(“Religiously
informed culture” is another way of writing “culture.”)
I’ve
had this experience before. I like Bill Maher, who is in real life a much more
gracious man than the character he plays on television, but I also think that a
guy who made a whole film about the implausibility of religious claims ought to
have taken the time to learn something about them. I remember a conversation I
had with him in which he was surprised to learn—and was very skeptical of the
claim—that the kind of biblical literalism associated with American Evangelical
Protestantism is a relatively new phenomenon, and that the Jewish sages of old
and Christian scholars of the Middle Ages took a distinctly different approach
to the interpretation of Scripture, as indeed do contemporary thinkers in many
different branches of Christian thinking. The notion that all of humanity is
descended from two original parents, for example, is far from universally held
and probably is a minority opinion among Christian intellectuals. That doesn’t
mean that these thinkers demote the Bible to the position of folklore, only
that they do not treat Scripture as a kind of magic in literary form. Christian
scholars even have a word for the quaint superstitious belief prevalent among
certain American Christians that one can simply crack open a Bible and expect
to be directed to a solution for any of the modern world’s particular problems
and dilemmas: “bibliomancy,” which, as the word suggests, is a form of magical
thinking.
Science
really is more useful than religion in the sense that a pipe wrench is more
useful than a bouquet of roses if you are trying to fix a pipe.
“Non-overlapping magisteria” is how Stephen Jay Gould described the
situation: different instruments for different things. The question of fact is
distinct from the question of what to do about the fact.
Which
brings us right back to the issue of witches and witchcraft: As C.S. Lewis very
amusingly argued, the error at work in those long-ago witch trials wasn’t a
moral error at all, but an error of fact. Lewis argues that if there
really were people doing what witches were accused of doing—using diabolical
powers to kill their neighbors or to make them ill, to cause miscarriages, to
cause crop failures, etc.—then they would be more deserving of severe punishment
than practically any other class of criminals: “quislings,” he called them, a
serious charge against the backdrop of World War II. The problem wasn’t the
moral judgment but the misjudgment of fact. The witches of the Anglo-Saxon
tradition are not real but are characters from the popular imagination.
And,
contrary to Harris’ claims and expectations, American Christianity did correct
that error. The last execution of a supposed witch in the United States
happened before there was a United States, way back in 1692. Lesson learnt—and
not because somebody published an important paper in Nature in 1693.
Europe learned, too, if a bit more slowly: The Swiss executed a supposed witch
in 1782 and the Prussians in 1811, though it is important to note that in
neither case was the condemned executed for—or even charged with—witchcraft,
that having ceased to exist as a criminal offense long before those cases. Of
course, in another sense the witch trials never really ended; we just stopped
calling them witch trials when they mutated into the Satanic-daycare cases of
the 1980s and 1990s—and it bears noting that the ritual-abuse panic was brought
to us with a big assist from Harris’ idol of choice, the scientific consensus,
kicked off by a graduate of the
McGill University medical school and carried forward by psychiatrists working
from approximately squat in the way of real evidence. Medical science, one of
the most relevant branches of science for many people, remains absolutely full
of quackery in our time, and, as in the case of religion, the quackery gets
quackier as it approaches political power, which is why we see things like the
Affordable Care Act entrenching the status of pseudoscientific “medicine” like
chiropractic and acupuncture.
The
scientific consensus in favor of eugenic homicide and sterilization, depending on how
you look at it, lasted either until about the day before yesterday in the
historical record or endures still today. The last execution of a supposed
witch in what is now the United States was nearly three and a half centuries
ago—while the most recent eugenic homicide conducted under the auspices of
science was probably about three and a half
minutes ago.
Harris
and others of his stripe ought to have the courage of their convictions—and
spare us the mushy-headed nonsense about “spiritual depth” and
“self-transcendence” and “the sacred” and the rest of the New Age goo he so
often traffics in. If we take the scientific view as the controlling view,
setting the limits of reality—and there is an excellent case for doing
so!—then there is no such thing as “spiritual depth” or “spiritual” anything,
because there is no such thing as spirit, only a kind of cultivated emotionalism with its origins in
neurochemical processes. There is no “self-transcendence” because the notion of
transcending the self is in that situation literally meaningless, there being
no state or situation to transcend into or toward—we are in that case talking
about nothing more than moods. There is no such thing as “the sacred,” only the
sentimental. All the psychedelics in the world aren’t going to change
that.
(Take
it from one who knows.)
Harris
argues that the secular world needs to develop better offerings to mark
hallmark events such as marriage and death—but why would there
be such a thing as marriage at all in the rationalists’ ideal world? And why
would we mark death, which should be understood as simply another ordinary
biological event, no more significant than passing gas? Why wouldn’t we instead
try to educate people out of their irrational belief that these events have
some transcendent (that word again!) significance? One suspects that so many of
these crusading atheists turn either to numbing hedonism (Hitchens and his
Johnnie Walker) or to pseudo-spiritual malarkey one step removed from the
horoscope page (Harris and his mindfulness) because they cannot bear to live in
the world they are constructing for themselves. That is intellectual
cowardice.
You
buy the ticket, you take the ride.
And
it’s not like we need some gooey new philosophy to deal with the situation.
Marcus Aurelius had this figured out nearly 2,000 years ago: You live the life you
have, doing your best to do your duty as a reasonable man, then you die, at
which point there will either be an afterlife or annihilation, and, for the
purposes of living your daily life, it doesn’t much matter which, because your
legacy will fade and everybody who ever knew you will be dead soon enough, your
progeny will die out, and your name will be forgotten—so there’s no point in
getting very attached to the world, which isn’t very attached to you. That
isn’t a despairing nihilism, or, at least, it doesn’t have to be—Marcus’
version of Stoicism is a perfectly adequate philosophy for living a decent life
while we’re waiting around for the heat death of the universe. It’s a better
philosophy than the default mental setting of the median meathead meandering
about these fruited plains.
I
did not know Christopher Hitchens (I encountered him only once—in church,
strangely enough), but I read most of his sophomoric writing on religion, and I
never was convinced that he was an atheist at all. It wasn’t that he didn’t
believe in God—it was that he was angry at God and (much more understandably)
at God’s supposed representatives on Earth, like a grown man who never got over
the teenaged boy’s inevitable disappointment in his father, his discovery that
his church is full of imperfect people and at least a few active hypocrites,
the revelation that many of his teachers are just killing time waiting for
their pensions, etc. Hitchens was a brilliant man in many other ways, but, on
that subject, he wasn’t Diogenes—he was a permanent adolescent. Harris, by
contrast, is a guy with one foot in scientism and one foot in the guru
game—and, really, wasn’t one Timothy Leary enough?
(Terence
McKenna in his time already was a redundancy.)
If
you’re an atheist, go be a happy atheist. But shovel that happy horses—t
somewhere else.
Words
About Words
How
do you pronounce “Nazism”?
I
mean: How do you pronounce Nazism?
The
most common way among modern Americans, I think, is: not-zee-is-em, four
syllables. But people who speak a more precise English often say: nots-is-em,
three syllables. For comparison, think about Sufism: Here, most people use the
more correct-sounding (to my ear, I mean) soof-is-em, three syllables,
while some people use the clumsier-sounding soof-ee-is-em, four
syllables.
Nazi
is from the German Nationalsozialist. The “sozialist” part of that word
was sometimes abbreviated “sozi,” understood in context to mean “socialist,”
which gave rise to Nati-Sozi and thence to Nazi. With the -i already
on there, it was natural—to many English writers—to just tack the -sm
onto the end. But that wasn’t the universal
practice, and you can find literature from the 1930s and 1940s that spells the
word Naziism.
The
German is Nazismus, three syllables, not not-zee-is-mus. So I
suppose that argues for the three-syllable version in English.
And
then there’s the vowel issue: Most of us say not-zee, but in recordings
from the 1930s and 1940s, you hear English speakers more often saying nat-zee—or na-zee,
as Winston Churchill did. People who learned the word during the war years
often maintained that pronunciation throughout their lives, which would
sometimes be jarring in conversation.
More
Wordiness
Washington
Post
headline: “Tampa region emerges as epicenter of Florida’s death toll from
Helene.”
To
repeat: The epicenter is not the center; it is the point on the surface
directly above an earthquake’s underground focus, which is why there’s an
epi-, meaning “upon,” on the front of the word. Earthquakes have epicenters; hurricanes do not.
Somebody should tell the Washington Post.
Slate Is Edited by
Dumb People: A Series
A
headline: “Well, This Letter From a Billionaire Conservative Kingmaker Is
Downright Chilling!”
The
article is about Leonard Leo, the head of the Federalist Society. Leo isn’t a
billionaire. He’s a lawyer and a nonprofit executive. One of the organizations
he runs has more than $1 billion in assets that it can use to make grants. Leo
is no more a billionaire than the fellow who administers the endowment at
Princeton is.
Slate’s other take is that
Leo is a scary Catholic
maniac.
(You’ll not be surprised to learn that Opus Dei comes into the conversation.)
Maybe. But the reason his fund has more than $1 billion in assets to do things
with is that it received a very large gift from Barre Seid, a Jewish
businessman from Chicago who also donates to pro-Israel causes, orchestras and
operas, and to the School of the Art Institute of Chicago, among other worthy
causes. Real life isn’t The Da Vinci Code. But taking any meaningful
account of the facts would get in the way of the story being told in this
case.
Journalism:
How does it work?
The
article and headline were corrected after I sent my—what, 54th?—email
to Slate, which sometimes corrects its errors and sometimes tries to
brazen through them, which is always a mistake. I should start billing these
incompetents.
Economics
for English Majors
My
old National Review colleague Rich Lowry writes about the situation
in Springfield, Ohio:
The New York Times reports that
consultations began to take three times as long at the local community health
center. The head of the clinic told the paper, “We lost productivity. We had
huge burnout of staff.” It hired six Haitian Creole speakers, and annual
spending on translation services increased from $43,000 in 2020 to $436,000.
Rich
isn’t wrong about the numbers, but it is worth pointing out that that $436,000
is less than 2 percent of the institution’s budget. So, real money, but not an
extraordinarily heavy institutional lift, either. Scale matters. Proportion
matters.
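For proportion, here is a quick back-of-the-envelope check of what “less than 2 percent” implies about the denominator (an inference from the figures above, not a reported budget number):

$$
\frac{\$436{,}000}{0.02} = \$21{,}800{,}000
$$

That is, for $436,000 to come in under 2 percent of the budget, the health center’s annual budget has to run to at least roughly $22 million.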
In
Other Economic News …
Writing
in That August
Journalistic Institution, Oren Cass has offered up another headache-inducingly
stupid article in defense of industrial policy, in this case insisting that
tariffs—because Donald Trump loves tariffs, and somebody has to act as his
apologist—are a useful policy in that they address an “externality.”
Tariffs address a different externality. The
basic premise is that domestic production has value beyond what market prices
reflect. A corporation deciding whether to close a factory in Ohio and relocate
manufacturing to China, or a consumer deciding whether to stop buying a
made-in-America brand in favor of cheaper imports, will probably not consider
the broader importance of making things in America. To the individual actor,
the logical choice is to do whatever saves the most money. But those individual
decisions add up to collective economic, political, and societal harms. To the
extent that tariffs combat those harms, they accordingly bring collective
benefits.
Cass
is involved in the usual motte-and-bailey stratagem, of course. The question
isn’t “making things in America”—it is making these things in America at
this price. The idiotic chorus of “We don’t make things in America
anymore!” has never been louder—and, incidentally, it has never been more
untrue: In 2023, exports of U.S.-manufactured goods hit an all-time high of
more than $1 trillion; investment in manufacturing
facilities hit an all-time high earlier this year; total manufacturing output is at record
levels.
We
make all sorts of high-end things in the United States—airliners, Teslas,
etc.—and we make a lot of low-tech stuff too: You can buy an American-made
T-shirt from James Perse, if you are so inclined, though it might cost you a
couple hundred bucks. There are other made-in-the-USA
options at a more reasonable price, too. The question isn’t whether American firms
can manufacture a $200 or a $50 T-shirt in the United States; the question is
whether we are going to deprive poor people of the option of buying a $5
T-shirt made in Bangladesh on the theory that Oren Cass’ consumer preferences
should be made mandatory.
More
generally: We aren’t having an argument about whether we should—and certainly
not about whether we can—manufacture things in the United States, a country
that accounts for about 16 percent of all manufacturing on Earth, in wild
disproportion to its share of world population. The question is whether we
should do political favors for lumber companies and microchip manufacturers,
asking American consumers to pay higher prices in order to make profitable
businesses more profitable and to subsidize their access to the U.S. market.
(One reason to manufacture in the United States is to be close to American
workers and American capital; another reason is to be close to American
consumers, which matters more for some products than for others.) Cass’ effort
to intellectually sanctify Trumpist anti-intellectualism notwithstanding, this
isn’t a question of national priorities at all—it is a question of political
favors and cronyism. Donald Trump most of the time is (contrary to his nature)
something approaching honest about that: It is mainly a matter of
favor-trading.
We don’t grow a hell
of a lot of bananas in the United States. And, yet, we are just covered up with
bananas. Plentiful, cheap, delicious bananas. We get those bananas from places
that are better suited to banana cultivation than Idaho is. I don’t suppose Oren
Cass dreams of a world in which his children have the opportunity to
work on an American banana plantation.
If
you blockade someone else’s ports, it is considered an act of war. Cass et al.
would have us blockade our own ports, in order to protect Americans from—what?
Abundance and low prices? The potential loss of jobs working in flip-flop
factories now that we’re losing the flip-flop race to Vietnam or wherever? It
was dumb and dishonest back when the English land barons
(a lot of them literal barons) were trying to protect their markets from the
nefarious French
while hungry Englishmen were going without bread because of high prices.
If
you want to define “externality” that vaguely, you have a case for regulating everything.
Trump et al., being totalitarians at heart, are okay with that. But Americans’
traditional seven-letter answer to Washington busybodies who want to boss them
around, unprintable in this space, is the right one most of the time.
In
Conclusion
I
have written a great deal about actors and acting over the years, and the art
remains a mystery to me. The death of Maggie Smith raised the question in my
mind once more: How does this work? There are a million people, millions
of people, who could have delivered Maggie Smith’s lines in any of her
performances, but no one who could do it quite like her. What was it? It isn’t
really a matter of facial expression, or tone of voice, or the inflection in
the delivery, or anything you could really isolate. We don’t have good words
for whatever it is. It’s probably related to understanding faces, at least in
part, and we don’t have good words for that, either: Think of the face in this
world that is most precious to you, and then think of how you would try to describe
that face to someone who never had seen it. It doesn’t matter how articulate
you are; you couldn’t describe the face well enough that the person would know
what that face actually looks like.
My
own view is that the fundamental human tragedy is that we are all trapped in
our bubbles of radical subjectivity (“thinking of the key, each confirms a
prison”), and the spaces between us are mysterious and dangerous places. That’s
why shared experiences, and the performers who raise them to the state of art,
are so powerful. People who can cross those interpersonal chasms or play in the
spaces between us are rare and sort of magical. Even a very limited actor (say,
a guy like Jason Statham, who can really do only the one thing) or musician
(Hank Williams) has a kind of mysterious quality that is impossible to really
explain. Maggie Smith, of genuinely beloved memory, had an unusual set of gifts
and an admirable commitment to her work—and, fitting for the role for which she
probably is most famous, a touch of magic too.