By Kevin D. Williamson
Monday, January 29, 2024
“Inflation is always and everywhere a monetary
phenomenon.” So said Milton Friedman, and so say his monetarist acolytes. If
you ask a particularly reactionary kind of economist (my kind of economist!),
he or she will tell you that what inflation actually means is
an increase in the money supply, that what we usually mean when we say inflation—a
general increase in prices—isn’t inflation at all, but only a consequence of
inflation. Inflation, from that point of view, means inflating the
money supply. You know—like this:
[Chart: the money supply over time, via the Federal Reserve Bank of St. Louis]
There are five ways in which
deflation is supposed to plunge the world into a spiral of economic
contraction. First, once deflation has started, falling prices will make people
put off spending, causing prices to fall further. Second, with prices falling
and the value of debt fixed in nominal terms, the real indebtedness of
households and firms will grow, acting as a drag on the market, as in Japan
since 1990. Third, nominal interest rates cannot fall below zero because
companies and households always have the choice of holding cash, which gives a
zero return. Banks therefore cannot offer interest rates below zero to
depositors, and so cannot charge negative nominal interest rates on loans. The
demand for loans will fall, shrinking the banking sector and the economy with
it. Fourth, because nominal interest rates cannot turn negative, central banks
will be powerless to offset the effects of deflation. Finally, with prices
falling and nominal interest rates stuck at zero, real interest rates will keep
increasing, turning the deflationary screw.
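That fifth mechanism is just the Fisher identity at work: the real interest rate is, roughly, the nominal rate minus inflation, so a nominal rate pinned at zero plus deepening deflation mechanically yields a rising real rate. A toy illustration (the rates here are made-up round numbers, chosen only to show the mechanics):

```python
# Fisher identity, roughly: real_rate = nominal_rate - inflation.
# With the nominal rate pinned at zero, deflation pushes real rates up.

nominal_rate = 0.0
for inflation in (0.02, 0.00, -0.02, -0.04):  # +2% down to -4% (deflation)
    real_rate = nominal_rate - inflation
    print(f"inflation {inflation:+.0%} -> real rate {real_rate:+.0%}")
# Deeper deflation means a higher real rate: the "deflationary screw."
```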
In fact, all of these supposed
effects either do not matter much, or are the result of inflation being lower
than expected, or happen because institutions have not yet adjusted to a
potentially deflationary world. They are not the inevitable result of falling
prices.
For example, we have experienced
falling nominal prices in computers and telecommunications for decades, and
although we may think twice before buying that new computer, we buy it in the
end. We are not putting off those long-distance phone calls at all. That is
because it is quite difficult to put off the consumption of many services. And
with services accounting for three-quarters of output in many advanced economies, most
activity will be protected from significant delays in purchases.
The Puritan in me rather likes the idea of a monetary
policy that rewards thrift and discourages indebtedness, but it is likely that
the Puritan in me is not a very good economist—he did not do me any good as a
struggling undergraduate, I can tell you that much—and that the moralizing
temptation often leads us into some bad policy postures. And so I generally
take the view that the wiser thing to do is to have the Federal Reserve pursue
price stability with a little bit of inflationary wheel-greasing, i.e.,
to do more or less what the Fed has been doing for the past several decades
with a reasonable degree of success. (Save your emails, please—my definition of
“a reasonable degree of success” accommodates performance that is far, far from
perfection.) My main complaint about the Fed is with its so-called dual
mandate, which marries the pursuit of price stability to the political priority
of full employment. Best to have the Fed do only the one thing and do it
relatively well rather than pursue two sometimes incompatible policy goals and
do so relatively poorly. I am enough of an optimist to believe that it is
possible to have a government (oh, I know, “independent”) agency that does one
thing well but am not superstitious enough to believe in government agencies
that do two things well, for the same reason I don’t believe in unicorns or
little green men or self-financing tax cuts.
A little bit of inflation is tolerable. Everybody is used
to it. As usual, the poison is in the dosage.
If you’d like an update on the rapidly deteriorating
fiscal position of the U.S. government, I highly recommend my friend Jonah
Goldberg’s recent Remnant podcast with
Brian Riedl, a senior fellow at the Manhattan Institute working the
debt-and-deficits beat. “Eat your spinach,” the subheadline says—it is pretty
good spinach. (Larry Kudlow used to refer to me as “an eat-your-spinach guy” on
debt and taxes, and I embrace the ethos.) Riedl is appropriately despairing,
noting that the deficit effectively
doubled in size in fiscal year 2023, that higher interest rates are driving
up the share of today’s spending that goes to financing yesterday’s consumption, that we
already are near the point of debt service consuming $1 out of every $3 in
taxes paid and are on our way to spending 90 cents of every dollar paid in
taxes on debt service, and that the politics are against doing what is needed
to put the federal government on more stable fiscal footing (and, hence, to put
the country on more stable economic footing), which will
necessarily include cuts to entitlement benefits (mainly in Social
Security and Medicare) and higher middle-class taxes. As Riedl says, the
politics of fiscal reform—which is to say, bipartisan cowardice and
stupidity—means that the most likely outcome is that nothing will be done until
the bond market forces action, either by demanding much higher interest rates
on U.S. government debt or by effectively refusing to lend the U.S. government
more money at whatever rates are on offer.
What happens then?
The usual answer given is “monetizing the debt,” which
really means “monetizing the unfunded liabilities.” Caught between two howling
mobs—the bond market on one side and Social Security beneficiaries on the
other—the federal government will have powerful incentives to try to finesse
its way out of the dilemma by making sure both the bond mafia and the oldster
mafia get paid by simply exnihilating money into existence. These being digital
times, Washington won’t even have to fire up the printing presses over at the
Bureau of Engraving and Printing. There are some technical
maneuvers involved but, basically, Uncle Sam can have an extra $100
trillion more or less by declaring that there’s an extra $100 trillion in the
Treasury—by creating money.
Which is to say, by means of inflation—in the older sense
of the word.
Americans have been learning to hate inflation (in the
popular sense of the word, meaning higher prices) after many years of not
thinking about it very much. It is interesting how modest an increase it takes
to get Americans’ attention. In 1984, the inflation rate was a little higher than
what it was for the 12 months ending in December 2023—it was 3.9 percent back
when Ronald Reagan was running for a second term and 3.4 percent as of
December. (Excluding food and energy, the rate in December was 3.9 percent; a
significant decline in gasoline and diesel prices made the overall number a
little better.) Of course, inflation had been
12.5 percent when Reagan was elected in 1980 and 13.3 percent the year before
that, so 3.9 percent inflation smelled like victory. But you can look at a
lot of years in U.S. history when inflation was as high as or higher than it is
today and appreciate that those years are not remembered as being times of
economic crisis—and some of them were pretty good. Inflation was 3.4 percent in
2000, at the height of the boom that sustained Bill Clinton’s political
prospects, and 4.7 percent in 1968, when the economy was growing and Americans
were on their way to landing on the moon. On the other hand, if you look at
such fondly remembered economic times as the Eisenhower years, you’ll see very
low inflation: an annual average of only 1.27 percent from 1952 to 1961.
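If you want to see what that kind of average means in compounded terms, here is a minimal sketch (it assumes, for illustration, that the 1.27 percent figure compounds across the ten calendar years in question):

```python
# What does an annual average of 1.27 percent add up to over a decade?
# Assumes, for illustration, compounding across the ten years 1952-1961.

avg_rate = 0.0127
years = 10

cumulative = (1 + avg_rate) ** years - 1
print(f"Total price-level increase over {years} years: {cumulative:.1%}")
# -> about 13.5%: less over the whole Eisenhower-era decade than the
#    13.3 percent the country saw in the single year of 1979.
```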
So there’s a lot going on when it comes to setting the
economic mood of the country, and inflation is only one factor among many. But
it is an important factor. Inflation during the first three years of Joe
Biden’s presidency (and here, please imagine that I have fully restated my
caveats about presidents
not being magical god-kings who determine economic conditions) ran
more like 5.6 percent. So today’s 3.4 percent is on top of an already inflated
baseline. “Things are getting worse, but the situation is not deteriorating as
rapidly as it was during my first three years in office” isn’t a hugely
confidence-inspiring sales pitch.
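The compounding is the point. A back-of-the-envelope sketch, using the approximate annual rates cited above (5.6 percent for three years, then 3.4 percent for the fourth):

```python
# How far has the price level risen off the January 2021 baseline?
# Uses the approximate annual rates cited above; the rest is arithmetic.

first_three_years = 0.056  # avg. annual inflation, first three years
fourth_year = 0.034        # 12 months ending December 2023

baseline = (1 + first_three_years) ** 3
cumulative = baseline * (1 + fourth_year)

print(f"After three years: +{baseline - 1:.1%}")    # -> +17.8%
print(f"After four years:  +{cumulative - 1:.1%}")  # -> +21.8%
# A "cooler" 3.4% still compounds on top of the already inflated baseline.
```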
Recent sunny headlines notwithstanding, there are some
reasons to suspect that future inflation reports will be worse than the most
recent ones, not better. The difference between the Fed’s usual 2-percent
inflation target and today’s 3.4-percent inflation may not seem like a lot, but
it very well may be enough to cost Joe Biden reelection in November—never mind
that a lot of the inflation that we have seen in the Biden years was baked into
the cake during the Trump administration. As Riedl points
out, if you judge presidents by the new spending that was taken on because
of legislation they signed during their presidencies (as opposed to spending
driven by already established programs or economic fluctuations), then Trump is
the biggest offender in modern history: “President Trump signed legislation and
approved executive actions costing $7.8 trillion over the decade—compared to
$5.0 trillion for President Obama and $6.9 trillion for President Bush, and he
enacted these costs in just a single four-year presidential term, compared to
his predecessors’ eight years in the Oval Office.” Sure, Trump had the COVID-19
pandemic—and Obama had the subprime meltdown, Bush had 9/11, etc. Everybody in
politics has a sad story to tell about why he and his party had to spend so
much money. Events, dear boys, events, etc.
Everybody knew that under a hypothetical future scenario
in which the government attempted to monetize/inflate its way out of a debt
crisis, there would be a political price to pay. But now we have some
experience to suggest that an anti-inflation revolt would pick up steam right
around a (relatively) modest 5-percent to 7-percent level, nothing like what we
might expect in a scenario in which Washington is trying to suffocate its
Social Security, Medicare, and tax problems under a tsunami of rapidly depreciating
U.S. dollars wished into existence. The inflate-away-the-crisis model has
always assumed that monetization would be the path of least political
resistance, but there’s good reason to doubt that it would be. The problem is
that the sensible answer—and, really, it is ultimately the only answer—is a
painful bipartisan compromise that will represent the path of maximal
resistance until political calculation is entirely overtaken by events. Given
the way in which every interest group, ax-grinding society, and populist
demagogue attempts to “fiscalize” its pet issue—I reference here the people who
don’t actually give a fig about balancing the budget but
insist that we’d be well on our way toward solvency if only we’d starve the
welfare malingerers or round up the Mexicans or cut off aid to Ukraine or seize
Jeff Bezos’ assets, etc.—we can probably expect up-ratcheting fiscal pressure
to produce some genuinely imbecilic and dangerous policy responses ranging from
expropriation to attempted autarky. Every cheap demagogue has some kulaks he’d
like to see liquidated as a class.
Depending on how you add up the numbers, the current
unfunded liabilities for the federal government are between four and five times
U.S. GDP. Unfunded obligations for Social Security and Medicare alone now
exceed $600,000 per household. What that means, in a practical sense, is
that these obligations are unlikely to be met. And that’s fine—the whole idea
of entitlement reform is coming up with a plan to go about not meeting
those obligations but doing so in an orderly way. The average
net worth of a U.S. household is just over $1 million (the median is
just under $200,000), and there isn’t an economically or politically practical
way to seize 60 percent of Americans’ net worth to fund two programs. If you
think about it, doing so would be asinine: Seizing the majority of Americans’
wealth to fund one reasonably straightforward income-support program and
one lightly disguised income-support program would be a poor policy in any
case, but in this case, an enormously
disproportionate share of that wealth and a disproportionate
share of the benefits are associated with the same people, the
nation’s wealthiest demographic: oldsters.
Good luck with that!
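If you want to check that arithmetic, a quick sketch using the round figures cited above:

```python
# Per-household unfunded obligations vs. household net worth,
# using the round figures cited in the text above.

liability_per_household = 600_000  # Social Security + Medicare, per household
mean_net_worth = 1_000_000         # average household net worth, roughly
median_net_worth = 200_000         # median household net worth, roughly

print(f"Share of the average household's net worth: "
      f"{liability_per_household / mean_net_worth:.0%}")    # -> 60%
print(f"Multiple of the median household's net worth: "
      f"{liability_per_household / median_net_worth:.0f}x") # -> 3x
```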
In Other News …
In spite of its bad politics and a title that gives me
nausea, the best Clash album was Sandinista! You have to love
a triple album. Those of you who aren’t old enough to remember going into a
record store don’t know the feeling of walking in thinking you’re going to get
a single LP, like London Calling, and then finding out you’re
getting three. It’s surprising at first, but also delightful.
In Other Other News …
For some reason, Amazon really wants me to watch The
Three Musketeers: D’Artagnan. I am a fan of the story (and of Dumas in
general; The Count of Monte Cristo is one of my favorites),
and it’s the kind of movie you don’t have to twist my arm real hard to watch.
But the streaming-algorithm spirits seem particularly insistent.
In Other Other Other News …
Reading about another trio—cousins Jerry Lee Lewis, Jimmy
Swaggart, and Mickey Gilley, three extraordinary men, each in his own right,
who grew up together in Louisiana—I came across these
sentences:
Lewis and Swaggart are double
first cousins. Lewis and Gilley are first cousins. Gilley and Swaggart are
first cousins once removed.
I didn’t come from a large or close family, so cousin
technicalities have never loomed large on my personal radar. Double first
cousins are cousins who have both sets of grandparents in common, as opposed
to one set of
grandparents in common. That’s what you get when two brothers marry two
sisters. Lewis and Swaggart’s relationship was a little more complicated than
that: The Killer’s
father and the preacher’s grandmother were siblings. So, if I understand
how this works, they were double first cousins once removed. “First cousins
once removed” aren’t exactly first cousins—they are a generation away from
being first cousins. So, your father’s first cousins are your first cousins
once removed, as are the children of your first cousins.
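If, like me, you need the mechanics spelled out, the standard genealogical rule is simple: count the generations from each person up to the nearest common ancestor; the smaller count minus one gives the cousin degree, and the difference between the two counts gives the number of removes. A minimal sketch (the function and its names are mine, for illustration):

```python
def cousin_relationship(gens_a: int, gens_b: int) -> str:
    """Name the cousin relationship between two people, given how many
    generations each sits below their nearest common ancestor
    (2 = grandchild, 3 = great-grandchild, and so on). Assumes both
    counts are at least 2, i.e., the pair are cousins of some degree."""
    degree = min(gens_a, gens_b) - 1   # 1 = first cousins, 2 = second, ...
    removed = abs(gens_a - gens_b)     # generation gap between the two
    ordinal = {1: "first", 2: "second", 3: "third"}.get(degree, f"{degree}th")
    if removed == 0:
        return f"{ordinal} cousins"
    times = "once" if removed == 1 else f"{removed} times"
    return f"{ordinal} cousins {times} removed"

print(cousin_relationship(2, 2))  # grandchildren of one couple: first cousins
print(cousin_relationship(3, 2))  # you and your father's first cousin:
                                  # first cousins once removed
```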
Jerry Lee Lewis, Jimmy Swaggart, and Mickey Gilley—what a
trio! Jerry Lee Lewis said the three were more like brothers than cousins. Must
have been something in the water up there. Like his musician cousins, Swaggart
was—among the many other things he was—a more than competent piano
player. Somehow, pianist doesn’t sound right in that context—Jerry Lee
Lewis was a piano player. He also infamously married a cousin—a first cousin
once removed—though the infamy had as much to do with the fact that she was 13 years old and he was 22 at the time of their marriage as with the family tie. She was, incidentally, his third wife, the third of what turned out to be seven, including one case of
bigamy and a seventh wife who had been a sister-in-law to that 13-year-old
third wife. These family trees will give you a headache. But, in line with our
curious theme of threes this week, Mickey Gilley was comparatively tame, with only
three wives.
(Swaggart just has the one wife—he was 17 when they
married, and she was 15—to complain about his escapades. Donald
Trump is a big Jimmy Swaggart fan. Takes one to know one, as they say.)
Economics for English Majors
See above! If that’s not enough for you, you’re going to
have to wait a week.
Words About Words
Do you want to know something that irritates me? (Of
course you do—you’re still reading, aren’t you?) Defensive journalism.
Defensive journalism is the product of a journalistic posture in which the writer, whether a reporter or an opinion columnist, places himself too much at the center of the story and produces a lot of prose intended not to tell the reader a story but to tell the reader how he or she should feel about
the writer. An example of this is Nicholas Kristof’s recent New
York Times essay on a childhood friend whose life went astray
in various bad ways and who committed a brutal crime. There’s a lot of
interesting stuff in the piece:
It was evening at a Circle K
convenience store, and Betty Gerhardt, 23, was working alone at the register.
Bill walked in and asked for some bacon, so she walked over to the cooler and
turned her back on him.
“He grabbed me and pushed me into
the back room,” she recalled, and he hit her over the head with a jar of honey,
knocking her to the ground. “He went for my pants. And when he did that, I
started fighting.”
Bill pulled her pants and
underwear down to her ankles, she said. Fearing she was going to be raped, she
struggled back furiously, even as Bill grabbed empty bottles stacked nearby and
smashed them over her head and body. The glass cut her badly—she had scars on
her forehead and arm from the attack—and he left her bloody and unconscious on
the floor.
After failing to break into the
cash register, Bill walked out of the store and drove off with his girlfriend.
Gerhardt awoke and called 911, and an ambulance rushed her to a hospital, where
she remained for three days. The police promptly caught Bill, still covered in
Gerhardt’s blood.
Deeply ashamed of what he had
done, Bill always claimed to me that he had been so high on meth, cocaine, and
alcohol that he was in “a stupor,” as he put it. “I blacked out,” he told me.
Arresting material. But there’s also a ton of “please
don’t think poorly of me for writing sympathetically about a man who did a bad
thing” prose in the piece, paragraphs of it:
Frankly, in writing this essay, I
worry that sharing details of this crime will leave the impression that this
horrific action represented all of who Bill was. He had another side full of
humor, warmth and eagerness to help others. Forgive me, Bill—for nobody should
be remembered for the worst thing he ever did.
I also fear that some readers may
believe that I’m minimizing a brutal assault, or will be perplexed that I
remained friends with a violent drug dealer who in many ways destroyed a young
woman’s life. I make no excuses for Bill or his actions. But one thing I’ve
learned in a lifetime of reporting is that humans contain multitudes, and in
this case I hope we might learn from Bill’s troubled journey how trauma
self-replicates: When we let so many Americans fall behind, they not only
suffer greatly but also inflict great suffering on others.
I get why some writers do this. The world is full of
intellectually dishonest people with axes to grind. (Ask me how I know.) A few
years ago, my friend Charles C. W. Cooke wrote a piece about a now-infamous
essay in which a man wrote about struggling with his sexual attraction to
children, an attraction he described as:
a preference for a group of people
who are legally, morally and psychologically unable to reciprocate my feelings
and desires. It’s a curse of the first order, a completely unworkable
sexuality, and it’s mine. Who am I? Nice to meet you. My name is Todd Nickerson,
and I’m a pedophile. Does that surprise you? Yeah, not many of us are willing
to share our story, for good reason. To confess a sexual attraction to children
is to lay claim to the most reviled status on the planet, one that effectively
ends any chance you have of living a normal life. Yet, I’m not the monster
you think me to be. I’ve never touched a child sexually in my life and
never will, nor do I use child pornography.
In response, Charles wrote:
How should we
treat such a person’s decision to talk about his affliction in public?
Honestly, I have no idea. Social taboos are important, of course. But
I do know this: Unless you believe that people “choose” to become
pedophiles—and I don’t—the author seems to be doing exactly what
he should be doing given his condition: Namely, a) accepting that he has an
unimaginably serious problem, and b) doing his utmost to refrain from acting
upon it. I am not a practicing Christian, but, as far as I can recall from my
instruction as a child, the author is taking precisely the approach that
Christians are supposed to take when they find themselves tempted toward sin. I
suppose that it is possible that I am seriously mis-remembering the core
tenets of the faith, but don’t followers of Jesus believe that everybody is
born with impulses that lead them toward unacceptable behavior? And don’t
they also believe that they are called to act chastely—that
is, to avoid indulging those impulses and instead to seek a way to be freed
from them? It was a while ago, I accept, but I cannot recollect any
caveats being attached to these rules. Are we now to suppose that it does not
apply when the propensity in question is sufficiently egregious? Is there
a new-fangled carve-out for instincts that turn our stomach? If there
is not, we might think twice before condemning a man for admitting he has a
terrible, terrible problem—even if we can’t move ourselves far enough in
the opposite direction to “understand,” to “support,” or to like him much at
all (and I can’t).
(I’d link to the whole thing, but I can’t find a link
that works. Don’t blame me!)
As you might expect, Charles was met by a barrage of
imbecilic and dishonest accusations that he was “defending pedophiles” or
attempting to “normalize pedophilia.” Which is, of course, absurd, and too
stupid really to remark on beyond observing the stupidity of the claim and the
dishonesty of the people who made it.
Sometimes, there is something to say about a bad thing
other than “Bad thing is bad.” Nobody thinks that Nicholas Kristof is in favor
of brutally assaulting store clerks or that he is indifferent to the victims of
violent crime. You can dislike his writing or his politics—you can dislike the
man himself—without talking yourself into believing something that stupid. And,
yet, he feels, probably with good reason, that he has to append several
paragraphs of defensive addenda to his column, lest he suffer—what? I don’t
know. The wrath of two dozen illiterate rage-monkeys on social media, I
suppose.
The digital age is a time of democratization and
leveling, and the populist spirit has reached even into the pages of the New
York Times. I suppose that has had some good effects, though I can’t think
of any off the top of my head, but it has robbed us of something very useful,
something morally and intellectually necessary for the columnist to have in his
arsenal: contempt for the contemptible.
The world is full of contemptible people. The media and
political world are especially full of contemptible people—Bethany
Mandel exists, and so does Batya Ungar-Sargon, to cite a
couple of personally relevant examples of the sort of thing I’m writing about.
We don’t have to let them dictate the rules of how we write and talk—and we
shouldn’t.
In Closing
The first ultrasound of a pregnancy is always memorable.
It is usually a happy occasion, but it also is an anxiety-causing one: If you
are going to get bad news, an ultrasound is probably when you are going to get
it. And I am the sort of person who is always at least half-expecting bad news.
If I were ranking conversations I’ve had in my life by how much they surprised
me, first place would be one that went something like this:
“There’s a baby and a heartbeat!”
Relief. Gratitude. Big exhale.
“And there’s another.”
Big inhale.
“And—hold on a second, let me get the doctor.”
Brief terror.
“Doctor, are you seeing what I’m seeing?”
“Huh. Well, look at that.”
There’s a whole lot more to the story, which I’ll get
into in the future, but we have just welcomed identical triplet boys into the
world, meaning our little son has three new brothers and Pancake has a whole
new raft of problems tugging at her tail. Triplet pregnancies are complicated
(ours a bit more so than usual—again, more on the story at some future date)
and the boys will be in the hospital for some time, but the little ones and
their indomitable mother are doing well.
So, our cup runneth over. Also the spare bedroom.
Newsletters may be a little irregular in the coming
weeks.