A Tight Labor Market Is Good for Workers

By Kevin D. Williamson

Thursday, July 29, 2021

 

A fast-food CEO calls it a “total nightmare.” Some economists are calling it The Great Resignation. Best just to call it what it is: a tight labor market — and three cheers for it.

 

In late July, the United States had about 9.5 million unemployed people and about the same number of unfilled jobs. The unemployment rate was under 6 percent in the most recent survey, so one way of thinking about those 9.5 million unemployed workers and 9.5 million unemployed jobs is as churn at the margins.

 

That churn has been intensified by a tight little knot of interrelated factors.

 

The first of these is the partial liberation of workers who had been hunkered down during the COVID-19 crisis. Unemployment had hit almost 15 percent in the depths of the lockdowns, which made that a rotten time to be looking for a new job. On top of that, half of all Americans are in employer-based health-insurance programs — unemployment means losing not only income but also, in many cases, health insurance, with many unemployed workers lacking the means to pay for COBRA coverage. The fearsome prospect of losing both income and health insurance during an epidemic was enough to keep many dissatisfied workers in place rather than on the hunt for a new job — but that is no longer the case. Before the epidemic, the unemployment rate had been as low as 3.5 percent, and even if the labor market does not return to its pre-COVID situation, it has moved substantially in that direction, meaning that it is, for many workers, a seller’s market. There has been a great deal of disruption, which creates, in turn, a great deal of opportunity for in-demand workers. No one should expect them to sit still.

 

A second and related factor is the partial reversal of work-from-home arrangements in many sectors. Some businesses are better suited than others to having a workforce that does not report to an office every day, and many are eager to get their workers back at their desks. But the workers themselves are in many cases less eager. Many workers felt like working from home had given them, in effect, a raise: A worker with an hour-long commute has to invest ten hours in an eight-hour workday, whereas the same worker doing his job remotely can commit less time to the job even if he is doing more hours of actual work every week. The flexibility of being at home can provide enormous economic benefits, too, offering potential savings on everything from meals to child care. Some of these workers are demonstrating the limit of the theory of “sticky” nominal wages: A higher number on the paycheck does not necessarily translate into higher real wages once commuting and the expenses of being away from home all day are taken into account. A $20-an-hour job is not an improvement on a $17-an-hour job if, in order to take it, you have to pony up $10 an hour for child care — which, if we credit industry estimates, would be a bargain rate.

 

A third factor is that the very tight labor market leading up to the COVID-19 crisis already had some workers ready to demand more out of their employers, and those same workers surely are informed at least in part by the situation of workers in firms that thrived during the emergency. At the end of 2020, Amazon was hiring almost 3,000 workers a day. Right now, Amazon is advertising $20-an-hour jobs on these terms: “Benefits start when you do. No experience required. No résumé.” UPS has drivers earning more than $90,000 a year. Not everybody is in a position to take one of those jobs, but such positions are part of the scene, helping to give at least some workers — including those without much in the way of skill or education — the confidence not to take the first paying post that comes along.

 

So, yeah, Johnny Rockets is having a hard time of it.

 

Andy Wiederhorn, CEO of FAT Brands (corporate parent of Johnny Rockets and Fatburger), went on Fox Business in late July and described the labor situation facing his growing restaurant chain as a “total nightmare.” The existing restaurants are reasonably well staffed up, he says, but new locations are having a hard time getting help.

 

If only there were some time-tested strategy for attracting new workers, one that is supported by standard economic theory and that has proven itself effective in many different industries over centuries! President Joe Biden, asked about the shortage of service-sector workers, stage-whispered, “pay them more.” And though it is a little more complicated than that, the president is basically right: If your burger joint’s business model doesn’t work at prevailing labor costs, then your burger joint’s business model doesn’t work.

 

Wiederhorn is Exhibit A for the proverb that the problem with capitalism is capitalists: He once got paid $4.6 million over the course of 16 months he spent doing time in federal lockup for his role in what Forbes describes as “one of the worst pension frauds ever committed by an investment adviser.” His personal fortune is well north of $100 million, and he’s the kind of epic dingleberry who names his mansions (“The Ivy”) and boasted about spending $15,000 on a mirror in which to admire himself.

This is the doofus who is complaining that it is hard to find a reliable fry-guy — you want to know why your kids love Bernie Sanders? This is why your kids love Bernie Sanders.

 

Other executives seem to have received the message the market is sending. Beyond the obvious COVID winners (Amazon, FedEx, UPS, etc.), other companies representing several dissimilar sectors have raised their wages in the past year or so: Under Armour, Walmart, banks including Wells Fargo and Bank of America, fast-food chains including Chipotle and McDonald’s, Costco, Ikea.

 

Conservatives used to say that a job is the best social-welfare program. And it is. And there is nothing quite like a tight labor market for increasing labor’s share of income.

 

What should concern policy-makers is the considerable share of unemployment that is long-term and results from persistent mismatches between the skills of workers on the supply side of the labor market and the requirements of the demand side. The hardest jobs to fill in our economy are not positions for AI researchers, mechanical engineers, or intellectual-property lawyers, but jobs for home health workers, truck drivers, warehouse workers, and management positions in food and hospitality. Another way of saying that is: Many of the people you’d want to hire as store managers for Fatburger have better options than being a store manager at Fatburger. As anyone who has worked in fast food knows, this is a longstanding challenge. But there is a considerable population of unemployed people out there who could, with the right incentives and preparation, fill some of those open roles. Many of these jobs fall into the gap between the positions that require a college degree and professional preparation and those that are truly entry level. You don’t necessarily need a degree or a certification to be a warehouse manager, but it isn’t a job you can just walk into, either. The same thing is true of retail and hospitality managers.

 

Another factor that should always be on our labor radar is that higher wages create incentives for firms to invest in automation and other labor substitutes. That isn’t necessarily something to worry about, but it is a reality to be taken into account. The good news for workers on that front is that the COVID-19 lockdowns illustrated in one of the most dramatic ways possible that dining and shopping are intensely social activities for which highly automated and technologically mediated experiences are a poor substitute. No doubt Jeff Bezos et al. will figure out a way to have the full menu of the Ritz Paris on all of our doorsteps with ten minutes’ notice, but many of us will still prefer going out for burgers with friends. For the past 15 years or so, the jobs that were considered the most secure for workers were those that were not readily subject to automation or offshoring: education, health care, government work, etc. We have seen some automation in fast-food restaurants, with customers ordering electronically at a kiosk at McDonald’s or placing their orders in advance via app at Starbucks, but it seems likely that this sort of thing will have only a limited appeal for customers who expect service with a human face.

 

What can we do for workers right now? The best thing we can do is to let markets work in their favor, which they are doing: The first quarter of 2021 saw the strongest wage growth in 20 years. And the market remains tight even with the winding down of COVID-era extended unemployment benefits. Beyond doing no harm, we have long-term policy failures to deal with, beginning with our dysfunctional K–12 education system and a federal government that somehow cannot manage to keep illegal immigrants out of the labor market. The wage effects of illegal-immigrant workers are felt most strongly at the low-paid end of the market, especially among workers who are recent legal immigrants.

 

And if this means that it is a more competitive world out there for restaurant owners looking to hire fry cooks, there’s a cure for that: Try money. It’s a simple enough solution that even Joe Biden has figured it out. 

J. D. Vance and the American Dream

By Matthew Continetti

Saturday, July 31, 2021

 

The author and venture capitalist J. D. Vance was a prominent voice on the national-populist right even before July 1, when he entered the crowded primary to replace GOP senator Rob Portman of Ohio. In a speech to the 2019 National Conservatism Conference in Washington, D.C., in appearances on Tucker Carlson Tonight, and in his active Twitter feed, Vance has promoted a “realignment” of conservatism away from libertarianism and toward an agenda that uses government to defend traditional values and improve living conditions for the non-college-educated voters at the base of the GOP.

 

Vance is a leader within that faction of the Right which says the conservative movement’s emphasis on individual freedom, and its commitment to the classical liberal procedures and “norms” of constitutional government, are responsible for its apparent failure to preserve the nuclear family, and for its exclusion from mainstream institutions. He is a pacesetter for this trend, which drew energy from Donald Trump’s victory in 2016. And because Vance represents one possible future for the American Right, I was eager to read the transcript of a speech he gave last weekend to the Intercollegiate Studies Institute’s “Future of American Political Economy” conference in Alexandria, Va. There is no doubting Vance’s smarts — he graduated from Yale Law School in 2013 — or his communication skills. But his text left me with questions.

 

Vance’s subject was the “American dream.” This is an infamously nebulous concept. Does the American dream refer to a process — the social mobility that allows the adopted son of an immigrant to fly into space on his own rocket? Or does it signify an end-state — the single-family home with a white picket fence in the cul-de-sac occupied by two parents, 2.5 children, and a dog and cat? No one really knows. For Vance, the American dream “is about a good life in your own country.” But it is also about being “a good husband and a good father,” who is “able to provide my kids the things that I didn’t have when I was growing up.” It’s a dream that Vance has achieved.

 

Then Vance contrasts his dream with another dream, a bad dream, the “dream of Mitt Romney.” This American dream, apparently espoused by “establishment Republican politicians,” is a dream of “private jets,” “fancy businesses,” and “a lot of money.” Such an emphasis on material wealth, Vance says, makes most people’s “eyes sort of glaze over.” After all, most people aren’t rich. Most people just “want to live a good life in their own country,” with their spouse and children.

 

Vance must not be on Mitt Romney’s Christmas card list. Last I checked, the former Republican presidential nominee and current GOP senator from Utah has been married to his wife Ann for over half a century, and has five sons and countless grandchildren. Whatever your disagreements with him — and I have a few — Mitt Romney is a decent, patriotic, and accomplished gentleman who unquestionably has lived “a good life” in his “own country.” Yes, he is quite wealthy. He owns a number of homes. One of them had a car elevator. But it’s not as though Romney made his affluence the basis of his claim to high office.

 

On the contrary: It was former president Trump who grounded his appeal in 2016 on his “private jets,” “fancy businesses,” television celebrity, and considerable fortune. It was former president Trump who took kids for rides on his helicopter during the 2015 Iowa State Fair, who turned a campaign press conference at Mar-a-Lago into an infomercial for various Trump-branded products, and whose personal life, let us say, could not be more unlike Mitt Romney’s. Yet Vance casts Romney as the bogeyman in this contest of American dreams, and says he regrets voting for someone other than Trump in 2016. What gives? Not only did I end this section of the speech without a clear idea of what the American dream is or who best represents it, I was left wondering what factor other than his opposition to Trump actually prevents Romney from meeting the criteria that Vance sets out.

 

Vance says that “to live a good life in your own country, you have to actually feel respected. And you have to be able to teach your children to honor and love the things that you were taught to love.” No problem there; I couldn’t agree more. The danger of the culture wars, he goes on, is that the Left will force Americans into a posture of regret and shame over their history. The Left imposes costs on individuals — deplatforming, ostracization, cancellation — to police retrograde thought and behavior. “That is what the culture war is about.” And he’s right.

 

Then Vance says that because the only institution conservatives control, on occasion, is government, we ought to use political power to impose costs of our own on “woke capital,” “woke corporations,” and academia. Vance neglects to mention the various counter-institutions that the conservative movement has built since World War II to address the problem he describes. Nor does he explain, exactly, how “breaking up the big technology oligarchy” would help men and women like his Mamaw. Even so, the idea that conservatives should use policy to further their conception of the public good is something of a truism. Everybody thinks they are furthering the good. The question, as always, is the means we employ to that end, and whether those means actually work. Government bureaucracy and regulation, for example, are not known for their contribution to human well-being (see: Centers for Disease Control). No matter who’s in charge.

 

At this point, however, Vance makes another statement that left me befuddled. “I’m going to get in trouble for this,” he says, but he goes ahead anyway and asks, “Why have we let the Democrat Party become controlled by people who don’t have children?” Now, he acknowledges, somewhat, that what he is saying is not strictly true: Joe Biden, Chuck Schumer, and Nancy Pelosi all have kids, and Biden, Schumer, and Pelosi control the Democratic Party and, at present, the national political agenda. Nevertheless, Vance name-checks Kamala Harris (who has two stepchildren), Pete Buttigieg (who, according to the Washington Post, is trying to adopt), Cory Booker, and Alexandria Ocasio-Cortez (who’s 31 years old). Vance understands, he says, that “there have always been people” who, “even though they would like to have kids, are unable to have them.” He has no problem with this population, he hastens to add, though he never stops to ask whether any of the four Democrats he singled out fall into it.

 

What bothers Vance is “a political movement, invested theoretically in the future of this country, when not a single one of them actually has any physical commitment to the future of this country.” He says, without supplying any evidence, that the reason the media are “so miserable and unhappy” is that “they don’t have any kids.” The collapse in American fertility, he goes on, is a crisis “because it doesn’t give our leaders enough of an investment in the future of their country.”

 

I agree that the decline in American birth rates is troubling, that “babies are good,” and that raising children is an indescribably worthwhile, utterly exhausting, and often infuriating experience (I have two). Children join us in that intergenerational compact which Edmund Burke described as the essence of traditionalist conservatism. No kids, no future.

 

But you know who else doesn’t have children? A lot of conservatives and Republicans. Maybe they can’t have them, maybe they’ll adopt, or maybe life just brought them to a different place. That doesn’t reduce their dignity as human beings one iota, or their potential to contribute to America’s public life. And that goes for Democrats and independents, too.

 

William Rusher, the longtime publisher of National Review, never had children. Does his contribution to American politics count for less? Condoleezza Rice doesn’t have kids. Did that stop her from serving her country for eight years as national-security adviser and secretary of state? Lindsey Graham has no children. Has that prevented him from unswerving loyalty to President Trump? Pat Buchanan is childless — yet he formulated the arguments that define so much of national populism today.

 

Indeed, until a few years ago, the 53-year-old billionaire who donated $10 million to Vance’s super PAC had no kids. Should his contributions to political candidates and philanthropic causes during that time be retroactively judged suspect? The assertion that parenthood is somehow a prerequisite for effective statesmanship is nonsensical. It’s also insulting. Great parents can make terrible leaders — and great leaders are often terrible parents.

 

Vance says that the “civilizational crisis” of declining fertility requires providing additional “resources to parents who tell us the only reason they’re not having kids is because they can’t afford it.” How should we do this? “We can debate the policy details.” But the only specific proposals Vance mentions are Hungarian prime minister Viktor Orban’s subsidized loans to married couples who promise to have kids, and the completely fantastical idea of Demeny voting, whereby parents vote on behalf of their children. What he doesn’t mention, as one of those sullen, devious, childless journalists pointed out, was either the child tax credit the Biden administration is sending to families as we speak, or the various other child credit plans advanced by Senate Republicans, including — wait for it — Mitt Romney.

 

How can it be that the same “establishment Republican” who represents such an unattractive version of the American dream also wants to make life easier for the working families in whose name Vance speaks? And while I am asking questions, what evidence is there that government spending can arrest, not to say reverse, a demographic process hundreds of years in the making? What special clarity and insight into the workings of politics do parents possess, and on what basis shall we implement the radical ideas that a Hungarian demographer came up with 35 years ago? What does the substance of Vance’s remarks actually have to do with the everyday concerns of Ohio Republicans? I found it noteworthy, for example, that immigration, crime, and “election integrity” don’t come up until the final paragraphs of Vance’s remarks. The word “inflation” does not appear at all.

 

Such is the confusion that arises when a movement anchors itself to the personality of one former president, when a movement neglects the principles of political and economic freedom that guided it for so many years. It seems to me that for national populism to have a viable future, it needs to avoid straw men, see its political antagonists not as alien enemies but as fellow Americans, concentrate on the issues voters care about, and clarify its thinking on the relation of economics and culture. Can J. D. Vance accomplish this formidable task? He has until primary day — May 3 — to try.

Bill de Blasio and the Decline of New York City

By John Podhoretz

Thursday, July 29, 2021

 

New York City is shrinking. Or rather: It was shrinking. Quite a while ago. Then it started to grow. Then it grew dramatically. But after eight years of Bill de Blasio as mayor, it is contracting once again, as the economic and population surge that took the city from the slough of despond to new heights over the course of four decades has been reversed. This is not the result of COVID. It is the result of a disastrous mayoralty and the ideas, prejudices, and idiocies that have animated it. De Blasio’s legacy as he prepares to leave office is just that: a city in decline.

 

Bill de Blasio has governed with a potent mix of old and new — the bad old and the horrible new. He has pushed wretched new ideas that have blighted the education system and poisoned the streetscape. And he has revivified incompetent policies driven by ideological priors — ideas so long discredited that their failure had been forgotten and had to be experienced yet again by young New Yorkers who weren’t alive when the city was nearly destroyed by them and were therefore unable to heed the warnings of those of us who did live through their nightmarish implementation.

 

To tell the story of de Blasio’s New York, we need to go back to the city’s great devastation.

 

In 1970, 7.9 million people lived in New York City. Ten years later, that number had dropped by a staggering 800,000. Over the course of the ’70s, residents voted with their feet and got the hell out of Dodge — fleeing an increasingly lawless and chaotic municipality whose feckless authorities stood by and let the place fester and rot.

 

This unprecedented depopulation was the consequence of a budgetary free fall that led the city to the verge of bankruptcy in 1975 — a managerial catastrophe that wreaked havoc on garbage collection, public safety, schooling, even on the grass in its parks. Its leaders, Nathan Glazer once quipped, stopped doing the things they knew how to do (like picking up the garbage) and started trying to do things no one knows how to do (like ending poverty). The expansion of social-welfare programs came at the expense of the prosaic quotidian tasks necessary if any city is to be livable.

 

Here’s just one example. In his book The Fires, Joe Flood tells the story of how Mayor John V. Lindsay (whose time in office ran from 1966 to 1973) sought to redirect city money so that he could spend it on social programs. He hired the RAND Corporation to study the city’s fire department: “NYC-RAND’s goal was nothing less than a new way of administering cities: use the mathematical brilliance of the computer modelers and systems analysts who had revolutionized military strategy to turn Gotham’s corrupt, insular and unresponsive bureaucracy into a streamlined, non-partisan technocracy.”

 

Using RAND’s efficiency experts and their findings as fodder and justification, Lindsay’s people closed dozens of fire stations because of supposed redundancies. Meanwhile, the department’s inspectors stopped ensuring the good working order of the city’s hydrants. The result: Enormous swaths of the Bronx burned down in the 1970s because there were no nearby fire trucks to put out the fires and no water in the hydrants when they did show up.

 

The staggeringly dark popular-culture portrayals of New York in the 1970s — Death Wish, Taxi Driver — didn’t feel excessive. They felt like documentaries. In 1974’s The Taking of Pelham One Two Three, subway hijackers demand $1 million for the safe return of their hostages. “This city doesn’t have a million dollars!” shouts the mayor. It was a joke, but it was no joke.

 

New York pulled itself out of its trough in part by electing, in 1977, Ed Koch, a mayor who rallied the city’s animal spirits over the course of a twelve-year tenure, and in part owing to the explosion in Wall Street transactions set off by the bull market that began in 1983. The once-empty city coffers filled up again, and so did Gotham itself. More than 7.3 million people were counted in the 1990 census, a 3 percent increase over ten years.

 

* * *

 

Two astonishing changes altered the city’s fortunes. First, the three-decade crime surge was ended and a historic drop in crime rates eventuated, with previously unimaginable attendant benefits. Streets and parks and mass transit became safe again. And the streets that had come to look dull, old, and dirty instead came to sparkle with life and beauty and cleanliness. Second, the roaring economy turned Wall Street and the wealth it generated into the city’s ATM. In 1982, the city collected $1.3 billion in personal income taxes. By 2020, that number was $13.76 billion — a staggering tenfold increase, far exceeding the rate of national GDP growth. The city’s budget in 1982 was $15.6 billion. In 2020: $94.3 billion. Both budgets were balanced.

 

Between 1990 and 2000, New York’s population grew to 8 million — a 10 percent increase that brought the number back to 1970 levels plus a bit more for good measure. This was not supposed to happen. When cities start heading downhill, they are usually like boulders. St. Louis was the fourth-largest city in America in 1900, with nearly 600,000 people. Between 1950 and the present day, it lost more than 60 percent of its population and today is home to fewer than 300,000. That’s what normally happens when cities go into decline. New York’s recovery was historically unprecedented.

 

And that recovery built on itself. The New York renaissance that began with the election of Rudy Giuliani in 1993 and continued through the end of Michael Bloomberg’s three-term mayoralty two decades later was literally that. When babies were born in New York City, parents stayed if they could afford to instead of fleeing because they had to. An aging city suddenly turned younger. Areas that had lain fallow were reborn. Despite, or perhaps because of, the 9/11 attacks and the surge of citizen pride that accompanied them, the 2000s saw still more people flowing into the five boroughs. Half a million souls, to be exact; the population rose to 8.5 million.

 

New York might have been the most expensive city in the country to live in, a bastion of income inequality and every other sin of wealth that the young people who flocked to hipster Brooklyn to eat food twice as expensive as it would have been anywhere else found time to worry over on their blogs — but as their presence indicated, it was the place to be.

 

* * *

 

And what of the decade dominated by Bill de Blasio’s eight years as mayor? When he leaves office in January 2022, the population of New York City will likely be around 8.25 million. He will not only leave office with the city in far worse shape than it was when he became its chief executive in 2014; he is the key cause of its renewed depopulation.

 

By almost every conceivable benchmark, even his own, de Blasio has failed. Take crime. Critics predicted that under his leadership, crime would skyrocket, and for a while it looked like we would have to eat our words as the crime rate continued to fall. In July 2019, de Blasio announced with great fanfare that the city had booked 40,000 fewer miscreants into jails that year than in the year he took office.

 

“The safest big city in America is ending the era of mass incarceration,” he said proudly. “For decades, we’ve been told we can only arrest and imprison our way to a safer city. Under my administration, New York City has proven that’s not true. Instead, we can keep fathers at home and kids in school and get even safer.”

 

By the end of 2019, the murder rate had risen by 7 percent, with other violent crimes also increasing at a comparably modest rate. Then, in 2020, everything went south. Shootings increased by 97 percent (that is not a typo), the homicide rate by 44 percent, the burglary rate by 42 percent, and the number of car thefts by 67 percent.

 

A year into his mayoralty, the city found itself awash in street dwellers, many of the newer indigents apparent opioid addicts who had moved into the city because it was an easy place to panhandle and because word had gone out that vagrancy would be tolerated. De Blasio accused the everyday New Yorkers who complained about the piles of garbage on Broadway and elsewhere of “fearmongering,” even as he increased spending on homelessness.

 

As usual, when you subsidize something, you get more of it — and in 2020, nearly 21,000 individuals were sleeping nightly in public shelters, an all-time high. When the vagrants are not in the shelters, they’re on the streets, sleeping or raging or rampaging, degrading the daily life of the city’s working residents and their children.

 

Education is de Blasio’s greatest shame. He has spent his mayoralty consumed with the notion of making “equity” the signature issue in the city’s public schools. He began by waging a war on charter schools, a war that was in part personal: While serving on the city council, he had had a long-running feud with Eva Moskowitz, who left politics to start and run the stunningly successful Success Academy system, and he wanted to destroy her. But he also loathes the notion that competition is the only way to improve public schools and is offended by the results that charters like Success Academy have shown.

 

He was prevented from running the Success Academy charters out of business by Governor Andrew Cuomo, whose psychopathic rage against any politician near his ambit who gets press attention led him to go for de Blasio’s jugular on this issue. But de Blasio has continued to do everything in his power to assert the primacy of equity over excellence and leveling over achievement. Though he poses as a tribune of the poor, his efforts to destroy both the city’s gifted-and-talented programs and the existence of eight selective high schools to which students gain entry by taking a single test have been a poisoned dagger aimed at the heart of one of the city’s least affluent groups: working-class Asian immigrants who push their children hard to excel in school so they can rise out of their struggling circumstances.

 

Whatever the damage done to the city by the pandemic, and it was substantial, it was nothing next to the depredations of Bill de Blasio.

 

* * *

 

That we would reach the end of de Blasio’s years in office this way was sadly predictable from the way he began his tenure in 2014. From the outset we were told, by de Blasio and by his fans on the left, that this was to be no ordinary mayoralty. There were ecstatic levels of expectation that de Blasio could, should, and would transform American politics at the national level. Bob Master, a union official, put it this way in the pages of The Nation just days before the mayor took office: “De Blasio will have an opportunity to chart an entirely new direction for municipal social and economic policy — forging policies explicitly designed to intervene in the economy and make it work better for the millions left behind during forty years of trickle-down.”

 

That was quite the messianic endorsement for a local politician of no particular distinction who found his footing amid a shockingly uninteresting and sedate 2013 Democratic primary field once the leading candidate, Anthony Weiner, proved psychotically unable yet again to resist the surpassing temptation to send naked pictures of his junk to women he had encountered on social media. He did so in part because the second major candidate besides Weiner, Christine Quinn, came under mysterious attack from a heavily funded super PAC whose sole issue was (I’m not kidding) ending horse-carriage rides in Central Park. De Blasio won an overwhelming victory, without question — but nobody voted. Democratic primary turnout was a staggeringly low 24 percent. Only 17 percent of eligible voters citywide participated in the November general. (He got 35,000 fewer votes in his 2017 reelection than he received in his 2013 win.)

 

Nonetheless, Master saw fit to declare, “His success or failure will have national ramifications.” And you know what? Maybe it did. It was certainly his own record that led potential 2020 Democratic presidential-primary voters to laugh de Blasio out of the race four months before a single vote was cast. And it may in part have been the example of what unbound progressive politics could do to a place like New York City that led ordinary Democratic voters in the 2020 primaries to leap into the arms of Joe Biden. New York was the most salient example of a local governmental approach that might be attempted on a national level by Bernie Sanders or Elizabeth Warren. Biden quickly became the only serious choice for Democratic voters alienated by the ambitions of the progressive wing of the Democratic Party whose tribune de Blasio had promised to be when he took office.

 

The Nation’s excited anticipation of de Blasio’s regime was understandable, since he had already begun to speak about himself and his goals for his office in ways designed to make any good American leftist think he had died and gone to Cuba. On this point, and not to digress, but I can’t help it: As a young man de Blasio himself had been an ardent supporter of the Cuban puppet regime in Nicaragua, delivering food and medicine in 1988 to the soldiers who, on behalf of the Sandinistas, fought anti-Communist rebels. Then, in 2019, speaking to striking Hispanic workers in Florida while running for president, de Blasio shouted, “Hasta la victoria, siempre!” — a Che Guevara slogan. In Miami.

 

Befitting The Nation’s hopes, de Blasio’s inaugural speech set an astonishingly grandiose tone. “We recognize a city government’s first responsibilities: to keep our neighborhoods safe; to keep our streets clean,” he said. “But we know that our mission reaches deeper. We are called to put an end to economic and social inequalities that threaten to unravel the city we love.”

 

Remember I told you that one of the most important issues in the 2013 primary was horse-carriage rides? Recall as well that turnout was astoundingly low. Add to this the fact that de Blasio’s key campaign theme had been ending the NYPD’s gun-seizure policy of stop-and-frisk. The moment that changed his fortunes was the release of a blockbuster commercial about police interactions with young black men starring de Blasio’s biracial teenage son, Dante. These are the things that really mattered when it came to getting him elected. But then, like the rebel in Woody Allen’s Bananas who announced, after becoming the president of his banana republic, that the national language would now be Swedish, Mayor de Blasio told the city and the world that he was going to use the powers of his office to . . . end inequality.

 

And not only that. “Today,” he boomed, “we commit to a new progressive direction in New York. And that same progressive impulse has written our city’s history. It’s in our DNA.” De Blasio loves to speak about New York City in this way. “New York has always been the center of progressive America,” he said on his hundredth day in office. “We weren’t sent to City Hall to change New York’s character. You sent us here to restore New York’s proud legacy as the progressive city.”

 

New York has never been the “center of progressive America” and has no “proud legacy as the progressive city.” It was Chicago in the late 19th century that pioneered the kind of early union activism that de Blasio likes to lionize. Later, in the opening decades of the 20th century, it was the state of Wisconsin, not New York City, that offered itself up as the working model for progressive governance. And for good measure, most leftist thinking since World War II has been a product and by-product of universities and university towns.

 

* * *

 

New York is not easily categorizable. It is a solidly Democratic city that went 75 percent for Clinton in 1996, and nearly 80 percent for Gore in 2000 and Obama in 2008 — all during years in which its citizens were happily governed by popular Republicans. Since the consolidation in 1898 of the five boroughs into the megalopolis we know today, the city’s leadership has seesawed between machine politicians (particularly Tammany Hall’s), hapless reformists nauseated by machine corruption, and skilled maneuverers (like Fiorello La Guardia and Robert Wagner) who were able to split the baby. And then there were the occasional sports, surprising anomalies like Ed Koch and Rudy Giuliani. The city’s practically minded electorate has always seemed allergic to utopian fantasies. Over the six decades before de Blasio took office in 2014, the city was ruled for 44 of those 60 years by four men — Wagner, Koch, Giuliani, and Bloomberg — who openly distanced themselves from conventional leftist ideology.

 

Leftists have been a part of the New York cultural landscape since the turn of the 20th century, without question. But it was not until the last few years that they have come to dominate the city’s political and social life. Before the advent of Alexandria Ocasio-Cortez, the most left-wing New York politician was a seven-term congressman from East Harlem named Vito Marcantonio, a Communist fellow traveler who served mostly in the 1940s and whose historical obscurity should give you a sense of how flaky and unimportant he was (and he was originally elected as a Republican!).

 

The true story of Gotham is a tale of complex and ever-changing ethnic, religious, racial, and social mixing — all of which has taken place in direct proximity to the wealth incubators and seed investors of the world’s largest economy. Aside from Wall Street, for much of the 20th century the most powerful mediating institution in the city wasn’t a labor union; it was the Catholic Church. And the most important nonpolitician in political life was Cardinal Francis Spellman. But the picture was far more variegated than that. At mid century, New York was a majority-Catholic city that was also home to the largest Jewish population in the world (more than 2 million people, or around 25 percent of the city’s population). Today, New York still maintains the most complicated ethno-racial mix in America: It’s 26 percent black, 26 percent Hispanic, and 12 percent Asian, with whites making up around 35 percent.

 

All this is to say that New York, the largest city in America and one of the largest in the world, is a city of particularities. It has an eccentricity about it that has never been reducible to any political buzzword like “progressive.”

 

Progressives hate particularity for the same reason they have a problem with patriotism: They are committed not to America’s advancement but to humanity’s. Typically, they cite the word “patriotic” only in conjunction with their passion for protest, as though the only reason to love their land is the permission it gives them to trash it. Borders cannot contain their passion for redesigning people and custom.

 

Bill de Blasio has taken this universalist principle and applied it in an interesting new way. He wants to subsume the story of New York within the broader mythology of leftism. His insistence on this false history, and his determination to impose an ideological framework on a peculiarly anti-ideological city, is the key to understanding why — despite his two terms — New Yorkers have never cottoned to the guy. And the feeling is mutual. He shares the progressive’s conundrum: He loves humanity in theory but he’s clearly not so crazy about people.

 

And he is the first mayor of New York City who seems to dislike New York City. It’s not just that he believes the city is a font of injustices, from what is (in his view) the unfair distribution of private incomes to the supposedly brutish behavior of its police (at least when he wasn’t running the joint) to the putatively money-grubbing conduct of its landlords to the mulish determination of local parents to seek a better education for their children by whatever means are at hand — whether that’s a gifted-and-talented program, or a charter school, or a selective high school where placement is determined entirely by the score on a test administered to whoever wants to take it. I mean, when you look at Gotham in this way, what’s there to like, really?

 

* * *

 

The most notable thing about de Blasio as a public figure is that he evinces almost no interest in the city’s traditions, quirks, and folkways. He plays no role in the life of the city, is never to be seen at local restaurants or attending the billion cultural events that take place on an hourly basis, or much of anywhere outside the unremarkable Park Slope gym to which he seems so fetishistically attached that he had his security detail drive him eleven miles every day to and from Gracie Mansion to work out on its machines.

 

He does not root for the city’s teams; indeed, he has stubbornly insisted on remaining a Boston Red Sox fan, which is a little like marrying into the Hatfield clan and then revealing that you’re a McCoy. Then there’s his bizarre discomfort with the civic rituals that have always been a special feature of the city’s public life. He has spent his mayoralty skipping out on them — the Columbus Day parade in the Bronx (2014), the Puerto Rican Day parade (2019), and, of course, the St. Patrick’s Day parade. He claims he avoids the latter owing to its supposed homo­phobia, but his habit of playing parade hooky in general seems more of a piece with his disdain for the city’s century-long embrace of ethnic particularism.

 

It’s the particularities that give New York its unique character — an indelible quality that inspires a fierce loyalty among its residents. New Yorkers have a kind of locally sourced patriotism you see maybe in Texas and hardly anywhere else. As John Updike once put it, “The true New Yorker secretly believes that people living anywhere else have to be, in some sense, kidding.”

 

Most mayors are cheerleaders for their cities; it’s part of the job description. But there’s nothing like a New York City mayor’s drinking deep of the myriad pleasures offered up by the five boroughs to give you a sense of New Yorkers’ local-patriotic fervor and passionate attachment to their city — why they cram into vastly smaller spaces than they might be able to live in elsewhere and put up with the inconveniences that come with living cheek by jowl with twice as many people as the next-largest city in America.

 

It was ever thus. Jimmy Walker, the colorful and corrupt chief executive in the 1920s, dominated the nightlife during Prohi­bition, traveling nightly from one speakeasy to another. The legendary Fiorello La Guardia loved the opera — indeed, loved it so much he founded a “people’s” opera company and installed it in a gaudy Shriners temple he saved from the wrecking ball and turned into the New York City Center. Michael Bloomberg was one of the most generous donors to the city’s arts establishments, from museums to theater companies, and though it started as an affectation to show he was not just a billionaire disconnected from the people, Bloomberg clearly came to love his daily subway rides from his Upper East Side manse to City Hall. Rudy Giuliani loved opera too. And the Yankees. And going to tapings of Saturday Night Live. And Ed Koch? He adored everything about New York. Every. Single. Thing.

 

New York is not in de Blasio’s blood, and he doesn’t want it to be. He came to the city at 27, fresh from his visit to Nicaragua, and his apparent determination not to drink deep of New York’s pleasures seems driven by the same ideological fervor that undergirded his not so youthful love of communism. He is driven instead by a conviction that the city itself is unjust and in need of moral and spiritual repair only he and his ilk can provide. No bread and circuses for Bill de Blasio — at least, not in public.

 

He does not like old-fashioned melting-pot ethnicity, but he likes newfangled racialism. And using it cleverly helped get him elected. In 2013, a friend of mine asked him how he planned to run against Bill Thompson, the well-liked city comptroller who had come shockingly close to beating Bloomberg in the 2009 election. Thompson was the sole black candidate in the race. No, de Blasio told him. I’m the black candidate. Married to an African-American woman and the father of two children with her, de Blasio pitched himself as the guy who knew from the experience inside his own household how difficult it is to be black in New York. “If his ‘Tale of Two Cities’ campaign theme turned off the city’s elites, he swept the emerging majorities, winning among communities of color and the city’s young white liberals,” wrote David Freedlander in New York magazine.

 

The very term “communities of color” is an indication of the sea change in political perception de Blasio represents. It’s one of his favorite phrases; I think it appears in more of his public comments than any other. It is, of course, an implicit effort to join African Americans and Hispanics in a grand coalition of grievance and need whose members can be spotted on sight. But, boy, does the word “Hispanics” (or “Latinos,” or “Latinx,” or whatever) have to do a lot of work here. These are people who don’t have all that much in common. The lion’s share of the city’s Hispanics is of Puerto Rican origin, and therefore American citizens from the get-go — as opposed to the Dominican and Mexican immigrants who together make up only two-thirds of their number. According to an NPR poll in 2014, only 20 percent of Puerto Ricans in the continental United States still speak Spanish, so most of the city’s “Hispanics” don’t even share the same language.

 

The accurate way to think about these groups, again, is as ethnicities, not as a racial entity. In this, they join the “whites” of New York City, who would not have been recognized as “white” for most of the history of this country. Use the term “Caucasian” in New York and you’re talking not about Mayflower WASPs, who make up less than 1 percent of the population. You’re talking about Jews (12 percent) and Italians (9 percent) and Irish (8 percent) and Germans and Greeks and Poles. I often think of my Yiddish-speaking milkman grandfather, who came to America from the Pale of Settlement as a teenager in the 1910s and lived the rest of his life in Brooklyn — imagine the shock on young Julius Podhoretz’s face had a fortune-teller informed him that one day he would be thought of as having shared a “race” with President Roosevelt and John D. Rockefeller!

 

* * *

 

De Blasio and the progressives need the “communities of color” alliance to give them strength in numbers. This is how they advance their case for redistributionist economics and a rebalancing of political power in ways that truly favor not the communities themselves but rather the broad ideological goals of so-called community organizers who leverage the self-proclaimed leadership of their subgroups. He has brought many of these organizers into city government, where they have acted more than a little bit like inmates running the asylum.

 

And so have de Blasio and his wife, Chirlane McCray. He “empowered” her campaign to put mental-health issues at the forefront of the city’s social-justice efforts with a program called ThriveNYC. All in all, more than $800 million has gone into ThriveNYC, an astonishing total for a single initiative. And it has been an abject failure, so much so that the program has been quietly rebranded and shoved inside another city department in preparation for de Blasio’s departure from office.

 

In the words of Stephen Eide in City Journal, “Instead of hiring more social workers, psychiatrists, and psychologists, the initiative focuses on drawing non-mental-health providers into the behavioral-health system. Examples include cops (CIT training), ‘School Consultants’ (who instruct parents and kids where they can find services in the community), and the public (Mental Health First Aid training). It’s still not clear how many separate programs make up ThriveNYC, but the original count was 54, which the administration touted, as though an initiative with dozens of programs is better than one with just a few.”

 

What you see in Thrive is what you see in de Blasio entire. He throws money at things. The money rains down on the city’s activist sector. He claims he has achieved revolutionary change and glorious results — but there are no credible statistics to back up his claims. That’s because he doesn’t need statistics. He has achieved a different set of results, the ones he really wanted: He has created a new class of government veterans — progressives with experience, who can leverage their time in the de Blasio administration in pursuit of their transcendent aims. He has helped build a new kind of machine, a leftist ruling class.

 

His likely successor, Eric Adams, is giving interviews in which he is all but guaranteeing he will follow not in de Blasio’s footsteps but rather in those of Bloomberg, Giuliani, and Koch — resolutely anti-ideological and focused on achieving results that will convince New Yorkers there will be a second renaissance. But the next mayor will have to contend not only with de Blasio’s legacy but with the army of progressives he empowered over his eight years in office. That army will be at the ready to fight back on behalf of the noxious ideas that are causing New Yorkers to vote with their feet and get the hell out of Dodge once again.

Wait, That’s the Risk of Symptomatic Infection That Has the CDC So Worried?

By Jim Geraghty

Friday, July 30, 2021

 

From this morning’s Washington Post story about the internal Centers for Disease Control and Prevention slide presentation that has a lot of people freaked out: “Another estimates that there are 35,000 symptomatic infections per week among 162 million vaccinated Americans.”

 

That comes out to 1 out of every 4,628.57 people. I like those odds!

 

I’m sure someone would say “yes, but that’s per week, meaning you face the same risk the next week!” Okay, so every week, I face a new metaphorical lottery of being that one person out of 4,628 or so who has a symptomatic breakthrough infection. I can live with that, and you can, too. Yes, it will stink to feel sick for a couple of days, but symptomatic breakthrough infection almost never results in hospitalization or death.

 

People accept that much higher level of risk all the time. The chances of dying in a car crash are roughly one in 107, and the average person is involved in three motor vehicle accidents in their lifetime. If someone told you they refused to ever get into a motor vehicle because the odds of dying in an accident were too high, you would urge them to get counseling for runaway anxiety.

Friday, July 30, 2021

The Scars of Utopia

By Kevin D. Williamson

Friday, July 30, 2021

 

Earlier this month, one ghastly chapter in a particularly ghastly story came to a kind of a conclusion as the Czech government agreed to make restitution to thousands of women, mostly members of the Roma minority, who were subjected to coerced sterilization by the Czechoslovak Socialist Republic.

 

The socialist strongmen of the 20th century differed in important ways from their progressive admirers in the United States and the rest of the free world, but they had some fundamental things in common: “Central planning” was never an idea that was limited to economic life, and the planned in “Planned Parenthood” is very much the planned from “planned economy,” meaning that the “planning” involved was to be at the social scale rather than merely at the family scale.

 

Eugenics and population control were obsessions of central planners from Moscow to Washington to Beijing, and, to some extent, they still are. Deng Xiaoping gave China its “one-child policy,” which Chinese leaders are today desperately trying to reverse as a declining birth rate pulls the country toward economic and military decline. Russian ideologues linked eugenics to the creation of the “New Soviet Man.” The geneticist J. B. S. Haldane, one of the founders of modern evolutionary science, was also a committed Marxist who argued in the pages of the Daily Worker that “the dogma of human equality is no part of Communism,” and insisted that dealing with “innate human inequality” would be the real “test of the devotion of the Union of Soviet Socialist Republics to science.”

 

(The main brake on eugenic excess in the Soviet Union was, of all things, Stalinism, which objected on ideological grounds to “biologizing” social issues.)

 

In socialist Czechoslovakia, doctors bribed those the state considered undesirable into accepting sterilization, misled them into believing that it was medically necessary, or simply performed the operation without women’s consent during other medical procedures, typically caesarian sections. Plus ça change: Planned Parenthood founder Margaret Sanger was a great enthusiast for sterilization, and, to this day, the organization markets sterilization to women on the grounds that it — and this is a direct quotation — “can even make your sex life better.”

 

Just as we know from colonial-era literature that even slaveholders such as Thomas Jefferson understood the intrinsic evil of that practice, we know from Czech records that many of those living under socialism saw the sterilization campaign for what it was: a prologue to genocide. The dissident group Charter 77, whose leaders included Václav Havel, denounced this genocidal work in plain terms at the time.

 

The program was supposed to have come to an end with the fall of the Soviet Union, but, in reality, it continued sporadically for years afterward. Earlier this year, a Vice report charged that it is still going on. Again calling to mind the case of American slavery, this is an example of how a great evil can deform a society in ways that last for years, even generations, after the formal abolition of that evil and the legal dissolution of the institutions that carried it out. It is a superstition of democracy that to change the law is to change the world.

 

In the understanding of its adherents, socialism wasn’t just an ideology — it was science. “The science of the history of society, despite all the complexity of the phenomena of social life, can become as precise a science as, let us say, biology, and capable of making use of the laws of development of society for practical purposes,” Joseph Stalin wrote in 1938. “Hence, the party of the proletariat should not guide itself in its practical activity by casual motives, but by the laws of development of society, and by practical deductions from these laws. Hence, socialism is converted from a dream of a better future for humanity into a science.”

 

Science is an eager enough handmaiden to power, as Stalin knew. In his particular socialist paradise, dissent was medicalized and dissidents locked up in insane asylums where there was no practical distinction between therapy and torture. (More often, they were shipped off to the camps or simply shot in the head in one of Lavrentiy Beria’s dungeons.) Sterilization and other instruments of population control and eugenics have traditionally been directed at political dissidents and indigestible minorities such as the Uyghurs in China and immigrants in the United States. Apparently, many of the doctors who carried out the Czech sterilizations did so having been informed that it was the medically desirable, or at least the standard, thing to do after the birth of a second child.

 

At its most influential, eugenics shone with the prestige of science and commanded the loyalty of the cream of the Western intellectual world, from Sir Francis Galton to H. G. Wells to George Bernard Shaw. Malthusian cranks and fanatics such as Paul R. Ehrlich, author of The Population Bomb, remain fashionable in progressive intellectual circles today, and “overpopulation” remains a hot topic even as much of the world prepares to grapple with the challenges of population decline.

 

“I have seen the future, and it works!” declared the progressive journalist Lincoln Steffens upon returning from the Soviet Union. But those who lived under that “scientific” materialism saw things differently. The worldwide socialist enterprise killed something like 100 million people over the course of the 20th century, from the gulags and the Holodomor to the tens of millions who died in the Great Leap Forward. But not all of its victims were murdered. Some were only scarred, tortured, subjected to medical experimentation, exiled, or driven to suicide or another death of despair. Many of the women forcibly sterilized by the Czechoslovak Socialist Republic have died by now, of course, but there are survivors. They’ll be compensated with 11,000 euros each, if you were wondering about the price of being gutted on the way to utopia.

This Is What Critical Race Theory Looks Like

By Ari Blaff

Friday, July 30, 2021

 

There was one major reason why I dropped out of a prestigious grad school this past fall. It wasn’t the economic insecurity, the poor wages, or the need for geographical flexibility: Journalism isn’t much better. The simple fact I learned after half a semester studying sociology is that the discipline isn’t very tolerant.

 

Americans were reminded of this when sociology professor Sam Richards of Penn State University picked an “average white guy” and treated him like a dissected biology specimen in a packed lecture hall. “I just take the average white guy in class, whoever it is, it doesn’t really matter. Dude, this guy here. Stand up, bro. What’s your name, bro?” the middle-aged, and evidently hip, Richards asks. The bewildered freshman, Russell, stands at attention to make the visual experience easier for the gawking crowd. “Look at Russell, right here, it doesn’t matter what he does. If I match him up with [an identical] black guy in class . . . and we send them into the same jobs, Russell has a benefit of having white skin,” Richards says.

 

In another clip, Richards points to a projected slideshow referencing a study in which job applicants are segmented by race and criminal record. The paper found that even whites with a criminal record were more likely to get call-backs than blacks without one. Richards then turns to the white student. “Bro, how does it feel knowing that push comes to shove your skin’s kind of nice?” Richards prods. “I don’t know, it makes me feel like sad cause like, God knows, I don’t deserve it. You know what I mean? Like, I didn’t choose to be white,” the student rambles.

 

What is edifying about Richards cornering a student, based on skin color, in front of hundreds of classmates? The show trial offered no academic value apart from humiliation. In an act of poetic blindness, Richards, who prides himself on having a viral TED talk entitled “A Radical Experiment with Empathy,” demonstrated a magnificent lack of empathy throughout the incident. Nor were university administrators all that bothered. Defending Richards’s conduct, a university spokesman noted that Richards and his colleagues “take time to discuss opinions from many perspectives — from liberal to conservative — and the classroom conversation is framed in a thoughtful way.”

 

The flavor of Richards’s lecture, described by the school as “an introductory class on race and culture,” as well as the administration’s equivocation, struck me as eerily similar to my own latest academic stint. Had I been sitting in the lecture, Richards could easily have pointed at me as the epitome of white privilege, although I identify as Jewish. Richards certainly would never have scoured the room for Chinese, Korean, Iranian, or Indian students, even though members of such groups come from wealthier and, on average, better-educated backgrounds.

 

When all you have is a hammer, all the world is a nail; so, too, when one is devoutly anti-racist, all the world is racist.

 

This kind of treatment has become increasingly standard fare for students, particularly at elite universities. Following the murder of George Floyd and nationwide Black Lives Matter (BLM) protests last year, educational spaces are now confronting calls for a “racial reckoning” with the past. These “History Wars” have thrust once-esoteric academic debates into the public square. The stickiest of these is “critical race theory” (CRT), which views white supremacy as inextricably baked into the American pie.

 

Originating among legal scholars in the 1990s, CRT has become a catch-all term into which anti-racism, intersectionality, whiteness studies, and other progressive shibboleths have been thrown. It was brought to the mainstream’s attention by Christopher Rufo of the Manhattan Institute, and many on the left lay the blame at his feet for setting off a racial powder keg: “The proof lies offline in the new moral panic he helped instigate,” Sarah Jones of New York magazine writes. Critics view Rufo’s initiative as a crude crowbarring of various distinct theories under the CRT umbrella, but he has firmly countered such claims.

 

Regardless of what term one wishes to use, there has been a tangible shift, with CRT bleeding out of academic and cultural arenas and now corroding everyday discourse. “There is no in-between safe space of ‘not racist,’” anti-racist luminary Ibram X. Kendi writes. “The claim of ‘not racist’ neutrality is a mask for racism.” Accordingly, if you disagree with Kendi’s assessment of America or race relations, what does that make you?

 

The corollary of such thinking is that once the world is neatly divided into racists and anti-racists, it’s time to get the ball rolling. After all, those who are skeptical of such theorizing today are compared with anti-abolitionists and segregationists of yore. “In the 1950s and ’60s, the conservators of racism organized to keep Black kids out of all-white schools. Today, they are trying to get critical race theory out of American schools,” Kendi recently argued in The Atlantic.

 

Proponents of the unstoppable-march-of-history approach view opposition — dare I say skepticism? — as unmistakably standing athwart progress. Speaking before a gathering, Michelle Leete, a communications staffer for the Virginia Parent-Teacher Association, condemned opponents of CRT:

 

Let’s deny this off-key band of people that are anti-education, anti-teacher, anti-equity, anti-history, anti-racial reckoning, anti-opportunities, anti-help people, anti-diversity, anti-platform, anti-science, anti-change agent, anti-social justice, anti-health care, anti-worker, anti-LGBTQ+, anti-children, anti-health care, anti-worker, anti-environment, anti-admissions policy change, anti-inclusion, anti-live-and-let live people. Let them die.

 

As with Kendi, if one resists Leete’s perspective, one is seemingly anti-everything — in other words, part of the problem. More specifically, such dissidents require retraining to teach them and their children how to think properly.

 

Teacher Dana Stangel-Plowe publicly announced her resignation from a New Jersey private school in June on YouTube because of such initiatives. The school had embraced an ideology that “requires students to see themselves not as individuals, but as representatives of either an oppressor or oppressed group.” According to Stangel-Plowe, students self-censored, approaching assigned texts “in search of the oppressor.” Teachers at a February faculty meeting were even “segregated by skin color.”

 

In Illinois’s Evanston-Skokie School District 65, another teacher, Stacy Deemar, felt compelled to formally file suit in federal court earlier this month against the anti-racist encroachment within school life. Teachers in the district were also separated by race and mandated to participate in “privilege walks,” which the suit’s general counsel described as conditioning teachers “to see one another’s skin color first and foremost.” Such thinking, understandably, flowed downstream to students. Lessons distributed to eighth-graders in the district included assertions that “white people have a very, very serious problem and they should start thinking about what they should do about it.”

 

* * *

 

Simplistic binaries suffocate thoughtfulness in our already nuance-starved times. Understanding complexity requires an expansive view of the world that is incompatible with fetishizing race to the exclusion of all other variables. The much-touted white–black racial wealth gap is largely skewed by top earners, but that fact is lost when class is disregarded. Similarly, the World Socialist Web Site, alongside leading U.S. historians, tore apart the anti-racist foundations of The 1619 Project for overlooking immigration and class. Despite criticism from across the ideological spectrum, the project won a Pulitzer Prize, and now certain schools are seeking to incorporate its approach into their curricula.

 

However, such intellectual uncertainty is elided, or swept entirely under the rug, in the game of intellectual Mad Libs we find ourselves playing today. An imperceptible and hegemonic white supremacy suffices for every blank. The truth requires no further investigation; no more stones need turning.

 

Contrary to the aphorism that reminds us, "It is the mark of an educated mind to entertain a thought without accepting it," today we are encouraged not to strain ourselves with all that excessive thinking. Inquiry, thought, and dissent are castigated as "white fragility" by prominent anti-racist scholar Robin DiAngelo. Unfortunately, when we are encouraged to differentiate the world solely on the basis of skin color, viewing strangers through the rudimentary prism of racial categories, intricacy is lost. Complexity requires heterodoxy, not the Orwellian groupthink found in Richards's classroom.

 

Thank God I left academia.

The Pathetic Republican Surrender

National Review Online

Friday, July 30, 2021

 

On Wednesday night, senators voted to move forward on a piece of legislation that does not exist yet, driven by an artificial timeline. Democrats hope to agree on trillions of dollars in new spending for a sweeping economic and social-welfare agenda as soon as possible so they can get on with their August vacations. In some sense, this is consistent with the way Congress has operated in recent times. But what's unique is that it was not just Democrats who were dispensing with good governance to jam through their agenda. Instead, Democrats were joined by 17 Republicans (including Senate Minority Leader Mitch McConnell), who aided and abetted this reckless act.

 

The Republicans' decision to collaborate with Democrats is bad policy and makes little sense politically. As we have been saying for months, despite what the media (and evidently, some Republicans) will tell you, America's infrastructure is not crumbling and is not deeply in need of repair. There is no economic justification for spending money to stimulate an economy that will recover on its own as the nation emerges from the pandemic (the economy grew at an annual rate of 6.5 percent in the second quarter, the Bureau of Economic Analysis announced on Thursday). Nor is the government in the black. The Biden administration's own estimates foresee debt as a share of the economy surpassing the World War II record this year. And Fed chairman Jerome Powell, who had been insisting that inflation is going to be transitory, has conceded that it will take longer to abate than he previously expected.

 

The myth that the group of Republican negotiators has been helping to perpetuate is that there are two completely separate pieces of legislation under consideration: One, a $550 billion bipartisan plan that focuses on traditional infrastructure; and two, a $3.5 trillion social-welfare bill that includes a host of liberal priorities — subsidized college and child care, expansion of Medicare and Medicaid, elements of the Green New Deal, and perhaps even immigration amnesty.

 

In reality, the two bills are clearly linked. Biden has said so. Senate Majority Leader Chuck Schumer has communicated this by moving both bills on parallel tracks. And Speaker Nancy Pelosi has said she would not even bring the bipartisan bill to the House floor for a vote unless the Senate passes both bills.

 

Some Republicans have been arguing that if they vote for a bipartisan bill, then it will encourage centrist Democrats, Senators Joe Manchin and Kyrsten Sinema, to walk away from the bigger bill. But it is far from clear that swallowing an additional $550 billion in spending will decrease the overall price tag of the combined bills. Manchin this week said, "If the bipartisan infrastructure bill falls apart, everything falls apart." To Manchin, it's important that the process have bipartisan cover. It also may be easier for him to vote for spending in chunks rather than in one big bill. His comments indicate that by working to get the smaller bill across the finish line, Republicans are making Schumer's job easier.

 

It is true that Sinema came out in opposition to the Democrats-only $3.5 trillion reconciliation bill after the announcement of the bipartisan deal. However, reading her statement, it’s clear that she isn’t opposed to the idea of passing a bigger bill, just the price tag. That still leaves her open to voting for a less expensive, but still extravagant, reconciliation bill. And because Republicans are helping to pass $550 billion of spending, it means Democrats might be able to get more overall spending than if they were to try to hold a purely partisan vote on one massive bill.

 

Republican defenders of the bipartisan deal also argue that it’s fiscally responsible because it will be fully “paid for.” Yet some of the supposed pay-fors are dubious. For instance, according to a summary of the compromise, they are counting on $56 billion from economic growth being generated from the infrastructure spending and an unspecified amount of money from recouping enhanced unemployment benefits that were fraudulently paid out during the pandemic. There is also some hocus-pocus that takes advantage of baseline budgeting.

 

One example is the decision to delay the implementation of a Trump-era rule meant to reform Medicare's prescription-drug program. The Congressional Budget Office determined that, if implemented, the rule would increase the cost to the government for a number of reasons, including that it would reduce the price of drugs paid by Medicare recipients at the counter, thus making them more likely to take advantage of their drug benefits. But Biden had already delayed the rule until 2023, so it is not currently increasing Medicare spending. The $49 billion in "savings" from delaying it further is entirely vaporous.

 

Beyond this, it’s important to keep in mind that in a time of unprecedented debt, new spending adds to our fiscal obligations whether or not it’s “paid for.” Any money saved to pay for new infrastructure projects is money that could have otherwise been used to finance our existing obligations. For instance, the bill claims $260 billion in savings from repurposing unused COVID-relief money and unemployment benefits. If there’s that much extra COVID money floating around, it would be more responsible to return that amount to the Treasury.

 

Some Republicans don’t want to be seen as obstructionists. But politically, obstructing the majority party’s agenda has never hurt the minority party. Quite the contrary. Democrats benefited from obstruction in the 2006 and 2018 midterms, and it was crucial to Republican waves in 2010 and 2014. Giving Biden a big bipartisan victory is throwing him a life raft at a time when his presidency is being hurt by rising crime, increasing inflation, and the reemergence of COVID-19 restrictions.

 

As we write, there is still a glimmer of hope. To pass the Senate, the bipartisan plan will still have to overcome at least one more 60-vote threshold. Republicans still have time to come to their senses and reject this costly and unnecessary piece of legislation that will backfire politically. We urge them to do so.

The President Who Wasn’t There

By Rich Lowry

Friday, July 30, 2021

 

Teddy Roosevelt fervently believed that the president of the United States should be at the center of the political universe, constantly attracting attention to himself. But he’d never met Joe Biden.

 

Biden is the most powerful man in the world and yet makes almost no impression.

 

No one, besides political and media professionals, wonders what Biden is going to say about something or considers him a figure of fascination. In fact, he barely rates.

 

His recent CNN town hall was a fizzle, averaging only 1.5 million viewers. Fox News easily beat it with its regular programming, and MSNBC had more viewers as well, dumping the president of the United States into third place in the cable-ratings race.

 

He’s underperformed in more formal settings, too. Biden drew 27 million for his first address to a joint session of Congress, whereas Donald Trump drew 48 million.

 

The contrast with Trump's constant, impossible-to-ignore cocktail of provocation and political melodrama naturally makes Biden's reticence even more stark.

 

He’s the Olympic badminton competition after a WWE match; he’s elevator music after a heavy-metal concert; he’s the sparkler after a fireworks display.

 

Biden’s presidency is, in this sense, practically premodern, almost hearkening back to the pre-mass-media days when presidents were neither seen nor heard.

 

Of course, this is in part a deliberate choice by the White House, playing on the contrast with Trump and limiting Biden’s exposure to avoid distractions (and gaffes). Being a notably low-voltage political figure worked for Biden in his comeback primary campaign and in the 2020 general election, so why not as president?

 

As a result, Biden strangely doesn’t feel like the main event of the Biden years.

 

Assuming it arrives, the midterm backlash in 2022 won’t be directly about Biden; instead, it will be driven by the Biden-adjacent issues of the border, crime, critical race theory and — if they reemerge in force — mask mandates and school closures.

 

Biden is the prime driver of only one of these issues, the crisis at the border that easily could have been avoided by keeping in place the policies the Trump administration had implemented to get control of the flow of migrants.

 

In other words, the culture war itself is likely to be the overwhelming issue in 2022, rather than the president.

 

This would be a marked departure from the midterm shellackings that Bill Clinton and Barack Obama got in 1994 and 2010, which were deeply personal rejections of both men (Clinton was viewed as a draft dodger unworthy of the office and Obama as a crypto-socialist hostile to American exceptionalism).

 

While no one on the right is enamored of Biden, very little energy is invested in opposing him as such.

 

Indeed, the idea that tends to generate the most interest and passion isn't that the president is assiduously working to destroy the American way of life so much as that Biden — whose verbal meanderings can be truly bizarre — isn't in charge at all.

 

Given the alternatives, this probably works in Biden’s favor. He is pushing a truly radical spending agenda that would, if championed by a more in-your-face progressive president rather than someone who feels like a caretaker, surely be met with much fiercer resistance.

 

But there are risks to Biden, too. If his spending agenda founders, it’s not clear what comes next. Even if the White House decides to try to “unleash” Biden, he is not well-suited to rallying the country or driving an agenda.

His low-intensity presidency may, as his advisers hope, help create a sense of a return to normality in Washington, but it easily could be consistent with a disturbing sense of drift. Usually, the dynamic of the presidency is that if you don't seem to be in control of events, they are in control of you.

 

The test for Biden will come if, eventually, he has to be the dominant public figure in his own presidency.

Thursday, July 29, 2021

The BLM Effect

By Tom Cotton

Thursday, July 29, 2021

 

Last year, our nation experienced the largest single-year increase in murder in American history and endured some of the worst riots in a generation. It’s no coincidence that this appalling death and destruction surged at the same time as the virulently anti-law-enforcement “Black Lives Matter” movement became more popular, powerful, and pervasive. The consequences of the “BLM Effect” continue today.

 

The current crime wave has many similarities to the infamous “Ferguson Effect” that gripped our nation after Officer Darren Wilson justifiably shot Michael Brown in Ferguson, Mo., in August 2014. Anti-police agitators at the time started the “hands up, don’t shoot” myth and created the original Black Lives Matter organization soon thereafter. This group, founded on a lie, condemned proactive policing, argued for a radical reduction of the prison population, and championed a “de-militarization” of police departments. Its most enduring contribution to the public debate, however, was the libel that our men and women in blue are racist and target Americans based on the color of their skin.

 

The media and progressive politicians, including former President Obama, fueled this anti-cop movement. A toxic distrust of the police soon permeated the U.S. Department of Justice and many mayors’ offices across the country. Police suffered withering criticism, widespread civilian resentment, and ever-intensifying scrutiny. Fearing for their jobs and facing demands for leniency, some officers pared back proactive law enforcement, while other officers were actually prohibited from doing their jobs.

 

Where police withdrew, violent crime surged. After Michael Brown's death, arrests in St. Louis plummeted by over 30 percent and murder rose 47 percent. St. Louis's police chief, Sam Dotson, soon labeled this de-policing phenomenon the "Ferguson Effect." Enforcement plummeted in other major cities as well, with overall arrests dropping by 15 percent in New York City and 33 percent in Baltimore by the fall of 2015. Between 2014 and 2016, de-policing and associated policies resulted in the largest two-year increase in murder in half a century — and a 31 percent rise in murder in our major cities.

 

The BLM Effect caused an even more shocking drop in policing, paired with a stunning rise in murder. From last summer to this winter, police in Chicago made 53 percent fewer arrests compared with the same period in 2019. Murder in the city rose by 65 percent. In New York, police made 38 percent fewer arrests and murder rose by 58 percent. In Louisville, Ky., police made 35 percent fewer arrests and murder rose by 87 percent. In Minneapolis, Minn., police made 42 percent fewer arrests and murder rose by 64 percent.

The BLM Effect shares similarities with the Ferguson Effect, but is distinct in important ways, particularly in severity and extent of damage. Between 2014 and 2016, murder nationwide rose 23 percent. In 2020 alone, murder increased by more than 25 percent. In essence, the BLM Effect unleashed more death in a single year than two years of the Ferguson Effect.

 

Rioting and looting also surpassed Ferguson Effect numbers. From 2014 to 2016, political violence was largely isolated to a handful of cities and limited in duration. The Ferguson and Baltimore riots of 2014 and 2015, which were the most destructive of that period, generated less than $50 million in property damage combined. In 2020, hundreds of riots broke out nationwide, wounding over 2,000 officers and inflicting nearly $2 billion worth of property damage. In Portland, Ore., rioters and anarchists took to the streets for more than 100 days in a row. The 2020 BLM riots were the most destructive in U.S. history.

 

Years of institutional decay and the maturation of a poisonous ideology maximized the destruction of the BLM Effect. The activists and BLM supporters of the Ferguson era are now mayors, district attorneys, and state’s attorneys in cities where the BLM Effect is taking the greatest toll — and they are partnering with new activists who are even more radical than they were.

 

When the Ferguson crime wave started, most major cities had enduring respect for law and order, with policies dating back to the crime crackdown of the 1990s. Michael Bloomberg had only recently left office in New York City, and Rahm Emanuel reigned in Chicago. For all their faults, these were not virulently anti-law-enforcement leaders. Emanuel worried that the Ferguson Effect rendered his officers “fetal,” while Bloomberg had continued most of the pro-police policies of the Giuliani era. The years of pro-enforcement policies and advocacy before the Ferguson Effect provided an institutional bulwark against violence and deterioration of order.

 

By 2020, however, years of cynicism, anti-cop reforms, and a new generation of dangerous demagogues had weakened the rule of law. The most anti-NYPD mayor in New York City history, Bill de Blasio, was in his seventh year in office, cop-hater Lori Lightfoot was in power in Chicago, and other progressive mayors took office in towns and cities nationwide. So-called “progressive prosecutors” soon followed and now refuse to charge misdemeanor offenders and avoid serious sentences for career criminals. As city halls abandoned or even opposed their own police forces in 2020, violent rioters and common criminals alike were emboldened.

 

It wasn’t just municipal leaders who radicalized in the years after the Ferguson crime wave; BLM activists and their allies also became more extreme. Demands for defunding and abolishing entire police departments replaced calls for de-militarization and reform. Last year, BLM activists also worked alongside the domestic-terrorist group Antifa to engage in widespread attacks against police precincts and federal courthouses, an audacity not seen during the Ferguson era.

 

A vise grip of extremism from both the bottom up and the top down resulted in 20 major cities defunding their police departments, the removal of police from schools in many municipalities, and even the attempted abolition of the Minneapolis Police Department. New York City alone shifted nearly $1 billion from the NYPD, and Los Angeles cut funding to its police department by $150 million. Both cities also disbanded specialized units focused on combatting violent crime. Several cities also limited the use of non-lethal crowd-control tools such as tear gas, pepper spray, and rubber bullets, forcing officers into ever-more-dangerous situations.

 

Even officers who weathered the Ferguson era have now decided to turn in their badges and hang up their uniforms, and it’s hard to blame them. Between April 2020 and April 2021, nationwide police retirements rose 45 percent while resignations increased by 18 percent. In New York City, home to one of the best police forces in the world, police retirements surged a stunning 72 percent last year. Last month, the entire Portland Rapid Response Team resigned en masse, and more than 10 percent of the Portland Police Bureau left the force in the nine months after the rioting started.

 

Recruitment of officers has similarly suffered, with 86 percent of police chiefs reporting that they are short-staffed. Between April 2020 and April 2021, as crime and retirements surged, the rate of hiring in mid-sized departments dropped 29 percent and plummeted 36 percent in large departments. The flood of officers out of larger departments, paired with the trickle of recruits, will exacerbate the ongoing crime disaster.

 

The institutional wreckage wrought by the BLM Effect is a symptom of a deeper ideological cancer. In the days to come, we must work to rebuke the radical anti-cop ideology that emerged from 2014 to 2016, metastasized from 2017 to 2019, and became debilitating in 2020. If the ideas that undergird the movement remain unchallenged, the BLM Effect will continue.

 

Policing is indeed one of the greatest civil-rights issues of our time. Weak policing, weak prosecuting, and weak sentencing hurt black Americans more than any other group of our citizens. African Americans tragically constitute approximately half of all murder victims and regularly suffer the brunt of the damage resulting from riots. Their lives matter.

 

The fair-weather protesters, who so fondly decry systemic racism, fail to see a cruel irony. If, as they claim, racist policy is defined solely by racially disparate outcomes, then their weak-on-crime proposals are in fact breathtakingly racist. When it comes to the morality of the rule of law, we should never take lectures from those who coddle criminals.