Thursday, December 31, 2020

The Price of COVID Relief

By Kevin D. Williamson

Wednesday, December 30, 2020

 

Why $2,000 relief checks? Why not $20,000 relief checks? Why not $200,000? Why not $2 million?

 

“Oh, don’t be ridiculous,” comes the response. “Nobody is talking about that.”

 

Okay, but why not?

 

The coronavirus epidemic is an extraordinary situation in which extraordinary measures are appropriate. And the U.S. government has taken extraordinary measures, notably in the form of the $2.2 trillion CARES Act, an emergency-relief/economic-stimulus package that included some shotgun blasts ($300 billion in one-time cash payments handed out to households with scant regard for need), a useful and proper extension of unemployment benefits, loans made to support business payrolls through the Paycheck Protection Program, and some questionable wheel grease (nearly $1 trillion, almost half of the original bill’s spending, went to subsidized loans to corporations and state and local governments, a $10 billion line of credit for the Postal Service, special grants here and there, a nice arrangement for defense contractors, and so on). Because some of the money will come back in the form of loan repayments, the package will add — only — $1.7 trillion to the national debt, according to preliminary Congressional Budget Office estimates. But there’s other spending to consider, too, including the half-trillion-dollar Paycheck Protection Program and Health Care Enhancement Act.

 

It matters how much we spend. It also matters what we spend it on, and how we go about spending it.

 

Sending checks willy-nilly to householders from sea to shining sea is politically popular for obvious reasons: People like getting checks, and politicians don’t mind writing them on other people’s accounts — “bribing the public with the public’s money,” as the old line, often attributed to Alexis de Tocqueville, has it.

 

The United States already has many programs for people suffering from serious economic privation, ranging from food support to housing support to health-care support. Some of these welfare programs are better-designed and better-run than others, and many would be better administered at the state or municipal level, but that is not an indictment of the programs as such. The United States also has epidemic-specific measures in place, which is entirely appropriate, and the most important of these is providing extended/enhanced unemployment benefits. The main economic effect of the epidemic has been preventing people from working when businesses were appropriately obliged to suspend operations or operate at some reduced capacity, which has cost both employees and business owners (let’s not forget them) a great deal of income.

 

The way to respond to that is not firing money at the general public from a confetti cannon.

 

President Donald Trump, the master negotiator, was busy sulking while Congress and his treasury secretary were putting together the latest relief bill, which is to include $600 stimulus checks for most Americans. Democrats said they wanted $2,000 checks. Trump, who was very excited by the prospect of putting his own signature on stimulus checks the last time around, roused himself and chimed in to declare that he was with the Democrats, demanding $2,000 checks. Senator Mitch McConnell (R., Ky.), the majority leader, perhaps muttering quietly to himself, began sharpening his knife, and the $2,000 checks are not expected to survive his attention.

 

Frédéric Bastiat, the great French economist, described government as “that great fiction through which everybody endeavors to live at the expense of everybody else.” The sentiment may be cynical, but it is not incorrect. In the matter of sending checks more or less indiscriminately to American households, very little attention is given to the question of whose account is being dinged to pay for them.

 

Our government is operating in deficit right now and has been for some time. That isn’t necessarily a problem: A government like ours with an economy like ours can carry a fair bit of debt without too much trouble, and, at times, it makes more sense to borrow than to tax. But debts finally have to be paid. That means that spending has to be funded, either with taxes today or with taxes tomorrow to cover principal and interest. Interest rates are very low at the moment, which makes borrowing attractive. Will they stay low forever? There isn’t any reason to think that they will, and to pretend that there is no risk of their rising is irresponsible.

 

Wishful thinking is not going to change that. But wishful thinking is always with us, from the deathless Republican legend of self-financing tax cuts to the recent leftist vogue for “Modern Monetary Theory,” which holds that money, among all commodities, can be magically liberated from the laws of supply and demand as long as the right people have political power. Experience should make us skeptical.

 

Of course we cannot and should not ignore the economic consequences of the epidemic that is still, let’s not forget, raging across our country. From unemployment benefits to emergency support for hospitals, there is much that can and should be done. But we also cannot and should not ignore that this all has to be paid for, at some point. Nor should we ignore that, under the tutelage of Donald Trump, Nancy Pelosi, et al., we are training a generation of Americans to wait by the mailbox for their check from the government.

 

There will be a price to pay for that, too.

Britain’s Brexit Triumph

National Review Online

Thursday, December 31, 2020

 

Four and a half years after the momentous vote in June 2016, Brexit is finally and fully accomplished with a U.K.–EU trade deal that sailed through Parliament 521 to 73.

 

It’s over.

 

The economic uncertainty about the United Kingdom’s “future relationship” with the nascent super-state is finished. The bottom line is that the U.K. will continue trading relatively freely with the European Union, avoiding the economic disruption that would have come from falling back on WTO rules in a disorderly exit. Trade will be conducted through the mechanism of the new trade agreement, with agreed-upon provisions for regulations and retaliatory tariffs. Like all sovereign nations, the U.K. can now go about making its own trading arrangements in the world, while keeping faith with its existing covenants.

 

It’s over.

 

The sometimes-excruciating political turmoil that issued from the Brexit vote is also at an end. On inspection, the balance of that turmoil — the uncertain votes in the House of Commons, the interference of the House of Lords, and even a usurpation by the Supreme Court — was due to internal divisions and weakness in the Tory Party, which Prime Minister Theresa May was unable to change or overcome. Boris Johnson’s general-election triumph a year ago gave his government the necessary mandate to finish the job.

 

A major factor in getting a decent trade deal from the EU was Johnson’s willingness to walk away from the negotiating table. As late as December 21, Johnson told Ursula von der Leyen, president of the European Commission, “I cannot sign this treaty, Ursula, I can’t do something that is not in my country’s interests.” With Johnson, for the first time, EU negotiators understood they were dealing with a leader who had a mandate and the political talent to tell them “no.” In the end, both sides said “yes.”

 

It’s over.

 

The hysteria that accompanied resistance to Brexit is vanquished. Polite opinion denounced Brexit as an irrational act of national self-harm. The campaign against Brexit before and after the vote predicted imminent economic calamity that would leave the country permanently poorer (former chancellor of the Exchequer George Osborne). The United Kingdom would lose access to vital medicines and pharmaceutical companies, and would take a step backward in science (Vox). Trade would stop, and ships would be halted in the ports, while food meant for Britons rotted. The City of London’s place as a world financial capital would be destroyed. Outside the European Union, it would be rash to assume that the United Kingdom would avoid another European war (Prime Minister David Cameron). The Guardian will surely keep us updated on whether its 2019 predictions of post-Brexit “chip shortages,” pogroms against Poles, and mass starvation come about.

 

In the end, the make-or-break question wasn’t about starvation or war, but the number of years that EU fishermen could remain in the waters around Britain. Why? Because the continuation of liberal trade arrangements between the United Kingdom and the European Union makes sense as a matter of economics and practical politics. Europe benefits from funding scientific research in the United Kingdom, which it will continue to do. Onions can and will still be sent from Belgium to Scotland. Mercedes still wants to sell cars to rich French bankers in England, who in turn want to continue doing their financial work in the City of London, which is free of French regulators and has more attractions than Dublin or Frankfurt.

 

It’s over.

 

The illusion that the European Union is an inevitable and irresistible future has been put to rest. The United Kingdom is now free of the burdensome political project of “more Europe.” That means the United Kingdom is not subject to the European Court of Justice, an institution founded (in part) by Nazi jurists such as Hans Peter Ipsen, which privileged “ever closer union” above democracy and the rule of law. It is not subject to a Commission led by political failures, recently dispatched from office by democratic verdicts in their own nations. The United Kingdom is free of its financial contributions, which fund the sprawling eurocrat bureaucracy and which pay for the suppression of political dissent in recalcitrant member states.

 

Brexit was difficult and treacherous. And its proponents should be honest that it cost more than a few fish. It revealed stark divisions in British life, and it aggravated fissures within the Union itself. But with the United Kingdom now a free, sovereign, and independent power, the Johnson government is in a better position to repair those divisions, and the people of the United Kingdom are in a better position to hold to account the authorities that govern them.

 

Sovereignty and democracy go together in the modern world. Self-government is not over, and never should be.

Journalists’ Behavior over Luke Letlow’s Passing Is Abhorrent — and Telling

By Ellen Carmichael

Wednesday, December 30, 2020

 

On Tuesday evening, the family of Congressman-elect Luke Letlow (R., La.) announced that he had passed away at the age of 41 due to complications from COVID-19. He left behind a young family, including two small children, as well as a vast network of friends in Louisiana and Washington, D.C., all devastated by his passing.

 

Some progressive Twitter activists and left-wing reporters couldn’t wait to begin their grave-dancing. Letlow deserved to die, they mused, because he didn’t take COVID seriously enough. They scoured his online presence to find any proof that he engaged in so-called denialism. Some, such as Vox’s Aaron Rupar, pointed to an October video in which the then-candidate had the audacity to advocate reopening the economy while maintaining state and federal precautions on coronavirus. Molly Jong-Fast of The Daily Beast also shared the video. Hundreds of their followers joined in, blaming Letlow for his own death and declaring that he was unworthy of pity because of his politics. For them, his death was further proof that those who dare propose policy prescriptions that differ from their own, no matter how rational or mainstream they may be, just have it coming to them.

 

Setting aside the lack of evidence for their claim that Letlow denied the dangerous realities of coronavirus, the COVID ghouls and scolds clearly see themselves as worthy and qualified judges of their fellow man. It is they who decide whether or not people act appropriately enough to be spared death by coronavirus. As Michael Brendan Dougherty recently put it, they feel empowered to “turn every sick person into either a blameworthy fool or a blameless victim,” an extraordinarily arrogant and inhumane view of human suffering.

 

In no other health circumstance would such brutality toward the afflicted be tolerated. We do not deem individuals who become sick by engaging in known “risky behaviors” — unsafe sex, abuse of alcohol, drug use, poor diet, smoking, dangerous driving — deserving of pain and misery. So, mocking and haranguing those who become sick or die due to COVID-19, a novel virus from which we cannot possibly shield ourselves entirely, is unconscionable.

 

But for these individuals, any expectations of their own behavior — namely, to not be a terrible person in the face of others’ grief — are secondary to soothing their own anxieties about the coronavirus. Blaming others may help them temporarily make sense of the sickness and death, but it can never provide them lasting relief from the unpleasant uncertainties this virus inflicts on us all.

 

That doesn’t stop their callous campaigns from continuing. Look no further than my former boss, Herman Cain, whose death from COVID-19 complications was touted as proof that Republicans denied the risks of the coronavirus (never mind that Cain had a lengthy track record, in both speech and practice, of taking the virus seriously). These are the same individuals who were downright jubilant when President Trump and many on his team contracted the virus but are seemingly silent about the COVID-19 diagnoses of other leaders who also benefited from ample safeguards, such as Letlow’s delegation colleague Congressman Cedric Richmond (D., La.), who contracted coronavirus while campaigning for Democratic Senate candidates in Georgia this month.

 

Even among ordinary people, an individual’s desire to participate in day-to-day activities such as church services and dining out is enough cause to hector him for contracting the coronavirus. Prominent progressives, left-wing activists, and their media allies have routinely contended that if only Americans weren’t so stupid, selfish, and negligent — and in particular, if red-staters could abandon their silly notions of constitutional rights and their incessant desire to keep local businesses open — this pandemic would have been over a long time ago.

 

But, for all the insistence that it is American obstinacy that is perpetuating the pandemic, there’s not much evidence for such accusations. We are actually now masking at higher rates than ever before, a fact confirmed by observational studies showing broad compliance in retail establishments by customers and staff. Meanwhile, the TSA reports that since March 15, 2020, the number of passengers passing through checkpoints has been about 25 percent of 2019 levels. Americans have significantly curtailed socializing with others, despite scientists telling the New York Times that the data do not support claims that small gatherings catalyze coronavirus surges. And even as the very real pains of prolonged isolation and widespread depression caused by COVID-19 persist, the vast majority of American families have still greatly altered their holiday traditions by canceling plans, limiting gathering sizes, enforcing social distancing, and even requiring face coverings. The repeated insinuation that pigheaded Americans have refused to do what it takes to defeat the virus is tone-deaf, cruel, and simply untrue. In reality, we have sacrificed a lot more for a lot longer than anyone thought we could.

 

And when the self-appointed COVID cops aren’t too busy condemning those who have gotten sick, they’re deciding who should be allowed to avoid illness via inoculation. With doses limited in the very early days of the vaccine rollout, they want to forbid Republican lawmakers from getting the inoculation because, as they again claim without evidence, those lawmakers didn’t take the pandemic seriously. CNN contributor Ana Navarro-Cárdenas and liberal writer Kurt Eichenwald launched indignant tirades against Senator Marco Rubio (R., Fla.) for receiving his COVID-19 vaccine, casting blame on him for health systems not administering the vaccine quickly enough to frontline workers and even for how bad coronavirus has gotten in America.

 

Of course, they found no room to criticize Congresswoman Alexandria Ocasio-Cortez (D., N.Y.), 18 years Rubio’s junior, who received the vaccine at the same time he did. There was also noticeably no outcry about the highest-ranking Democrats, such as President-elect Joe Biden and Vice President-elect Kamala Harris, receiving their vaccines after spending months portraying the vaccine as rushed and unsafe in hopes of scoring political points against President Donald Trump during the campaign. For progressives, the COVID-19 “denialism” that disqualifies someone from receiving a life-saving vaccine extends only to taking a different approach to solving the problems caused by the pandemic; it mysteriously does not include fear-mongering about the one surefire thing that will actually protect people.

 

For the COVID nags, politics, not people, is everything. The pandemic has given them an opportunity to test out long-held policy preferences, including the government financially coercing people to adopt certain behaviors. To them, having a different approach is tantamount to wanting people to die. Wanting to spare your kids the developmental, educational, and social consequences of distance learning means wanting to kill their teachers. Missing the financial stability and personal fulfillment of having your business open? You want to stay in business, and thus, you’re okay with getting people sick. Feeling distressed because you can’t bury your loved one but watch large-scale political demonstrations take place without officials intervening? Stop being selfish.

 

Perhaps most disturbing is the utter lack of qualifications these individuals have to make such judgments. They don’t have the humility to concede that lawmakers across the political spectrum are forced to weigh ever-changing public-health guidance against other policy factors when governing. Meanwhile, no serious epidemiologist would ever expect policymaking to be dictated exclusively by health directives, but talking heads are convinced enough of their own expertise to demand that it must be. They also seem unwilling to accept that, with so many aspects of the virus unknown throughout the pandemic and the efficacy of many of our mitigation efforts still unclear, most lawmakers and the public are doing all they can to prevent sickness and loss of life. Still, it’s easier to blame someone, and easier still if that someone doesn’t share your political philosophy.

 

They do so at our country’s peril. While one may glean fleeting satisfaction from blaming others for the pain and uncertainty we’re all experiencing, the scars from the scolds will persist long after the pandemic is blessedly behind us. People understand that they’re being closely watched and judged, and they’re acutely aware that those who disagree with them will offer no mercy or compromise — or, worse, will think them deserving of death for holding a different worldview.

 

Instead, we should all try to be kinder and more gracious toward each other. Most people are doing the absolute best they can, often making incredibly tough decisions amid extraordinarily difficult circumstances. Nearly everyone knows the coronavirus is a threat they must take seriously. No one wants people to get sick and die, and it’s time to stop acting as if they do.

Uncovering the Chinese Government’s Pandemic Deception

By Jimmy Quinn

Thursday, December 31, 2020

 

At the outset of the pandemic, as Chinese Communist Party officials at all levels of government failed to provide trustworthy information, the rest of the world underestimated the disease.

 

But those who had witnessed or otherwise studied the Party’s cagey response to the 2003 SARS epidemic knew immediately what they were dealing with. The Taiwanese government sprang into action, crafting its famously successful pandemic response. And in the U.S., one Trump official — a former journalist who while based in China challenged the Chinese government’s SARS-era deceptions — started warning about the virus well before many realized that an outbreak could occur in the United States. People like him understood the lengths to which the Party-state would go to cover up the emergence of a deadly disease, a tendency that persists to this day even though its aggressive measures have brought the virus down from its early 2020 peak.

 

New research led by Weifeng Zhong, a Mercatus Center senior research fellow, sheds light on Beijing’s public-health deceptions. His team is using an innovative technology that can also examine changes in the Chinese government’s policies.

 

“We all know that the official number of diagnosed cases coming up from the Chinese authorities was not reliable,” Zhong told National Review in an interview following the release of his latest research paper in December. “But a quantitative question is, How unreliable are those numbers?”

 

Many have attempted to answer the same question. Some turned to reports about increased cremations in Wuhan, which suggested that official case numbers in early 2020 were being deliberately suppressed by the authorities. On-the-ground reports from Xinjiang shed light on an outbreak there that took place later in the year and that Chinese officials had downplayed.

 

But Zhong’s research — based on machine-learning analysis of articles that appear in the People’s Daily, the flagship Party propaganda outlet — offers an inventive way “to analyze the Chinese government’s own words.” It’s called the Policy Change Index (PCI).

 

“The reason why that actually would work is because words are very indicative of intentions in terms of the Chinese government’s policymaking process,” Zhong said. The idea behind the PCI is that, if the tone used by major state newspapers during the COVID pandemic sounds as urgent as the tone used by the same publications during the peaks of a previous disease outbreak, that could call into question the government’s claim of low case counts.

 

To construct this model, Zhong relied on a historical episode — in this case, SARS — whose official rhetoric resembles the language the Party is using in COVID times. The algorithm analyzes recent People’s Daily articles and compares their tone with that of articles published during the SARS outbreak nearly two decades ago.

 

Effectively, this allows the researchers to map a 2020 article onto a day in 2003 — and in the process, to match a COVID-era article with a point on the SARS epidemic trajectory. Articles similar in tone to those published at a particular time during the SARS era suggest that COVID would have been at a similar point at the time of publication. Zhong writes in his paper, for instance, that an article from COVID’s February peak in China resembles the urgent tone the People’s Daily used in March 2003, when SARS was cresting.
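Zhong’s actual model is a trained machine-learning classifier built on People’s Daily text, and its internals are not described here. Purely as an illustration of the matching idea (score a COVID-era article against a dated reference corpus and return the most similar SARS-era day), here is a minimal, hypothetical sketch that substitutes simple TF-IDF cosine similarity for the real model. The corpus snippets, dates, and function name are all invented for the example.

```python
# Toy illustration of the article-to-epidemic-day matching idea behind the
# PCI. This is NOT Zhong's actual model (a trained classifier on People's
# Daily text); it swaps in TF-IDF cosine similarity and invented snippets.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical SARS-era reference corpus: one representative line per date.
sars_corpus = {
    "2003-02-15": "health officials monitor an atypical pneumonia outbreak",
    "2003-03-20": "urgent nationwide mobilization against the spreading epidemic",
    "2003-06-10": "epidemic subsides as travel restrictions are gradually lifted",
}

def match_to_sars_day(covid_article: str) -> str:
    """Return the SARS-era date whose reference text best matches the input."""
    dates = list(sars_corpus)
    vectorizer = TfidfVectorizer()
    # Vectorize the reference corpus and the new article in one vocabulary.
    matrix = vectorizer.fit_transform(list(sars_corpus.values()) + [covid_article])
    # Cosine similarity of the new article (last row) against each reference.
    sims = cosine_similarity(matrix[-1], matrix[:-1]).ravel()
    return dates[sims.argmax()]

# An urgent-sounding COVID-era article maps onto the SARS crest.
print(match_to_sars_day("urgent mobilization ordered to fight the epidemic"))
# -> 2003-03-20
```

On this toy logic, an article whose language echoes the March 2003 crest gets mapped to that point on the SARS trajectory regardless of what the official case counts say, which is the discrepancy Zhong’s index is designed to surface.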

 

Even in a country transparently reporting the incidence of disease, the number of diagnosed cases might not align with the true extent of infection. Serological studies in several countries have shown that true infection rates are higher than what diagnostic nasal-swab and spit tests suggested (a recent Chinese CDC report out of Wuhan suggests that actual case counts were ten times higher than what was initially reported). But U.S. intelligence assessments have also shown that the Chinese authorities deliberately undercounted cases.

 

Zhong’s research suggests a similar conclusion. Whereas officially reported Chinese COVID numbers reached their peak in mid-February and declined steadily from there, Zhong’s outbreak index declined at a much slower rate, showing that officials were still conveying a sense of urgency through the state-run press. As the officially reported number of cases plummeted, the People’s Daily used language that is associated with higher levels of infection.

 

Although some amount of the underreporting can be attributed to officials acting at the local level, state-media articles also suggest that underreporting by higher-ups was most pronounced during regional spikes that followed the one centered in Wuhan. Beijing was placed under a lockdown order in June, well after the initial outbreak had been suppressed. Although official case counts remained low in the capital city, Zhong notes that the People’s Daily sharply emphasized the importance of the lockdown measures “in marked contrast to the numbers, which indicate only about a dozen new cases per day in a city with a population of over 20 million.”

 

A July outbreak in Xinjiang — the far western region where Party officials have constructed a brutal techno-authoritarian ethnostate — barely registered in reporting of cases. But conditions in that region reflected a different reality, including strict lockdown measures much like the ones implemented in Wuhan and an attendant rhetoric of “wartime mode.” Zhong’s index picked that up, too.

 

To be sure, this method has its limits. It cannot offer a precise estimate of the true number of cases that China has experienced. And much can change over the course of 17 years, so it’s conceivable that official responses to SARS and COVID might not use exactly the same language.

 

The index, though, does offer a window into how officials have thought about COVID, how it compares with a previous disease outbreak, and, therefore, what they’re hiding from the world through distorted case reporting.

 

It has other applications, too, particularly in scrutinizing propaganda for insights about what the Party will do in the future. Zhong has constructed a similar index, called the PCI Crackdown, that uses the timeline of the Tiananmen Square massacre to analyze Beijing’s crackdown in Hong Kong.

 

That comparison might sound tenuous. But his team found some key parallels in how the government’s language describing the demonstrators — first “people who have a good heart,” then, eventually, “traitors” — evolved. “We were surprised ourselves, too,” Zhong said. And although the researchers did not specifically predict the National Security Law that since May has been wielded to kill the pro-democracy movement, they detected key changes in the Party’s use of language in April.

 

So what’s on the horizon? Zhong told National Review that another one of his models — which analyzes the front page of the People’s Daily for accelerating changes in the way that Chinese officials talk about a range of issues — has picked up an alarming trend. “In the second quarter of this year, the Chinese government is showing an unusual emphasis on military aggression on the front page of the People’s Daily.”

 

Of course, the index notes changes only in what Party officials are communicating to the public, so this is just one data point among many others; it says little about the nature of potential military actions or the location of their targets. But Zhong says that he has observed increased Chinese social-media chatter about invading Taiwan to absorb its semiconductor production.

 

“So that’s something I think people should pay more attention to, especially in light of the fact that China is thinking about military power in a more salient way,” Zhong said. He’s far from the first to warn about the precarity of Taiwan’s security, and his Policy Change Index is an approximate measure. Still, the insights afforded by his method don’t seem far from the truth — in fact, they cut straight through the CCP’s lies.

Wednesday, December 30, 2020

How a Vindictive Classmate and a Cowardly University Ruined a Girl’s Life

By Isaac Schorr

Wednesday, December 30, 2020

 

A Racial Slur, a Viral Video, and a Reckoning

 

That’s how the New York Times headlined its hit piece on a college freshman for something she had said as a high school freshman. Mimi Groves was still a child when she said, in a Snapchat recording, “I can drive” followed by the “n-word” — the racial slur.

 

Jimmy Galligan, a half-black student who graduated from Heritage High School in Virginia this past spring with Groves, obtained this video during their senior year. By his own account, he waited until Groves had been accepted to, and had chosen to enroll at, the University of Tennessee-Knoxville to release the video — which went viral.

 

The resulting firestorm led to a torrent of abuse, and to an ultimatum from the University of Tennessee to Groves: withdraw voluntarily or have your offer of admission rescinded. Groves, who is white, chose the former and is now taking courses at a local community college instead of at her dream school — the reckoning.

 

Should the former two have led to the third on the scale that Groves is now facing? Any reasonable person would say no. Even conceding the obvious — she shouldn’t have used that slur in any context — there’s little indication she used it out of hatred for black people. In fact, the context seems clear: Groves said it casually, as hundreds of hip-hop tracks do every year. That doesn’t excuse the behavior, which should be considered unacceptable. But there is an important distinction between that and using the slur with animus, which was obviously not her intention.

 

There are many to blame for what’s happened. If Groves can be held responsible for a poor decision rendered in her mid-teens, surely Galligan can be as well for deliberately trying to ruin a classmate’s life four years later — a worse offense committed at a more mature age. But regardless of Galligan’s culpability, institutions such as the University of Tennessee and the New York Times are far more deserving of scorn than either of these Virginia teens.

 

At the university, cowardice won the day. Facing calls on social media for Groves’s acceptance to be rescinded, administrators bowed to pressure from a vocal minority, forgoing what was right to do what was most convenient. It was easier for university officials to hang Groves out to dry than to withstand the intense but brief storm themselves. So that’s what they did.

 

Their decision had nothing to do with racial or any other kind of justice. They didn’t care if Groves would feel “comfortable on campus” — language they used to persuade her to withdraw prior to handing her the ultimatum — and they didn’t honestly believe that black students on campus would be at risk were she to enroll. The only thing that mattered to them was escaping the situation with as little effort and scrutiny as possible. Forget taking a stand and explaining why they wouldn’t punish a young woman for a mistake she made as a child. It was all about damage control. I wonder how many of us would ultimately qualify for acceptance to the University of Tennessee were we held to the same standard as Mimi Groves from our freshman years of high school onward.

 

And at the Times, disgraceful (yet now familiar) behavior also won out. To signal approval of Galligan’s behavior to readers without outright endorsing it, Dan Levin, the article’s author, notes that Galligan had “made a decision that would ricochet across Leesburg, Va., a town named for an ancestor of the Confederate general Robert E. Lee and whose school system had fought an order to desegregate for more than a decade after the Supreme Court’s landmark ruling.” The ridiculous implication is that the name of Groves’s town and its opposition to integration over 50 years ago justified her treatment.

 

Levin adds that “the story behind the backlash also reveals a more complex portrait of behavior that for generations had gone unchecked in schools in one of the nation’s wealthiest counties, where Black students said they had long been subjected to ridicule” before going on to share the stories of students who were forced to endure appalling racist treatment from their classmates or even had “Underground Railroad” games forced on them in gym class. As maddening as these stories are, they describe people guilty of far worse than Groves’s offense. Levin’s attempt to blur the lines between her case and more damning ones is contemptible — or worse.

 

Levin also records an anecdote from Galligan that helpfully illuminates just how wrong what Galligan did was:

 

Mr. Galligan thinks a lot about race, and the implications of racial slurs. He said his father was often the only white person at maternal family gatherings, where “the N-word is a term that is thrown around sometimes” by Black relatives. A few years ago, he said, his father said it aloud, prompting Mr. Galligan and his sister to quietly take him aside and explain that it was unacceptable, even when joking around.

 

Just a few paragraphs later:

 

For his role, Mr. Galligan said he had no regrets. “If I never posted that video, nothing would have ever happened,” he said. And because the internet never forgets, the clip will always be available to watch.

 

“I’m going to remind myself, you started something,” he said with satisfaction. “You taught someone a lesson.”

 

For his father, Galligan calmly explained why using slurs — even casually — is wrong. For Groves, he summoned national opprobrium on her and her family and denied her the opportunity to attend her dream school. Those who are so unforgiving as to seek this retribution, so cowardly as to grant it, and so dishonest as to excuse it are broken.

If Biden Wants to Heal the Nation, He Should Make the Presidency Small Again

By Kevin D. Williamson

Saturday, December 26, 2020

 

Joe Biden says he wants to “heal America” as president. The problem for Biden is that the president and, perhaps more important, the presidency thrive on crisis. It is wars and other national emergencies (real and imagined) that have facilitated the radical expansion of the executive office from FDR onward. Keeping the nation in a state of crisis is good for presidents — and good for their hangers-on, who feed parasitically on the swollen executive in chief.

 

Biden comes into office in an age of big presidents, Barack Obama and Donald Trump among them. But he also comes into the presidency after having spent nearly 40 years in the Senate. If he truly wants to heal the nation, cooperation and consensus should be at the center of his agenda, as they should be central to everybody else’s approach, too: Bipartisanship and consensus are not sentimental feel-good virtues — they are necessary to creating stable public policy and the prosperity that rests on that stability. That doesn’t mean pretending that our disagreements are not disagreements; it means not treating our disagreements as civil war.

 

Rather than follow the worst instincts of his executive predecessors, Biden should pay some attention to this week’s rare moment of deep agreement between two very different legislators: Representative Alexandria Ocasio-Cortez (D., N.Y.) and Senator Mike Lee (R., Utah).

 

Ocasio-Cortez complained loudly that legislators were given only a few hours to read the recent coronavirus-relief bill, which runs 5,593 pages and is the second-largest spending measure ever passed by Congress. Lee voiced his agreement and elaborated in a video, describing how it has come to pass that Congress forwards bills read in their entirety by none of its members and stitched together by a small number of leaders, a half dozen or so, who expect their colleagues to simply proceed on faith and loyalty. When the Bronx socialist and the Utah Republican come down on the same side of a question, it’s time to pay attention.

 

Biden, who has long experience in the legislative branch, could use his clout as president to push for a return to the steady and stable, if plodding and irritating, process of making law through the lawmaking process, rather than allowing the constant threat of government shutdowns or the blockage of genuine emergency measures to keep the U.S. government in a state of politically induced crisis. Ocasio-Cortez was right to observe that the current model isn’t legislating, but hostage-taking.

 

The root problem in Congress and the root problem of the presidency are the same problem: The U.S. government has been operating in a semi-permanent state of emergency for decades. Congress has abandoned “regular order” — the committee-based process by which the House and the Senate adopt a budget and produce a series of appropriations bills, working out their differences in conference before sending legislation to the president. In its place we have had a long series of emergency measures, a continuing resolution here and a Frankenstein omnibus bill there, in a process dominated by congressional leadership and, hence, by partisanship. This is a near guarantee of polarization and short-term thinking.

 

If Biden wants to heal the nation, then the answer isn’t a heroic presidency elevated to the state of a national priest-king, as though the Oval Office were the Chrysanthemum Throne. The answer is a smaller presidency led by a smaller president, one who allows our legislators to do their jobs in the way that the Founders intended.

 

And Joe Biden, that blessed mediocrity, may be just the right man for that job.

New Alcohol Guidelines a Victory for Science over Politics

By Michelle Minton

Wednesday, December 30, 2020

 

It may be impolite to talk politics at the dinner table, but when it comes to government advice on what we eat, political agendas are usually baked in. Updated every five years, the official Dietary Guidelines for Americans have long been the subject of intense lobbying, with special interests vying for recommendations that favor their respective industry or point of view. The latest iteration, finalized by the government this week, was no different.

 

But, this time around, evidence won out over political agendas.

 

Back in July, the Dietary Guidelines Advisory Committee, a group of 20 experts selected by the government to review the evidence, suggested the guidelines substantially cut recommended levels of sugar and alcohol intake. That triggered backlash from the food and beverage industry. But one recommendation in particular provoked an outcry from the scientific community: to halve the limit on alcohol intake for men.

 

For five decades or more, the scientific literature on alcohol has consistently shown a strong link between moderate intake and better health outcomes, compared with either total abstinence or binge drinking. As such, government alcohol advice since the 1990s has recommended that women have no more than one drink a day and that men have no more than two drinks daily.

 

Yet the government’s advisory panel recommended that the limit for men be reduced to just one drink a day. Such a change implies that the evidence about alcohol intake — on which the panel supposedly based its recommendations — has dramatically shifted in recent years. But that simply isn’t the case. The suggested changes weren’t based on evidence at all but, rather, on the apparent agenda of certain members of the panel.

 

For some years, a growing contingent of activist-academics has set out to convince the world that there is no safe level of alcohol intake, no matter what the science says. Thus, even though the expert panel’s report conceded that the evidence shows moderate alcohol consumption is associated with lower mortality than total abstinence, and even though the panel found just one study comparing the risks of one drink a day with those of two, it still recommended halving the upper limit on alcohol intake for men.

 

One study is hardly evidence enough to make such drastic changes to national guidelines, but as several prominent experts pointed out, evidence had little to do with the decision. Six Harvard researchers, half of whom previously served as guidelines advisers, wrote in a letter to the government that the current panel appeared to have cherry-picked evidence to “support a pre-determined and, in our view, unscientific conclusion.” H. Westley Clark, former director of the Substance Abuse and Mental Health Services Administration, put it more bluntly, noting that the guidelines “should not be a sleight of hand vehicle for Prohibition.”

 

It is unlikely that most of the panel members want outright prohibition, but, as their report notes, the hope is that the guidelines will influence public policy and thereby lead to changes in consumption. Indeed, while most Americans may simply ignore the Dietary Guidelines, the guidelines have a significant impact on how the government approaches dietary issues. They inform how the government regulates sales, promotion, and taxation, and what types of research it chooses to fund. For activists hoping to convince officials of the importance of addressing Americans’ alcohol consumption, and even of funding their research on the topic, changing the guidelines is a critical first step.

 

Luckily, those at the USDA and HHS who make the final decision on the Dietary Guidelines saw through the advisory panel’s political agenda, rejected the recommended changes, and chose to preserve the Obama-era alcohol-intake guidelines. Some have predictably responded by accusing the Trump administration of bowing to alcohol-industry influence. But, as Brandon Lipps, deputy undersecretary for food, nutrition, and consumer services at the USDA, said, the new limits proposed by the advisory panel simply did not meet the “preponderance of the evidence” standard. In this one instance, at least, science trumped politics.

Tuesday, December 29, 2020

‘Scary’ Monsters

By Kevin D. Williamson

Tuesday, December 29, 2020

 

One of the words I would abolish from our political lexicon is “scary.” It is an insipid, empty adjective with its roots in “one weird trick”–style digital gimmickry, beloved of such master click-baiters as the editors over at Vox. A recent example comes from our friends (“If a man’s character is to be abused, there’s nobody like a relation to do the business”) over at The Bulwark, which carried a headline reading: “The Scary Spectacle of Trump’s Last Month in Office.”

 

(The piece, by Brian Karem, opens: “Some may think of these as ‘the last days of Pompeii.’ If that reference strikes you as too erudite to be fitting, you might prefer to think of the month ahead as ‘the last days of chaos in a blender.’” To borrow from Margaret Thatcher: If you have to tell people you’re erudite. . . . And The Last Days of Pompeii was inescapable as a miniseries on ABC — as allusions go, not exactly Finnegans Wake.)

 

“Scary” used in this way is irritating for half a dozen reasons. One of them is that it is a base-stealing stratagem, a way of suggesting, usually in a headline, that the following matter is shocking or revelatory. And what follows almost always is something that is neither shocking nor revelatory. In the Bulwark piece, the “scary” headline is undercut by the copy itself: “The final days of the Donald Trump administration are upon us, and they look much like every other day at the White House for the last four years.” To which some might reply: “Oh, but every other day at the White House for the last four years has been scary, too!”

 

In which case, grow the . . . heck . . . up.

 

Scary is a weak and dishonest means of gaining influence, with the writer ordering readers to feel a certain way about a subject rather than causing them to feel that way, which takes a little bit of effort and skill. If you were to read a straightforward account of the crimes of Jack the Ripper, nobody would need to tell you that you should be shocked and disgusted by them. Margaret Thatcher did not need to tell anybody that she was a lady, or that she was powerful — the facts of the case were enough.

 

But the facts of the case are not always with you. A particularly dopey CNN headline over a particularly dopey Chris Cillizza piece reads: “The Republican convention just proved this scary fact about the GOP.” The “scary fact” is that the most prominent voices and faces of the Republican Party in 2020 exhibited a cultish devotion to the president, and that they were willing to subordinate “what the party believes in” to the project of seeing him reelected. “Scary” means “causing fright or alarm.” Because this is not 1972, it is not even surprising, much less frightful, to discover that Republican bosses do not believe in much. Slavish and, indeed, idolatrous devotion to presidents has been a fact of American political life for a long time now, from John Kennedy to those Hollywood dolts singing literal hymns of praise to Barack Obama a few years back. Cillizza repeats several bits of over-the-top praise of Trump at the 2020 convention from the likes of Kimberly Guilfoyle and Charlie Kirk. And Guilfoyle and Kirk are very much representative of current Republican leadership.

 

That isn’t scary — the word you’re looking for here is embarrassing.

 

New York magazine, in a rare display of wit, offered a spin on the formulation: “On Guns, Liberals Are Flirting with the Politics of Fear. That’s Scary.”

 

Scary also contributes to another regrettable aspect of our political journalism: Putting the writer at the center. All of that performative “empathy” (they mean sympathy) in our journalism and our politics is a way of saying: “Look at me! Because what is interesting here is not how these poor people are suffering from famine or drought or plague or the aftermath of an earthquake but how I feel about it — and how I feel about it reflects very well on me, indeed!” Scary is a minor-league version of that: “All these socialists in the Democratic Party are scary! Really scary! Really, really scary! See, I feel exactly the same way you rubes do, which means you can trust me — and, now, a word from our sponsors!”

 

Leaning on scary causes much of our political journalism to read as though it were written and edited by eleven-year-olds (the wrong kind of eleven-year-old — you know the type), but it also obscures the actual considerations before us by fortifying the good-guys/bad-guys, white-hats/black-hats approach to politics. Intelligent and responsible political debate recognizes that very little of what is in controversy in our public life has to do with unadulterated good and evil, or absolutes of any kind, but instead involves balancing competing goods, weighing tradeoffs, and prioritizing among legitimate interests. Take, for example, the recent debate over a new round of COVID-relief checks. It can be simultaneously true (1) that this measure will have the unhappy effect of reinforcing and legitimizing the “waiting for my check” model of government and (2) that it is necessary or prudent in the current situation. Each of those must be taken into consideration and weighed against the other. That is what functional political debate mostly does. Declaring those with views and priorities different from your own scary — or extreme or whatever other moralistic adjective is in vogue at the moment — isn’t a way to advance that debate, but a way to avoid having the debate at all, and to prevent its being had by others. And it is necessary for us to have the real debate: Even where there are genuine moral fundamentals at stake — as in the matter of abortion — we still are faced with competing goods, and a democratic conversation that cannot acknowledge that and make provision for it eventually will produce a democracy that is ineffective and unstable.

 

Democracy is much more the result of conversations than the result of votes. It is for this reason that in modern totalitarian regimes, under which the people have access neither to the ballot box nor to the ammunition box, rulers who will never face election and who face scant prospect of revolution nonetheless find it necessary to suppress and distort public discourse. If people are permitted to speak freely, then they will start to figure things out — and that is indeed scary, if you are a tyrant. The emerging totalitarian tendency in the United States is in much the same way focused on suppressing, perverting, and controlling language. For modern political journalism, the pursuit of public influence and the pursuit of commercial success are conjoined by sheer quantifiable reach — commodity eyeballs — which is why such nonsense as scary so frequently disfigures otherwise erudite outlets.

The Appalling Hypocrisy of Woke Corporations

By Madeleine Kearns

Tuesday, December 29, 2020

 

The infuriating thing about “virtue signaling” is not only that it is easy but also that it is profitable. Real virtue, especially courage, demands something of you. Not so with today’s prevailing sanctimony. Pose with a fashionable cause and the reward is instantaneous. Post something woke on Facebook. Kid yourself you’re brave. Reel in the likes. Enjoy the dopamine.

 

Alternatively, if you are a corporation, woke-washing can help boost your brand. A good example of this is when Nike recruited Colin Kaepernick for its “Dream Crazy” commercial in which the former NFL quarterback said, “Believe in something, even if it means sacrificing everything.” Shortly thereafter, Nike’s stock rose by nearly 5 percent. The company even won an award for “outstanding commercial” at the Creative Arts Emmys. Supported by Kaepernick’s woke intervention and social-justice street cred, Nike managed to rake in $6 billion. Such is its commitment to racial justice that in the week after the George Floyd riots, it released another ad urging people not to “sit back and be silent” but rather to “be part of the change.”

 

One would assume, then, that this noble corporate giant, so attuned to and invested in civil rights and social justice, would be just as vocal in opposing slavery in the 21st century. Apparently not. Earlier this year, reports from Congress and the Australian Strategic Policy Institute (ASPI) found evidence “strongly suggest[ing] forced labour” of the Uyghur Muslims and other minorities under the Chinese government. According to the ASPI, “Uyghurs are working in factories that are in the supply chains of at least 82 well-known global brands in the technology, clothing and automotive sectors, including Apple, BMW, Gap, Huawei, Nike, Samsung, Sony and Volkswagen.”

 

The vice president of global footwear sourcing and manufacturing at Nike told the British parliament that “Nike does not source any raw cotton. And regarding Xinjiang, Nike has confirmed with its suppliers that there are no spun yarns or textiles manufactured in the area in our products.” But something doesn’t quite add up. The New York Times highlighted reports of “Uighur workers in a factory in Qingdao that makes Nike shoes.” And along with Coca-Cola, Nike has been busy lobbying Congress to weaken the Uyghur Forced Labor Prevention Act, which passed the House by a margin of 406–3 in September. Is this their idea of being “part of the change”?

 

Those who imagine that they would have been part of the anti-slavery movement had they been born in previous centuries ought to be greatly exercised by the reports coming out of China. The ASPI investigation found that at least 80,000 Uyghurs were transferred out of Xinjiang to work in factories across China between 2017 and 2019, some coming directly from detention camps. (They note that this number is merely an estimate and likely to be “far higher.”) The report continues:

 

In factories far away from home, [Uyghurs] typically live in segregated dormitories, undergo organized Mandarin and ideological training outside working hours, are subject to constant surveillance, and are forbidden from participating in religious observances. Numerous sources, including government documents, show that transferred workers are assigned minders and have limited freedom of movement.

 

Even more horrifying is the separate Associated Press report that documented the existence of family-detention centers where Uyghur women are subject to forced sterilization and abortions, as well as other appalling physical and psychological abuse. Just weeks after the AP’s report, drone footage of Uyghurs being blindfolded, handcuffed, and led onto train cars went viral, too. Indeed, the evidence of China’s grave human-rights abuses and use of forced labor can hardly be ignored. Every time American consumers come across the words “Made in China” on a clothing label, they ought to recoil. Nevertheless, the onus is still on the retailers to stop cooperating with slave masters.

 

Corporate wokeness is the worst kind because of its hypocrisy and blatant self-interest. The Co-op launched a “gender-neutral” gingerbread person. Starbucks came up with a special mermaid cookie to help raise money for a sinister child sex-change charity. Marks & Spencer had an LGBT sandwich (the “G” being guacamole). As customers, we are all well acquainted now with emails and company messaging about pride month and Black Lives Matter. We have all grown accustomed to this nauseating pretense. And yet, when it comes to a highly profitable form of modern-day slavery — where are the woke corporations? Silent or covering themselves.

Behold, the Delivery Revolution

By Rich Lowry

Tuesday, December 29, 2020

 

It’s been a terrible year for the American worker, with a notable bright spot courtesy of one of the tech firms in the crosshairs of regulators and lawmakers.

 

If someone had said early in 2020, “A company is going to hire hundreds of thousands of non-college-educated workers during the pandemic at well above the minimum wage,” you’d think there’d be huzzahs all around.

 

That’s what the online retailer Amazon has done, but it still gets brickbats for how it pays and treats its workers. Representative Alexandria Ocasio-Cortez said the other day that Amazon jobs are a “scam.”

 

If so, a swath of the American workforce is falling for the grift. Since July, the online retailer has hired 350,000 workers and now employs 1.2 million people globally. This is a historic hiring binge. According to the New York Times, “the closest comparisons are the hiring that entire industries carried out in wartime, such as shipbuilding during the early years of World War II.”

 

On top of this, the company provides work for roughly half a million truck drivers.

 

Amazon has been buoyed by the surge of online retail during the pandemic, which has accelerated and entrenched e-commerce. Companies such as Walmart and Target have benefited, too, but Amazon leads the pack.

 

It overwhelmingly hires high-school graduates. It doesn’t ask for a résumé, gives its workers about a day of training, and then puts them on the job in its fulfillment centers.

 

The difficulty of the work shouldn’t be underestimated — it is taxing, repetitive, and so highly regimented that it would make the legendary apostle of industrial efficiency Frederick Winslow Taylor blush.

 

Yet, we’ve long complained about losing assembly-line jobs for non-college-educated workers. Amazon is hiring people for what is the 21st-century equivalent of such jobs, which were — despite the nostalgia for them — also tough and physically demanding.

 

It can’t be that office work is now the only acceptable form of employment in America.

 

Amazon began paying its workers $15 an hour in 2018. If that rate rings a bell, it’s the number for the federal minimum wage that Senator Bernie Sanders and AOC have long been lobbying for, to little effect (it remains $7.25 an hour).

 

The evidence is that when a behemoth like Amazon pays more, it prompts competitors to follow suit.

 

It’s hard to review what Amazon has done over the past year and consider it the work of a corporate monster. The company had an unlimited unpaid-time-off policy for its workers when the pandemic began.

 

It hired temporary workers to replace those who stayed home and to deal with the surge of business, then kept most of them on and began hiring on top of that.

 

It’s been offering signing bonuses of up to $3,000, and hiring in places in the country where no one else is.

 

According to the research of Michael Mandel at the center-left Progressive Policy Institute, Amazon fulfillment center jobs pay 31 percent more than retail jobs at brick-and-mortar stores, where pay has basically been stagnant for three decades.

 

Mandel points out that it’s wrong to simplistically think of Amazon and other e-commerce outfits as replacing brick-and-mortar stores.

 

What they are really replacing is the labor that consumers undertake on their own to shop for goods — driving to a store, walking up and down the aisles, making the selection, loading it, and taking it home. Someone making a purchase through Amazon essentially hires a network of workers to do all of that for him.

 

What Amazon, and e-commerce more broadly, is doing is selling goods to consumers at low prices, while giving them more convenience than ever before (rapid delivery to their doorsteps, with the possibility of easy returns) and creating new jobs in the process.

 

By all means, jawbone the company to treat workers better, but don’t lose sight of the scale of its achievement — and how many Americans are employed because of it.

Monday, December 28, 2020

The Coming Global Backlash against China

By Helen Raleigh

Monday, December 28, 2020

 

The Chinese Communist Party’s leader, Xi Jinping, is the most powerful leader in Communist China since Chairman Mao. Yet Xi’s outward strongman image is a veneer over his inner insecurity. When he came to power in late 2012, China’s economy had slowed from double-digit growth to single-digit growth, and its massive working-age population, which had been the engine of China’s economic growth, had begun to decline. The Center for Strategic and International Studies (CSIS), a Washington, D.C.–based think tank, projects that by 2030, “China will round out its thinning labor force by hiring workers from abroad.” At the same time, according to Mark Haas, a political-science professor at Duquesne University, “China alone in 2050 will have more than 329 million people over 65.” Consequently, China is expected to be the first major economy that will grow old before it achieves widespread prosperity.

 

Without its demographic dividend and with an aging population, China’s economic growth will slow further just when the government needs to keep its growing middle class from demanding a level of political freedom to match their newfound wealth. An aging population will also force the government to allocate more national resources to elder care and social services, leaving fewer resources with which to compete against the U.S. This is probably one of the most important reasons Xi feels he has to abandon the so-called strategic-patience guidance issued by Deng Xiaoping, the paramount leader of China from 1978 to 1997, who instructed his comrades to bide their time and avoid any confrontation with powerful external forces until China was in a much stronger position both economically and militarily.

 

Xi, however, believes that China can’t afford to bide its time any longer. It must replace the liberal world order with a Sino-centric world order before China’s population becomes too old and the Chinese economy too stagnant. Yet rather than furthering economic reform and opening up more sectors to foreign investment and competition to strengthen the economy, Xi chose to hide China’s weaknesses and exaggerate its strengths. He emphasizes self-reliance and utilizing China’s resources to pump up “national champions,” state-owned enterprises that could compete against global leaders in strategic sectors. Xi sees nationalism as his new trump card, something he can use to motivate, excite, and unite a billion people, all the while strengthening the CCP’s rule over them. Others say that his inward-looking nationalist policies are leading China into the very middle-income trap — in which China’s level of development stalls out before reaching the heights of other modern industrial nations — that Xi and his predecessors tried very hard to avoid.

 

Yet the more the Chinese economy slows down, the more Xi feels the need to project a strongman image both abroad and, especially, at home. As Wang Gungwu and Zheng Yongnian, two Chinese scholars, wrote in China and the New International Order, this dynamic has deep roots in Chinese history: “China’s internal order was so closely related to her international order that one could not long survive without the other; when the barbarians were not submissive abroad, rebels might more easily arise within. Most dynasties collapsed under the twin blows of inside disorder and outside calamity, nei luan wai huang, that is, domestic rebellion and foreign invasion.”

 

Xi is keenly aware that he is vulnerable to internal rebellion. He has purged more than 1.5 million government officials, military leaders, and party elites. His trade war with the U.S. is deeply unpopular inside China because it has caused economic pain: rising unemployment, factory closures, and the shifting of global supply chains out of China. Xi knows very well that if he shows any sign of weakness, he may end up like his political rival Bo Xilai, a princeling currently languishing in a notorious Chinese prison for high-level party officials.

 

In addition, Xi saw former U.S. President Obama as a “weak” leader of a nation on its way to inevitable decline, which opened up an unprecedented opportunity for China. Xi also has certain milestones in view: 2021, the 100th anniversary of the founding of the Chinese Communist Party, and 2049, the 100th anniversary of the founding of Communist China. He wants to do something big to cement his place in history as each arrives. Therefore, in his mind, the era of hiding strength and biding time is over. He wants to show the world a new set of policies, actions, and attitudes that match China’s powerful status.

 

For a while, Xi was succeeding. Internally, he ruthlessly cracked down on religious believers, political dissenters, party officials, and business elites. He also built a mass surveillance state that turned the dystopian nightmare imagined in George Orwell’s 1984 into a reality. Internationally, he imposed his strong will on businesses and nations big and small through his signature project, “One Belt, One Road.” The way Xi sees it, the more other countries become economically dependent on China, the more he can dominate them peacefully, without having to use force. One commentator has observed that Xi “resembles a clenched fist. At home, he is clenching hard to assert his control. To the outside world, he is a hard-thrusting force determined to get his way.” Xi’s fist has conditioned many nations, including the Western democracies, to believe that China is stronger than it actually is and that its global dominance is inevitable. Therefore, few are willing to challenge China’s human-rights violations at home or its assertive behavior abroad.

 

But even the most powerful emperor can fly too close to the sun. Dissenting voices inside China are getting louder, and the global backlash against China reached new heights in 2019. Then the 2020 coronavirus outbreak stripped away the facade of Xi’s powerful image, revealed deep flaws in the CCP’s dictatorial political system, caused immense anger and frustration among the Chinese people, did serious damage to China’s international image, and brought China’s seemingly unstoppable rise to a halt. As the prominent Hong Kong entrepreneur Jimmy Lai has written, “The more Mr. Xi pursues his authoritarian agenda, the more distrust he will sow at home and abroad. Far from transforming Beijing into the world’s leading superpower, his policies will instead keep China from taking its rightful place of honor in a peaceful, modern and integrated world.” Xi has misread the situation and overplayed his hand; his aggressive policies at home and abroad have backfired, proving the saying: Those whom the gods would destroy, they first make mad.

America’s COVID Detractors Owe the U.S.A. an Apology

By Cameron Hilditch

Monday, December 28, 2020

 

At the beginning of June, Aris Roussinos, a contributing editor at Unherd, wrote a piece arguing that “Covid has exposed America as a failed state.” It was chock-full of the kind of tired, recycled clichés one often comes across in old-world screeds directed at the United States and its tenure as a global hegemon. I wrote a quick reply at the time because I’m a great admirer of Roussinos’s writing and felt then (and still do) that his hatred of American foreign policy won out over his higher critical faculties when he wrote that piece. The notion that the United States had been dealt its civilizational death blow by this virus seemed facially absurd even then.

 

Six months later, I can’t help but notice that the first two coronavirus vaccines out of the gate have been produced by Pfizer and Moderna, two American companies. The vast majority of the world’s citizens who have been inoculated so far are direct beneficiaries of Uncle Sam’s latest feat of dizzying ingenuity.

 

It’s a story that shouldn’t surprise anyone with a passing acquaintance with the history of the last hundred years. The U.S.A. has ceaselessly been written off as a decadent, materialistic, soulless, and uncultured wasteland by preening Europeans. But when sh** hits the fan and the American people turn their energies toward a single great task, the story usually ends with men on the moon, life-saving medicines, or Sherman tanks rolling through the gates of Dachau.

 

Mr. Roussinos and others who danced prematurely on America’s grave this year will be pleased to learn that the United States accepts apologies in both written and verbal form.

What Are the Consequences of Left-Wing Campus Culture?

By Isaac Schorr

Monday, December 28, 2020

 

Having developed a mild-to-moderate case of misanthropy from my own political exploits in college, I was admittedly disinclined to take the Twittersphere’s word for it that a “new study shows” that the higher-education system does not push students to the left. When one of the study’s architects, Professor Logan Strother of Purdue University, even went so far as to suggest that it disproved the assumption of “many conservative pundits and politicians” that “college education ‘indoctrinates’ students into liberal or leftist ideologies,” I felt compelled to at least give it a read. But upon closer examination, my initial suspicions were confirmed: The methodology and findings of Professor Strother’s study should do very little to assuage the fears of conservative critics of higher education.

 

The study aimed to measure the effect that roommates had on each other’s political development and the effect of the university in general on students’ views. This was done by asking freshmen to classify themselves as “far right,” “conservative,” “middle-of-the-road,” “liberal,” or “far left,” once early in the fall semester and again in the spring semester. The results showed slight shifts away from “liberal” and toward “middle-of-the-road” and “conservative” self-identification. The researchers then performed a two-tailed t-test that found the rightward shift to be statistically significant, leading them to the following conclusion:

 

Our study shows that the ideology of first-year college students in our sample does not change much over the course of their first year on campus, contrary to the stated fears of many high-profile conservative pundits. Moreover, to the extent that there are aggregate changes, they are generally in the conservative direction. . . .
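To make the mechanics concrete, here is a minimal sketch, in Python, of the kind of two-tailed test the researchers describe. The one-to-five ideology coding, the paired design, and the sample responses below are illustrative assumptions of mine, not data from the study.

from scipy import stats

# Hypothetical 1-5 self-placements ("far left" = 1 ... "far right" = 5)
# for the same ten students, once in the fall and once in the spring.
fall = [2, 3, 2, 4, 3, 2, 3, 3, 2, 4]
spring = [2, 3, 3, 4, 3, 3, 3, 4, 2, 4]

# Paired, two-sided t-test on the fall-to-spring change in self-placement.
result = stats.ttest_rel(spring, fall)
print(f"t = {result.statistic:.3f}, p = {result.pvalue:.3f}")

# A positive t-statistic with a small p-value would be read, as in the
# study, as a statistically significant aggregate shift toward the
# "conservative" end of the scale.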

 

I wish I could say that the study should alleviate the “fears of many high-profile conservative pundits.” But it shouldn’t, so I can’t.

 

Its most obvious design flaw is that it only measures, or rather purports to measure, changes in ideology over the course of students’ freshman year. Using the results of such a study to declare that college doesn’t make students more liberal is like using a three-point lead at the end of the first quarter of a football game to declare yourself the victor. And in this context, it betrays a fundamental misunderstanding of how students’ ideologies change over the course of their time in college.

 

By and large, students don’t become radicals because they spend their first or second semester being taught introductory sociology or political science by a far-left professor who inspires them to pretend to have read and agreed with Marx. It’s more often a gradual process that sees them become increasingly progressive as they climb the ranks of student organizations and take on leadership roles. Most freshmen arrive on campus wanting to think of themselves as open to others’ perspectives, even conservative ones. But as they pursue positions in advocacy, pre-professional, and student-government groups, as well as prestigious honor societies, the incentives shift. It becomes unfashionable to remain friendly with the Republican from their freshman-year floor. In some cases, their social circles demand that they publicly denounce campus conservatives.

 

Another glaring issue with the study’s methodology is how it measures ideology. By asking students what they consider themselves to be, rather than posing a series of basic questions about their political views, the researchers throw into doubt the validity of their data and the conclusions they draw from it. Although they claim to be gauging political views, what they are really gathering is students’ sense of where they fit on a continuum of ideologies. Not only does this fail as a measure of students’ actual positioning relative to the population, it is also skewed by the environment they’re in. After students spend eight months taking classes taught by liberal instructors and living among a disproportionately liberal population, it stands to reason that they might think of themselves as a bit more conservative, regardless of their actual beliefs.

 

Finally, even the researchers admit that since this study took place during the 2009–2010 school year, its present-day usefulness is limited. A decade is a lifetime in American politics. When the study was conducted, President Barack Obama still opposed gay marriage. The emergence of a more radical sexual politics and the rehabilitation of far-left economic views by Bernie Sanders had yet to transform our polity, and nowhere has the impact of those developments been felt more acutely than in the university.

 

Funnily enough, even granting all those caveats, I would still agree with the study’s authors that college does not “indoctrinate” most students. Most students — like most Americans — take only a passing interest in politics if they take any interest in it at all. I would suspect that while the average student moves leftward as an undergraduate, to describe this change as “indoctrination” is to overstate it. But for those students who seek out positions of relative power on campus and real power in government and journalism afterward, there is no doubt that the leftward shift experienced in college is much more dramatic.

 

The portrait painted by some genuinely concerned conservatives and some cynical ones of all-encompassing leftist indoctrination on campus is misleading. Professors’ bias is a problem, but it is far from the biggest issue with campus political culture. Peer pressure and the resulting incentives for radicalization are the primary cause of a leftward shift in students’ politics over their college careers, and this shift is largely concentrated among the politically interested and “movers and shakers” in campus politics. The straw man of totalized “indoctrination” thus does conservatives a disservice: It gives liberal researchers an excuse to ignore the very real consequences of left-wing campus culture, an excuse they are all too willing to take advantage of.

Sunday, December 27, 2020

The Federalist Papers: Instruction Manual for the Constitution

By Charles C. W. Cooke

Thursday, December 03, 2020

 

Most countries don’t have an instruction manual. But, then, most countries don’t need one. In its brilliance, in its longevity, and in its dual role as outline for legitimate government and incubator of well-considered ideals, the United States Constitution is unique. At many points in history, nations have elected to rewrite their rulebooks, but rarely have those who were charged with the task thought so seriously and assiduously while doing so. Critics of the American settlement sometimes sneer that its champions believe it to be “perfect.” We do not, although the Burkeans among us may believe that it is superior to anything that might realistically replace it. But we do believe that it is the product of genius and of deliberation, and that it ought to be taken seriously as a result. Contrary to what you’ve likely been told by our new class of Jacobins, the Constitution is not arbitrary, it is not a “big lie,” and it is not a self-interested prophylactic that was designed and sustained by the ruling class. It is a masterwork of practical application and moral philosophy. Yes, it contains compromises, but they are mostly compromises born not of grubby self-dealing or contingent circumstance but of sustained argument. Those arguments remain instructive. Better still, they remain persuasive.

 

Those looking for information about why the Constitution says what it says will find a treasure trove waiting for them in the archives. There is a well-organized set of notes from the Constitutional Convention, a less well-organized assortment of briefs from the subsequent fights over ratification, and the reams upon reams of newspaper commentary that was designed to illuminate, to obfuscate, and more. But the jewel in the crown is undoubtedly The Federalist Papers, an 85-entry compilation of contemporary missives that comprises the work done by James Madison, Alexander Hamilton, and John Jay to convince their fellow Americans that it was worth junking the existing Articles of Confederation and adopting a new constitution in its place. It is a tour de force that, among other things, helps us to understand why the American order — a mixture of humble mists-of-time traditionalism and elastic classical liberalism — ended up as it did.

 

Much hay has been made of the fact that the charter that Madison, Hamilton, and Jay were hawking was devised in secret by a small, self-selected group of men who had not in any meaningful sense been recruited to redraft the republic. From time to time, one even sees the process described as a “coup.” And yet the mere existence of The Federalist Papers reminds us how flippant a characterization this is. As the authors make clear from the outset, ratification of the new constitution was by no means a given. The task of the project, Hamilton explains in Federalist No. 1, was “to give a satisfactory answer to all the objections which shall have made their appearance, that may seem to have any claim to your attention.” Those objections were real, taking their most potent form in a series of eloquent counter-arguments — known today as the Anti–Federalist Papers — that would prove convincing enough to the public to guarantee that ratification would be conditioned on the adoption of the Bill of Rights, which neither Hamilton nor Madison considered necessary, given that it seemed to contradict the enumerated-powers doctrine. The question before the people, Hamilton proposed while introducing the effort, was whether Americans would enjoy a form of “good government from reflection and choice, or whether they are forever destined to depend, for their political constitutions, on accident and force.” And, as he was at pains to confirm, the only “force” available to him and his colleagues flowed directly from the nibs of their pens. The charter for which the trio was proselytizing may indeed have been written by a cabal. But its adoption would be debated by everyone.

 

Ultimately, the authors of The Federalist Papers prevailed — either directly, by influencing those who read them, or, more commonly, by influencing other writers who echoed their arguments around the new country. Were that the project’s sole legacy, it would have represented a smashing success — perhaps the greatest marketing exercise in history. But it has turned out to be much more than just that. Within a year of ratification, it had become apparent that, as a set of explanatory strictures, The Federalist Papers could prove key to the maintenance of the new order — a maintenance that, given the recent experience with the Articles of Confederation, was by no means guaranteed. And so they have continued to prove. Thomas Jefferson hoped that the new Constitution would continue to mean what it meant when it was ratified and thereby avoid transmutation into “blank paper by construction.” Jefferson’s wish has not always been granted, but, when it has, The Federalist Papers have played a foundational role in the preservation. Originalism is in part a historical exercise, and it relies for its integrity upon our having enough contemporary explanations, arguments, and definitions to put the matter of the Constitution’s binding meaning beyond a reasonable doubt. There is no better compendium of such explanations, arguments, and definitions than The Federalist Papers. Want to know what a word meant back in 1787? To learn the broadly understood intent and scope of a particular provision? To know what the authors and their allies explicitly did not mean — even what they resented being told they meant? To understand federalism or counter-majoritarianism or separation of powers? There’s a set of papers for that.

 

The fight for the original public meaning of the Constitution is, at root, a political fight. Whatever the truth, and whatever writings we have, it will remain the case that if the people do not want to keep their highest law, that law will eventually be discarded. But it’s a lot tougher simply to make up that meaning when the guys who wrote and defended the thing are on record with clear, concise, and definitive explanations of their own. It should be no surprise, perhaps, that, next to the Constitution itself, The Federalist Papers have been the most commonly cited source by the Supreme Court since it began pronouncing on questions of constitutional interpretation. We would be better off if the other branches paid as much attention as do the courts — and if the citizenry followed suit. There’s an education in those documents — for those who choose to take it.

Free Markets Can Appeal to the Working Class

By Amity Shlaes

Thursday, December 03, 2020

 

You can’t get the votes, idiot.

 

That’s how policy analysts rebut anyone who suggests that the best Republican platform is the old one. They say the era of advancing the abstractions of traditional liberty, austere government, and low taxes is past; to win, Republicans have to act like Democrats, offering social programs, child credits, and cooperation with labor unions. Or, as Julius Krein, the editor of American Affairs, put it in a recent William F. Buckley Jr. Program event at Yale: “The Democrats represent the ascendant economic winners. I don’t think that another lecture on Hayek is going to change that for anybody.” Markets come second, and federal spending doesn’t matter anyhow — modern monetary theory assures us of that. This is the moment of Hillbilly Elegy, not Free to Choose.

 

But evidence from another era — the 1920s, especially the presidency of Calvin Coolidge — suggests that sticking to free markets can get Republicans votes, even from the working class. And, far more important, sticking to markets can yield an economy that benefits those workers — along with everyone else.

 

The mood of the nation in 1923, the year Warren Harding died and Coolidge succeeded him as president, was strikingly similar to that of today. Progressivism was on the march, even within the Grand Old Party. The 1920 Republican platform, emphasizing free markets at home, already looked retro. Progressives were in the process of founding their own party, ripping away a daunting share of the Republican constituency. Coolidge’s prospects for winning election in 1924 were far from assured. Harding had besmirched the presidency and the free-market ideal itself with Teapot Dome, a scandal in which the president’s cabinet and friends favored friends in the privatizing of oil-rich government lands. Only high earners paid federal income taxes then, so the GOP’s income-tax-cut plans were ridiculed just as capital-gains treatment for carried interest is today: sops to the rich. Nor was all well in the economy: Commodity prices had plunged, with many farmers facing foreclosure.

 

As a new president, Coolidge had the chance to retool his party, introduce compassion, disavow magnates as robber barons, aid farmers, and woo back some Progressives. In the race for economic primacy, America had moved ahead of Britain, but sustaining first place was anything but sure. In Britain, politicians were opting for what today’s politicians would call social-democratic moves: London’s compassion included the then-new dole, an unemployment payment. Perhaps American Republicans, too, should opt for markets with a human face. That was the bet of Republican comers such as Herbert Hoover. Hoover, a technocratic consultant, a sort of pre-Romney, publicly ridiculed proponents of pure laissez-faire philosophy.

 

The other choice for the man moving into the White House was to push the old, frosty, abstract Republican program: austerity cuts, spending vetoes, tax cuts for high earners, and support of freer markets. The party could then explain — a tough challenge — that the results would trickle down to the lower earner. Having observed the absurdities of Prohibition enforcement in real time, Coolidge had no intention of writing further “pro-family” laws. The better policy for firming up American primacy, Coolidge wagered, was to opt for old and cold; he was betting that even blue-collar workers would understand.

 

The accidental president started his work by pushing a policy Republicans wouldn’t dare to articulate today: austerity. “I am for economy,” he declared, “and after that I am for more economy.” The farm lobby expected that the Vermont-born Coolidge would accept the McNary-Haugen bill, a bipartisan agricultural-subsidy measure voted through by both houses. But Coolidge vetoed McNary-Haugen twice, along with dozens of other “compassionate” bills.

 

Presidents who lead by example get more support for their policies. At the White House itself, Coolidge also modeled austerity, going so far as to lay off a housekeeper, Mrs. Jaffray, who had been there since the Taft administration. Her habit of frequenting costly specialty shops irritated the chief executive. Beguiled by the then-young concept of the economy of scale, Coolidge sent his staff to shop at a supermarket. He advertised his commitment to business and markets everywhere, right down to his pets: When Coolidge received a gift of twin lion cubs, he named them not “Champ,” “Bo,” or “Barney” but “Tax Reduction” and “Budget Bureau.” By the time Coolidge left office, he had, through upright behavior, wiped away the Harding stain on the presidency.

 

With his Treasury secretary, Andrew Mellon, Coolidge mounted a tax-cut campaign. To get his first income-tax cuts, Coolidge had to give Progressives a concession, instantly dubbed the Peeping Tom rule: Under the new law, authorities affixed the amount of taxes that individuals paid beside their names on the walls of post offices across the land. (The New York Times helped out the class warriors by publishing an alphabetical list of taxpayers and their payments.) Once the 1924 cut was law, Coolidge managed another, taking the top rate down to a Reagan-like 25 percent. The unions? Coolidge mostly ignored them.

 

Coolidge practiced this policy not because he didn’t care for the disadvantaged but because he did. He believed, and the evidence began to show, that lower taxes on high earners would cause them to do the best thing one can do for workers: provide more jobs. Coolidge supported federal spending on education from time to time, and explicitly endorsed subsidy for the medical school at Howard University. He loudly affirmed blacks’ right to run for office and signed a law that ensured all Native Americans were citizens. But Coolidge did not infantilize voters or ghettoize groups with direct payments.

 

Voters warmed to Coolidge’s strategy of cold. In 1924 the Progressives mounted their own candidate, Bob La Follette, who won a Perot-esque 17 percent of the vote (the Democratic candidate, John W. Davis, garnered 29 percent). Coolidge won an absolute majority, 54 percent, and, now elected in his own right, doubled down on budgeting and low taxes. The economy grew strongly in the 1920s. Unemployment stayed low, and union membership declined, as did membership in the Ku Klux Klan. Most Americans, including those hillbillies who went north to work in Detroit, benefited: The 1920s were the years when the factory workweek went from the traditional six days to five. The shift was possible owing not to unions but to productivity gains. In short, the wealth really did trickle down. The class-war cartoon of America’s economy faded. In his 1926 tax law, Coolidge even managed to get through a section repealing the creepy Peeping Tom provision.

 

Coolidge had his warts: His support for tariffs, especially those imposed on beleaguered Cuba, hurt the cause of democracy. (If you wanted to twist this argument hard enough, you could argue that Coolidge set the stage for the rise of figures such as Fidel Castro.) But Coolidge backed free markets strongly enough for America to pull farther ahead of Britain. By the late 1920s, the word “dole” had become a pejorative so strong that even President Roosevelt would eschew it. America of the balanced budget had gained a firm purchase on the No. 1 spot among economic powers.

 

The first takeaway here is that ur-Reaganite policy can succeed even in a progressive moment. Second, character does matter — in individual leaders’ comportment more than in allegedly pro-family legislation. Third, “hillbilly policy” disses hillbillies. The current hillbilly narrative notwithstanding, Americans are essentially alike in their desire to see their children do better than they have. Education and opportunity, not universal-income payments layered on top of other forms of welfare, are still the answer. Yet Republican strategists insist on crafting ultra-targeted faux-federalist fusion “nudge” packages à la Cass Sunstein in the name of securing votes.

 

The irony here is that the American invincibility that Coolidge helped to establish appears to obviate the urgency of Coolidge-level fiscal discipline today. But only appears to. That’s because one day soon, the dollar, and with it our economy, is likely to be challenged by a serious competitor. The prima facie evidence that distrust of the dollar already looms is the improbable rise of Bitcoin. A more credible dollar challenger will emerge one day. Then simple, small-“r” republican values will be worth their weight not only in Bitcoin but also in euros or gold. And then the party that has ostentatiously abjured these principles won’t get the votes.