Thursday, October 31, 2019

Is California Becoming Premodern?


By Victor Davis Hanson
Thursday, October 31, 2019

More than 2 million Californians were recently left without power after the state’s largest utility, Pacific Gas and Electric — which filed for bankruptcy earlier this year — preemptively shut down transmission lines for fear that they might spark fires during periods of high autumn winds.

Consumers blame the state for not cleaning up dead trees and brush, along with the utility companies for not updating their ossified equipment. The power companies in turn fault the state for so over-regulating utilities that they had no resources to modernize their grids.

Californians know that having tens of thousands of homeless in their major cities is untenable. In some places, municipal sidewalks have become open sewers of garbage, used needles, rodents, and infectious diseases. Yet no one dares question progressive orthodoxy by enforcing drug and vagrancy laws, moving the homeless out of cities to suburban or rural facilities, or increasing the number of mental hospitals.

Taxpayers in California, whose basket of sales, gasoline, and income taxes is the highest in the nation, quietly seethe while immobile on antiquated freeways that are crowded, dangerous and under nonstop makeshift repair.

Gas prices of $4 to $5 a gallon — the result of high taxes, hyper-regulation, and green mandates — add insult to the injury of stalled commuters. Gas tax increases ostensibly intended to fund freeway expansion and repair continue to be diverted to the state’s failing high-speed rail project.

Residents shrug that the state’s public schools are among the weakest in the nation, often ranking in the bottom quadrant in standardized test scores. Elites publicly oppose charter schools but often put their own kids in private academies.

Californians know that to venture into a typical municipal emergency room is to descend into a modern Dante’s Inferno. Medical facilities are overcrowded. They can be as unpleasant as they are bankrupting to the vanishing middle class that must face exorbitant charges to bring in an injured or sick child.

No one would dare to connect the crumbling infrastructure, poor schools, and failing public health care with the non-enforcement of immigration laws, which has led to a massive influx of undocumented immigrants from the poorest regions of the world, who often arrive without fluency in English or a high-school education.

Stores are occasionally hit by swarming looters. Such Wild West criminals know how to keep their thefts under $950, ensuring that such “misdemeanors” do not warrant police attention. California’s permissive laws have decriminalized thefts and break-ins. The result is that San Francisco now has the highest property crime rate per capita in the nation.

Has California become premodern?

Millions of fed-up middle-class taxpayers have fled the state. Their presence as a stabilizing influence is sorely missed. About one-third of the nation’s welfare recipients live in California. Millions of poor newcomers require enormously expensive state health, housing, education, legal, and law-enforcement services.

California is now a one-party state. Democrats have supermajorities in both houses of the legislature. Only seven of the state’s 53 congressional seats are held by Republicans. The result is that there is no credible check on a mostly coastal majority.

Huge global wealth in high-tech, finance, trade, and academia poured into the coastal corridor, creating a new nobility with unprecedented riches. Unfortunately, the new aristocracy adopted mindsets antithetical to the general welfare of Californians living outside their coastal enclaves. The nobodies have struggled to buy high-priced gas, pay exorbitant power bills, and deal with shoddy infrastructure — all of which resulted from the policies of the distant somebodies.

California’s three most powerful politicians — House speaker Nancy Pelosi, Senator Dianne Feinstein and Governor Gavin Newsom — are all multimillionaires. Their lives, homes, and privileges bear no resemblance to those of other Californians living with the consequences of their misguided policies and agendas.

The state’s elite took revolving-door entries and exits for granted. They assumed that California was so naturally rich, beautiful, and well-endowed that there would always be thousands of newcomers who would queue up for the weather, the shore, the mountains, and the hip culture.

Yet California is nearing the logical limits of progressive adventurism in policy and politics.

Residents carefully plan long highway trips as if they were ancient explorers charting dangerous routes. Tourists warily enter downtown Los Angeles or San Francisco as if visiting a politically unstable nation.

Insatiable state tax collectors and agencies are viewed by the public as if they were corrupt officials of Third World countries seeking bribes. Californians flip their switches unsure of whether the lights will go on. Many are careful about what they say, terrified of progressive thought police who seem more worried about critics than criminals.

Our resolute ancestors took a century to turn a wilderness into California. Our irresolute generation in just a decade or two has been turning California into a wilderness.

Obama Is Right To Criticize Cancel Culture, But The Left Isn’t Listening


By John Daniel Davidson
Thursday, October 31, 2019

Speaking at an Obama Foundation event in Chicago on Tuesday, former president Barack Obama made headlines when he criticized “woke” cancel culture and online outrage, cautioning young people that “The world is messy. There are ambiguities. People who do really good stuff have flaws.”

“This idea of purity and that you’re never compromised and you’re always politically woke — you should get over that quickly,” Obama said.

His remarks are noteworthy not just because he’s correct about the frailty of human nature and the folly of rushing to condemn anyone who might disagree with you, but because his admonishment was aimed primarily at the left. Cancel culture is, after all, almost entirely a product of progressive activists seeking to punish anyone who doesn’t agree with them, doesn’t support their agenda, or holds views they find offensive. It’s the opposite of what “liberal” used to mean, and Obama’s old enough to know that.

His comments on Tuesday are worth quoting in full, because they highlight just how far the left has drifted from Obama’s current way of thinking. He said:

I do get a sense sometimes now among certain young people, and this is accelerated by social media — there is this sense sometimes of the way of me making change is to be as judgmental as possible about other people, and that’s enough. If I tweet or hashtag about how you didn’t do something right or used the wrong verb, then I can sit back and feel pretty good about myself. Did you see how woke I was, I called you out… That’s not activism. That’s not bringing about change. If all you’re doing is casting stones, you’re probably not going to get that far.

This isn’t the first time Obama has talked this way. At a town hall in 2015, he lambasted “coddled” liberal college students who can’t tolerate conservative speakers on campus. “Anybody who comes to speak to you and you disagree with, you should have an argument with them, but you shouldn’t silence them by saying you can’t come because I’m too sensitive to hear what you have to say,” he said.

To the Left, Cancel Culture Is Good

The irony here is that the left, which once lionized Obama and to some extent still does, isn’t interested in what he has to say about any of this. Today, cancel culture and online outrage are the modus operandi of progressive activists on campus and across social media.

Often enough, that outrage migrates offline. We’ve all seen the images and videos of students shouting down speakers, storming stages, ripping mics away from moderators, and shutting down conservative speaking events through intimidation. In the case of Charles Murray’s visit to Middlebury College in 2017, an actual mob attacked him and a faculty member, who suffered whiplash and a concussion.

As with most things that start on campus, cancel culture is now seeping into the mainstream, and especially in corporate America. Just last month, “Saturday Night Live” fired comedian Shane Gillis after videos of him using racial slurs went viral, despite a seemingly heartfelt apology from Gillis. Nike canceled its Betsy Ross flag sneakers this summer after Colin Kaepernick criticized the design as offensive. The Academy Awards told comedian Kevin Hart he had to step down as host of the Oscars unless he publicly apologized for some supposedly homophobic jokes he told years earlier. The list goes on.

For the left, this is all good news. The pressure that online outrage and cancel culture exert on celebrities, schools, corporations—even regular people no one’s ever heard of, like Iowan Carson King—is a welcome development in our society, not something to lament. In a recent take on cancel culture at the New Republic, Osita Nwanevu put a benign gloss on the whole thing because, after all, these are just tweets we’re talking about, it’s not like anyone’s really getting hurt: “Perhaps we should choose instead to understand cancel culture as something much more mundane: ordinary public disfavor voiced by ordinary people across new platforms.” (A recent Washington Post op-ed takes it a step further, declaring, “Critics of ‘cancel culture’ really just hate democracy.”)

Nwanevu argues that everyone who’s been “canceled” is doing fine. Aziz Ansari has a new Netflix special. Kyler Murray, the Heisman winner who last year had to apologize for homophobic tweets he’d written as a teen, is now the starting quarterback for the Arizona Cardinals. Dave Chappelle is a multi-millionaire.

The larger point here is that, far from being a problem, cancel culture is actually a good and salutary development in public life—a way for the marginalized and powerless to hold the powerful accountable (if by “powerless” you include all major universities, Hollywood, and most of corporate America). At the very least, writes Nwanevu, cancel culture “cannot really be understood as a response to the advent of an oppressively censorious monoculture,” because look, Sean Spicer was on “Dancing With the Stars.”

Obama Helped Sow the Seeds of Cancel Culture

But perhaps the best example of how far to the left progressives have drifted from Obama’s way of thinking is the ongoing spectacle of 2020 Democratic presidential candidates trying to out-woke each other. Beto O’Rourke, who’ll apparently say anything in hopes of staying in the race, wants to tax churches and other religious institutions that oppose gay marriage. Julian Castro wants to give free health insurance to transgender illegal immigrants. Even former vice president Joe Biden, the so-called moderate in the race, has been forced to tack left, abandoning his long-held support for the Hyde Amendment, which prohibits federal funding for abortion.

It’s commonplace on Democratic debate stages for the candidates (and moderators) to refer to President Trump as a racist. The arguable frontrunner, Sen. Elizabeth Warren, told NPR in August that “when the white supremacists call Donald Trump one of their own, I tend to believe them.”


It’s hard to imagine Obama engaging in this sort of rhetoric. But then again, the seeds of cancel culture and leftist outrage were undoubtedly planted during his tenure in the White House. Back in 2012, for example, Obama would never have openly called Mitt Romney a racist. But he didn’t have to because surrogates like Democratic National Committee Chairwoman Debbie Wasserman Schultz did it for him.

Ahead of the 2014 midterms, Rep. Nancy Pelosi said Republicans were reluctant to take up immigration reform because of race. That same year, Obama’s Attorney General Eric Holder suggested race was a factor in the “unprecedented, unwarranted, ugly and divisive adversity” he faced from GOP lawmakers.

Beyond race, it was Obama who popularized the idea that “the debate is over”—about health care, climate change, education, and much else. Arguably, “the debate is over” is now the mantra of the woke: if you disagree or offend, you get canceled.

So it’s all well and good that Obama is concerned about the corrosive effects of cancel culture and militant wokeness. He should be, as should we all. But his warnings and reprimands are a day late and a dollar short. The woke left isn’t listening, not even to Obama. To them, the debate is over.

First Common Core High School Grads Worst-Prepared For College In 15 Years


By Joy Pullmann
Thursday, October 31, 2019

For the third time in a row since Common Core was fully phased in nationwide, U.S. student test scores on the nation’s broadest and most respected test have dropped, a reversal of an upward trend between 1990 and 2015. Further, the class of 2019, the first to experience all four high school years under Common Core, is the worst-prepared for college in 15 years, according to a new report.

The National Assessment of Educational Progress is a federally mandated test given every other year in reading and mathematics to students in grades four and eight. (Periodically it also tests other subjects and grade levels.) In the latest results, released Wednesday, American students slid yet again on nearly every measure.

Reading was the worst hit, with both fourth and eighth graders losing ground compared to the last year tested, 2017. Eighth graders also slid in math, although fourth graders improved by one point in math overall. Thanks to Neal McCluskey at the Cato Institute, here’s a graph showing the score changes since NAEP was instituted in the 1990s.




“Students in the U.S. made significant progress in math and reading achievement on NAEP from 1990 until 2015, when the first major dip in achievement scores occurred,” reported U.S. News and World Report. Perhaps not coincidentally, 2015 is the year states were required by the Obama administration to have fully phased in Common Core.

Common Core is a set of national instruction and testing mandates implemented starting in 2010 without approval from nearly any legislative body and over waves of bipartisan citizen protests. President Obama, his Education Secretary Arne Duncan, former Florida Gov. Jeb Bush, Bill Gates, and myriad other self-described education reformers promised Common Core would do exactly the opposite of what has happened: improve U.S. student achievement. As Common Core was moving into schools, 69 percent of school principals said they also thought it would improve student achievement. All of these “experts” were wrong, wrong, wrong.

“The results are, frankly, devastating,” U.S. Education Secretary Betsy DeVos said in a statement about the 2019 NAEP results. “This country is in a student achievement crisis, and over the past decade it has continued to worsen, especially for our most vulnerable students. Two out of three of our nation’s children aren’t proficient readers. In fact, fourth grade reading declined in 17 states and eighth grade reading declined in 31.”

On the same day the NAEP results were released, the college testing organization ACT released a report showing that the high school class of 2019’s college preparedness in English and math is at seniors’ lowest levels in 15 years. These students are the first to have completed all four high school years under Common Core.

“Readiness levels in English, reading, math, and science have all decreased since 2015, with English and math seeing the largest decline,” the report noted. Student achievement declined on ACT’s measures among U.S. students of all races except for Asian-Americans, whose achievement increased.

ACT was one of the myriad organizations that profited from supporting Common Core despite its lack of success for children and taxpayers. Its employees helped develop Common Core and the organization has received millions in taxpayer dollars to help create Common Core tests.

“ACT is one of the best barometers of student progress, and our college-bound kids are doing worse than they have in the ACT’s history,” said Center for Education Reform CEO Jeanne Allen in a statement.

These recent results are not anomalies, but the latest in a repeated series of achievement declines on various measuring sticks since Common Core was enacted. This is the opposite of what we were told would happen with trillions of taxpayer dollars and an entire generation of children who deserve not to have been guinea pigs in a failed national experiment.

Perhaps the top stated goal of Common Core was to increase American kids’ “college and career readiness.” The phrase is so central to Common Core’s branding that it is part of the mandates’ formal title for its English “anchor standards” and appears 60 times in the English requirements alone. Yet all the evidence since Common Core was shoved into schools, just as critics argued, shows that it has at best done nothing to improve students’ “college and career readiness,” and at worst has damaged it.

While of course many factors go into student achievement, it’s very clear from the available information that U.S. teachers and schools worked hard to do what Common Core demanded and that, regardless, their efforts have not yielded good results. A 2016 survey, for example, found “more than three quarters of teachers (76%) reported having changed at least half of their classroom instruction as a result of [Common Core]; almost one fifth (19%) reported having changed almost all of it.”

An October poll of registered voters across the country found 52 percent think their local public schools are “excellent” or “good,” although 55 percent thought the U.S. public school system as a whole is either just “fair” or “poor.” Things are a lot worse on both fronts than most Americans are willing to realize.

Compared to the rest of the world, even the United States’ top school districts only generate average student achievement, according to the Global Report Card. Common Core was touted as the solution to several decades of lackluster student performance like this, which has deprived our economy of trillions in economic growth; fixing it, we were told, would lift millions of Americans out of poverty. That was when U.S. test scores, while mediocre and reflecting huge levels of functional illiteracy, were better than they are now.

It is thus still the case, as it was when the Coleman Report was released 53 years ago, that U.S. public schools do not lift children above the conditions of their home lives. They add nothing to what children already do or do not get at home, even though we know from the track record of the distressingly few excellent schools that this is absolutely possible and therefore should be non-negotiably required. But because the people in charge of U.S. education lose neither power nor credibility when American kids fail, and in fact profit from that failure, we can only expect things to get worse.

Wednesday, October 30, 2019

Rachel McKinnon Is a Cheat and a Bully


By Madeleine Kearns
Tuesday, October 29, 2019

Rachel McKinnon — the so-called defending “world champion” of women’s track cycling — is a man. I’ll repeat that so my meaning cannot be misconstrued. He is a man.

Maybe my kind-hearted reader is offended by this blunt phrasing. Why am I calling McKinnon a man — when, perhaps for complicated reasons, he would rather be called a woman? Why don’t I compromise and call him a “trans woman,” as others do? Or be polite and address him by “she/her” pronouns, like everyone else in the media?

Well, I’ll tell you why, since you asked. This is precisely the well-meant, tragically naïve logic that has enabled a structure of lies and tyranny to be erected around us, a structure that most cannot opt out of without incurring an enormous social cost. It is a structure in which cheating and viciousness are rewarded while civility and truth-telling are punished. Rachel McKinnon is the perfect example of how this structure works and operates, as well as why we should resist it.

For context: McKinnon lived unambiguously as a man (called “Rhys”) until the age of 29. In addition to male puberty, he has had a full experience of modern academia where he developed a particular enthusiasm for the philosophy of lies (literally) and for “gender studies.” Graduating first from the University of Victoria in British Columbia, he completed a Ph.D. from the University of Waterloo with a thesis on assertions, “Why You Don’t Need to Know What You’re Talking About” (the literal subtitle). Later, he published a book on this subject titled The Norms of Assertion: Truth, Lies and Warrant, in which he argues “that in some special contexts, we can lie.” Which contexts might those be?

While serving as an associate professor at the College of Charleston, S.C., McKinnon decided to get into sport cycling. (Fair.) He set the 200-meter sprint record for women in the 35–39 age range in 2018, and then won the UCI Masters World Track Cycling Championship in the Women’s Sprint. (Not fair.)

This month, he defended his title. From the news last week: “Rachel McKinnon successfully defended her track World Championship title in Manchester,” per Cycling Weekly;  “Prominent trans rights campaigner McKinnon has defended her right to compete,” per the BBC; “[McKinnon] found herself defending her title against a critic — the president’s son,” per CBS News; “McKinnon keeps dominating women’s cycling. And she keeps creating controversy all the way,” per the New York Post.

So, what’s this got to do with the culture at large? First, by pretending that McKinnon is not a man, we have allowed him to cheat at sports at the expense of his female competitors. Because McKinnon being a man is directly relevant to the argument that he should not compete against women, in calling him something other than a man, we obfuscate that argument — and all for the sake of a very recently invented set of blasphemy norms (e.g. “misgendering” and “deadnaming”) that don’t apply to us non-believers.

Second, by pretending that McKinnon is not a man — but rather a vulnerable woman — we have forsworn all expectations of accountability and decency. The most egregious example of this, and the precise moment I decided to stop lending McKinnon special courtesies, was when he celebrated the terminal illness of a young woman, Magdalen Berns, whom I held (and still hold) in great esteem.

Berns believed strongly that men cannot be women. As she lay on her deathbed in Scotland, at the age of 36, surrounded by her loved ones, McKinnon tweeted that he was “happy” when bad people died, that this feeling is “justified,” that Berns is a “trash human,” and further advised his followers “don’t be the sort of person who people you’ve harmed are happy you’re dying of brain cancer.” By contrast, here is a characteristically civil, clear and courageous quote from Berns: “it’s not hate to defend your rights and it’s not hate to speak the truth.”

Men can be so rude sometimes.

Other women have tried to articulate similar sentiments with regard to McKinnon. Take Jen Wagner-Assali, who, after coming in third to McKinnon at the UCI Masters Track World Championship in 2018, tweeted: “it’s definitely NOT fair.” After being bullied, Wagner apologized to McKinnon for causing offense. But that wasn’t enough, as McKinnon explained. “The apology is not accepted: she still thinks what she said. She merely apologizes for being caught saying it publicly.”

She still thinks it’s not fair for a man to beat her in the women’s category? Just imagine!

McKinnon then lashed out at the tennis star and longtime defender of sexual minorities, Martina Navratilova, who wrote in the Sunday Times of London that to allow men to compete against women was to permit “cheating.” Already, trans athletes had “achieved honours as women that were beyond their capabilities as men,” Navratilova argued, worrying that other women would also be “cowed into silence or submission.” McKinnon called Navratilova a “transphobe,” and demanded that she apologize.

Evidently, it’s not only sportswomen McKinnon has issues with. It’s journalists and women’s-rights campaigners, too. When a spokesperson from Fair Play for Women was invited by the BBC to discuss Navratilova’s comments, McKinnon wrote on Twitter: “I will not participate in a discussion panel that takes them seriously and gives them a platform.” The BBC subsequently disinvited them.

McKinnon was strikingly rude and sneering to Abigail Shrier, a gentle writer for the Wall Street Journal, when she appeared on Fox Nation with him to discuss women’s sports. As well as baselessly calling Shrier a “transphobe,” McKinnon has tweeted that others who disagree with him ought to “die in a grease fire,” a comment which resulted in a temporary suspension from the platform, much to his irritation.

So, can you compromise or appease a tyrant? You can certainly try. In a surprisingly balanced interview with Sky News — in which the interviewer explained that the science shows that even after taking testosterone suppressants, men retain indisputable physiological advantages that are especially pronounced in a sport like track — McKinnon explained why he thinks skeptics like me, who consider the science of sex, are wrong:

I’m legally and medically female. But the people who oppose my existence still want to think of me as male. They use the language that I am a man . . . If you think of trans women as men then you think there’s an unfair advantage.

Of course, nobody is questioning McKinnon’s existence — for how could the continually aggressive presence of such an unpleasant man be denied? What is being disputed is his belief that he is a woman and his sense of entitlement to compete against actual women. But for those who might be more sympathetic, or for those who don’t know quite how much of a thug he is, he makes the classic cartoon-villain mistake: overreach. Those who are not with him entirely, he explains, must be entirely against him:

[Sport] is central to society. So, if you want to say, “I believe you’re a woman for all of society except this massive central part of sport” then that’s not fair. So, fairness is the inclusion of trans women.

As it happens, I do not have an ideological commitment to gender terminology or pronouns one way or another. For struggling, respectful souls, I’m happy to lend special courtesies (in fact, I frequently do). But for cheats and liars, for bullies and tyrants, for those who seek to use my words to propagate deceit and injustice? Oh, just drop it, sir — I’ll never call you “ma’am.”

Elizabeth Warren’s Untenable Plans


By Michael Tanner
Wednesday, October 30, 2019

If you’ve been having trouble finding someone to walk your dog, don’t worry. Any day now, Elizabeth Warren will announce “a plan for that.” It will undoubtedly be comprehensive, detailed, and replete with subsidies for lower- and middle-class dog walkers and underserved breeds. It will cost tens of billions of dollars and will receive widespread positive notice from the media. However, to judge by her other recent plans, the one thing it won’t include is any discussion of how she plans to pay for it.

The Massachusetts senator has challenged and possibly overtaken former vice president Joe Biden as the front-runner for the Democratic nomination, largely based on having a plan for the government to tackle every problem facing this country, no matter how big or how small, from issues with military housing to Puerto Rican debt to climate change.

The price tag for this massive expansion of government is enormous. Much of the attention in recent weeks has been focused on Warren’s embrace of Medicare for All, which she refuses to admit would require an increase in middle-class taxes. Even Vermont senator Bernie Sanders has conceded that such proposals, which would cost $30–40 trillion over 10 years, cannot be financed without tax hikes. Warren’s refusal to address this obvious fact makes her look less like a would-be policy wonk and more like a typical politician.

But even setting aside Medicare for All, Warren’s plans are likely to dump oceans of red ink onto our growing national debt. Her non-health-care spending proposals already total some $7.5 trillion over the next 10 years. Although these are not quite Bernie levels of government largesse, her proposals would still require nearly double our current levels of spending.

To pay for all this, Warren proposes a variety of tax hikes, mostly designed to hit corporations or high-earners: higher payroll taxes for those earning more than $250,000 per year; a 7 percent profits tax on companies earning more than $100 million; a 60 percent lobbying tax on firms that spend a million or more on lobbying, and so forth. But the biggest chunk of money would come from Warren’s proposed “wealth tax,” a 2 to 3 percent levy on net worth above $50 million. Warren estimates that this wealth tax will pull in more than $2.75 trillion over ten years. It won’t.

First, there is the slight problem that a wealth tax is probably unconstitutional. Of course, constitutional constraints are quaint notions in the Age of Trump. Regardless, it is worth noting that the Constitution allows the federal government to impose “direct taxes,” such as a property tax, only if they are apportioned among the states by population. That’s why it required a constitutional amendment to enact the federal income tax. Many constitutional scholars warn that a wealth tax is a direct tax rather than an income tax, and thus could not survive without apportionment.

Even if Warren can find a way around the constitutional guardrails — perhaps with something like a retrospective wealth tax, collected only when a taxpayer sells assets or passes away — a wealth tax is unlikely to raise anywhere near the amount of money she predicts.

Simply look at Europe’s experiments with wealth taxes. At one time, a dozen European countries imposed wealth taxes. Today, all but three have abandoned those levies. Among those repealing their wealth tax are the Scandinavian social democracies that Warren admires, Denmark, Finland, and Sweden. Norway retains a wealth tax but has significantly reduced it in recent years. Additional countries abandoning the tax include Austria, France, Germany, Iceland, Ireland, Luxembourg, and the Netherlands. Other countries, such as Great Britain, have considered wealth taxes and rejected them.

They did so because wealth taxes are administratively complex and difficult to enforce. Also, they significantly reduce investment, entrepreneurship, and, ultimately, economic growth. According to the Organization for Economic Cooperation and Development, European wealth taxes raised, on average, only about 0.2 percent of GDP in revenues. By comparison, the U.S. federal income tax raises 8 percent of GDP.

Two groups, however, would benefit substantially from a wealth tax. The tax would be a full-employment opportunity for the tax-preparation industry and for lawyers. After all, we would now have to determine fair market value for everything from homes and vehicles to artwork and jewelry to family pension rights and intellectual property. The other big winner would be lobbyists, who could be expected to descend on Washington en masse seeking exemptions and exceptions for their clients. If you think the tax code is a mess today, just wait until D.C. is done with Warren’s plan.

There is an old Yiddish proverb that goes “Mann tracht, un Gott lacht,” or “Man plans, and God laughs.” It is all well and good that Senator Warren has a plan for everything. But until she actually figures out how to pay for everything without crippling our economy, such plans really don’t add up.

Judge To Reconsider Covington Catholic Teen’s Defamation Lawsuit Against Washington Post


By Margot Cleveland
Tuesday, October 29, 2019

The Washington Post has had a rough couple of days. First, the “Democracy Dies in Darkness” outlet spent its Sunday apologizing for the glowing obituary lede it gave the world’s most dangerous terrorist, Abu Bakr al-Baghdadi, “an austere religious scholar” who killed himself and three children with a suicide vest when cornered by U.S. forces. Then yesterday, the Post saw the victory it had achieved earlier this year in the defamation suit brought by Covington Catholic teen Nick Sandmann evaporate.

Sandmann had sued the Washington Post after the Post and a bevy of other media outlets cast him “as a smirking MAGA-hat-wearing racist who had blocked Native American elder Nathan Phillips’s path to the Lincoln Memorial,” as I reported in July. At the time, Sandmann and his Covington Catholic High School classmates were waiting for their bus after having attended the March for Life in D.C. A short clip of the incident “captured Phillips playing his drum and singing in the center of the group of Covington Catholic students,” and Phillips told reporters “students had swarmed him as he attempted to make his way up to the Lincoln Memorial.”


The Washington Post ran Phillips’s tale and either linked to or referenced the video snip that had since gone viral. Later, however, full videos came to light that captured the entire incident, and those “showed that Phillips had not attempted to make his way to the Lincoln Memorial, but had instead marched into the group of students and stood in front of Sandmann, beating his drum and singing.”

While the Post and other media outlets quickly issued corrections, the teenage Sandmann had already been “branded a smirking racist and rendered a subject of scorn throughout the country.”

Sandmann proceeded to sue the Washington Post, and others who ran Phillips’s false claims, for defamation. His suit against the Post proceeded in a federal district court in the young man’s home state of Kentucky. In July, Judge William Bertelsman, a semi-retired Jimmy Carter appointee, tossed Sandmann’s case.

In a 30-page opinion, Judge Bertelsman held that, as a matter of law, Sandmann could not prevail against the Washington Post on his defamation claim. The court reasoned that Sandmann’s claim failed because the Post’s articles were not statements “concerning” Sandmann, as they referenced the group of students or did not include a name or picture of Sandmann. He further ruled that the Post’s reporting of the incident was “not factual” because its statements could not be proven “objectively incorrect.”


Judge Bertelsman’s original opinion was ripe for reversal by the Sixth Circuit Court of Appeals. But Sandmann’s attorneys, Todd McMurtry and Lin Wood, opted against appealing immediately, and instead asked Judge Bertelsman to reconsider his decision. Sandmann’s lawyers also requested permission to file an amended complaint.

Yesterday, over the Washington Post’s opposition, the court granted both motions, set aside its earlier decision dismissing Sandmann’s defamation case and accepted Sandmann’s amended complaint. In reinstating Sandmann’s defamation case, the court stressed that the amended complaint relied on the same 33 statements by the Post that had been identified in the original complaint.

Judge Bertelsman then noted that after giving “this matter careful review,” he had decided that Sandmann sufficiently alleged a claim for defamation against the Post based on the statements identified as “Statements 10, 11, and 33,” “to the extent that these three statements state that plaintiff ‘blocked’ Nathan Phillips and ‘would not allow him to retreat.’”

All three statements consisted of the Washington Post repeating Phillips’s fabled encounter with Sandmann. In what was identified as Statement 10, the Washington Post wrote: “It was getting ugly, and I was thinking: ‘I’ve got to find myself an exit out of this situation and finish my song at the Lincoln Memorial,’ Phillips recalled. ‘I started going that way, and that guy in the hat stood in my way and we were at an impasse. He just blocked my way and wouldn’t allow me to retreat.”


Statement 11 consisted of the Post publishing Phillips’s recounting of the event as such: “A few of the young people chanted ‘Build that wall, build that wall,’ the man said, adding that a teen, shown smirking at him in the video, was blocking him from moving.” The final statement, Statement 33, also quoted Phillips—as well as misstating his war record—“Phillips, who fought in the Vietnam War, says in an interview ‘I started going that way, and that guy in the hat stood in my way and we were at an impasse. He just blocked my way and wouldn’t allow me to retreat.’”

In ruling that these statements were sufficient to allow Sandmann’s case to move forward, the court also noted that in his amended complaint, Sandmann alleged in great detail “that Phillips deliberately lied concerning the events at issue,” and that but for the Washington Post’s negligence or malice, they would have realized as much.

Sandmann’s case is far from over. Now, the parties must engage in the costly and time-consuming process of discovery. And that’s where things could get really interesting—when the Post is forced to hand over internal communications concerning its reporting on the MAGA-hat-wearing Sandmann and his Catholic classmates who had just attended the March for Life.

For now, though, Nick’s father, Ted Sandmann, is just grateful Judge Bertelsman took the time to reconsider his ruling. “We are invigorated and united in our purpose to achieve justice for Nicholas and the other Covington Catholic boys so slandered by the Washington Post and other news organizations,” the senior Sandmann told me. “This is a huge win for justice and for the Sandmann family,” attorney Todd McMurtry noted, adding that “Lin Wood and I are united in our desire to obtain justice for Nicholas.”

Trump’s Best Option for Avoiding Impeachment: An Apology


By Jonah Goldberg
Wednesday, October 30, 2019

In l’affaire Ukraine, the president is guilty as charged. And the best strategy for him to avoid impeachment by the House and perhaps even removal by the Senate is to admit it, apologize, and let voters make their own judgment. It’s also the best way to fend off a disaster for Senate Republicans.

The president is accused — politically, not criminally — of trying to force the Ukrainian president to tar former vice president Joe Biden with an investigation into his alleged “corruption” in exchange for the release of military aid and a meeting in the Oval Office. I believe a plain reading of the rough transcript of a phone call between Trump and Ukrainian president Volodymyr Zelensky supports the charge. So does testimony from the top American diplomat in Ukraine, William Taylor, as well as several other Trump appointees and aides, including Tuesday’s testimony from Alexander Vindman, a National Security Council staffer who listened to the phone call. There’s still due diligence to be done, but it seems implausible they’re all lying.

Common sense also works against the president. If Trump were sincerely concerned about Ukrainian corruption, why has he never expressed similar concerns about corruption anywhere else? And, why, if the issue is Ukrainian corruption generally, did the Trump administration focus on the alleged corruption of a single Ukrainian firm, Burisma, where Biden’s son sat on the board?

The most plausible explanation is twofold. First, the corruption issue was a pretext; under the law, corruption concerns are the only justification for blocking congressionally approved aid. Second, Trump’s real goal was to bruise Biden. Indeed, according to Taylor, the White House said it would settle for a mere statement about Biden’s potential corruption — meaning Trump cared more about political gain than about an actual investigation.

Trump and his defenders are still pounding on outdated, unpersuasive, or irrelevant talking points. They rail about the identity and motives of the whistleblower who first aired these allegations, even though the whistleblower’s report has been largely corroborated by others. They claim that the process of the Democratic inquiry in the House is unconstitutional, which is ridiculous. They insist that hearings where Republicans can cross-examine witnesses are a “star chamber” or reminiscent of secret Soviet trials. Also ridiculous.

Republican complaints about the heavy-handed tactics of the Democrats have some merit, but they’ll be rendered moot when the Democrats move to public hearings or to a Senate trial. And when that happens, claims that the call was “perfect” and that there was no quid pro quo will evaporate in the face of the facts.

This is why the smartest Trump defenders are counseling the president to simply admit the obvious: There was a quid pro quo, and the president’s phone call fell short of perfection, but nothing he did is an impeachable offense.

As former federal prosecutor (and my old National Review colleague) Andrew McCarthy argues, by insisting there was no quid pro quo, the president made things much easier for Democrats. The implicit concession in Trump’s position is that if the charges were true, they would be impeachable. That is a burden of proof that no doubt warms the cockles of Adam Schiff’s heart. The smarter course is to admit it happened but, as McCarthy writes, “no harm no foul.”

I’d go one step further. Rather than take the Mick Mulvaney line and shout “get over it” — now a Trump-campaign T-shirt — the president should apologize. Trump’s refusal to admit any wrongdoing imperils GOP senators who are already reluctant to defend him on the merits. Once the process complaints expire, they’ll be left with no defense at all. Bill Clinton fended off removal in the Senate in no small part because he admitted wrongdoing and asked the country for forgiveness. Once he did that, he and his supporters were liberated to say the country should “move on.” It’s worth recalling that the first existential crisis of Trump’s 2016 campaign — his talk about groping women on the Access Hollywood tape — was averted by the first, and last, meaningful apology anyone can remember from him.

I disagree with those who say that the allegations against Trump are not impeachable. But, politically, apologizing could forestall impeachment by giving politicians and voters a safe harbor: “It was wrong, but he said he’s sorry. Move on.” The longer the president defends a lie, the more Americans will resent being lied to.

Of course, contrition doesn’t come easy for Trump and would be embarrassing for him and his media cheerleaders. But it would also give Republican candidates a rationale for opposing impeachment that they could sell.

Trump is fond of demanding ever more loyalty from Republicans. But loyalty is a two-way street. If he thinks they should defend him, he should give them something defensible to work with.

Tuesday, October 29, 2019

The Difficulty of Brexit Is the Case for Brexit


By Kevin D. Williamson
Tuesday, October 29, 2019

The difficulty of Brexit is the case for Brexit.

The argument against the continued submersion of the United Kingdom within the European Union is that it would constitute an indenture on British sovereignty, liberty, and democracy. That is obviously — obvious now, at least — a fact, and one need not take it as a moral judgment upon Brussels or the European project to acknowledge the fact.

The European Union indentures British sovereignty in the same way it indentures the sovereignty of its other member states, although the United Kingdom wisely arranged to keep control over its own monetary policy. The merging of sovereignties, which necessitates the subordinating of sovereignties, is the point of the European Union, its raison d’être. Again, that need not be understood as nefarious. All international agreements, whether bilateral or multilateral, do that.

But compare the British experience in the European Union with the U.S. experience in NAFTA. A European Union that was a kind of grand Continental NAFTA would have been preferable to and much more practical than a political union creating a half-baked United States of Europe, but set that aside for the moment. When the United States decided that NAFTA as it was no longer served U.S. interests, it had the choice of opening the treaty up for renegotiation or exiting it. (The Trump administration, demonstrating its usual discipline, has so far failed to secure the ratification of USMCA, the successor to NAFTA.) The other NAFTA countries enjoyed the same right of exit. Maintaining the right of exit is the difference between using sovereignty and losing sovereignty.

When the people of the United Kingdom voted to leave the European Union — rightly or wrongly, intelligently or meat-headedly, however you see it — that should have been that. But British sovereignty has become so entangled in European protocols as to render Brexit difficult if not quite impossible without the cooperation of the European Union itself. And that cooperation has not been exactly forthcoming. Brussels has worked to make Brexit as difficult, painful, and expensive for the United Kingdom as it can. It has even gone so far as to demand the creation of what amounts to a national border for goods within the United Kingdom, in effect permanently ceding a portion of British sovereignty to the European Union.

And that is the way in which the European Union indentures British liberty and democracy. There is more to liberty than simple unrestricted freedom of action. Liberty includes rights embedded in a particular political regime and legal context, meaning liberty under British government and British law of British making. The British people might legitimately have chosen another course of action — but they did not. And while majoritarian democracy is an instrument of limited legitimacy and applicability (which is why we Americans have a Bill of Rights — “unalienable rights” cannot be voted away), when a question is put to the people, either the result must stand or the people must conclude that they no longer enjoy sovereignty, liberty, or democracy.

The European Union is not the sort of repressive machine that might be described as “Orwellian.” It is more Kafkaesque, a Hotel California of a superstate whose constituents can check out any time they like but — ask Boris Johnson — they can never leave. That it is the British government currently begging Brussels for a delay does not change the character of that relationship, though of course the incompetence and stupidity of the British government must be taken into account.

The United Kingdom is now set to go through another election in an attempt to settle a question that should have been, in principle, settled by an election back in 2016. That the United Kingdom is finding Brexit so difficult to get done is the best argument there is for getting it done.

California Can’t Keep the Lights On


By Rich Lowry
Tuesday, October 29, 2019

California is staying true to its reputation as the land of innovation — it is making blackouts, heretofore the signature of impoverished and war-torn lands, a routine feature of 21st-century American life.

More than 2 million people are going without power in Northern and Central California, in the latest and biggest of the intentional blackouts that are, astonishingly, California’s best answer to the risk of runaway wildfires.

Power — and all the goods it makes possible — is synonymous with modern civilization. It shouldn’t be negotiable for anyone living in a well-functioning society, or even in California, which, despite its stupendous wealth and natural splendor, has blighted itself over the decades with misgovernance and misplaced priorities.

The same California that has been the seedbed of world-famous companies that make it possible for people to send widely viewed short missives of 280 characters or less, and share and like images of grumpy cats, isn’t doing so well at keeping the lights on.

The same California that has boldly committed to transitioning to 50 percent renewable energy by 2025 — and 100 percent renewable energy by 2045 — can’t manage its existing energy infrastructure.

The same California that has pushed its electricity rates to the highest in the contiguous United States through its mandates and regulations doesn’t provide continuous access to that overpriced electricity.

California governor Gavin Newsom, who has to try to evade responsibility for this debacle while presiding over it, blames “dog-eat-dog capitalism” for the state’s current crisis. It sounds like he’s referring to robber barons who have descended on the state to suck it dry of profits while burning it to the ground. But Newsom is talking about one of the most regulated industries in the state — namely California’s energy utilities, which answer to the state’s public utilities commission.

This is not exactly an Ayn Rand operation. The state could have, if it wanted, pushed the utilities to focus on the resilience and safety of its current infrastructure — implicated in some of the state’s most fearsome recent fires — as a top priority. Instead, the commission forced costly renewable-energy initiatives on the utilities. Who cares about something as mundane as properly maintained power lines if something as supposedly epically important — and politically fashionable — as saving the planet is at stake?

Meanwhile, California has had a decades-long aversion to properly clearing forests. The state’s leaders have long been in thrall to the belief that cutting down trees is somehow an offense against nature, even though thinning helps create healthier forests. Biomass has been allowed to build up, and it becomes the kindling for catastrophic fires.

As Chuck DeVore of the Texas Public Policy Foundation points out, a report of the Western Governors’ Association warned of this effect more than a decade ago, noting that “over time the fire-prone forests that were not thinned, burn in uncharacteristically destructive wildfires.”

In 2016, then-governor Jerry Brown actually vetoed a bill that had unanimously passed the state legislature to promote the clearing of trees dangerously close to power lines. Brown’s team says this legislation was no big deal, but one progressive watchdog called the bill “neither insignificant or small.”

On top of all this, more people live in remote areas susceptible to fires, in part because of the high cost of housing in more built-up areas.

There shouldn’t be any doubt that California, susceptible to drought through its history and whipped by fierce, dry winds this time of year, is always going to have a fire problem. But there also shouldn’t be any doubt that dealing with it this poorly is the result of a series of foolish, unrealistic policy choices.

California’s overriding goal should have been safe, cheap, and reliable power, a public good so basic that it’s easy to take for granted. The state’s focus on ideological fantasies has instead ensured it has none of the above.

The First Freedom Fades


By Ramesh Ponnuru
Thursday, October 24, 2019

A lot of Democrats were annoyed by a comment Beto O’Rourke made at a CNN forum on gay rights. Asked whether churches that “oppose same-sex marriage” should lose their tax-exempt status, the former congressman and current presidential-race asterisk said, “Yes. There can be no reward, no benefit, no tax break, for anyone, any institution, any organization in America, that denies the full human rights, that denies the full civil rights, of everyone in America. So as president, we’re going to make that a priority. And we are going to stop those who are infringing upon the human rights of our fellow Americans.”

Two other candidates for the Democratic nomination, Senator Elizabeth Warren and South Bend mayor Pete Buttigieg, rejected this idea. Legal analysts said that the Supreme Court’s current jurisprudence blocks governments from discriminating among churches on the basis of their doctrines. Some liberal commentators criticized O’Rourke for handing Republicans a talking point. Jordan Weissmann wrote in Slate, “He is turning himself into a walking straw man, the non-fringe guy Republicans can reliably point to when they want to say, ‘See, the libs really do want to take your guns and shut down your churches’” (emphasis in original).

O’Rourke himself backed down in part, with his campaign explaining that the tax exemption would be revoked only for institutions that fired or refused to hire people because they are married to someone of the same sex.

But O’Rourke’s comment did not come out of nowhere. During the high court’s consideration of whether the Constitution commands same-sex marriage, Justice Samuel Alito asked whether an affirmative answer would lead to an end to tax exemptions for opponents. The Obama administration’s lawyer arguing the case allowed that it might.

Over the last generation, progressives have become much more hostile to claims of religious liberty and conscience rights. In 1993, Congress passed the Religious Freedom Restoration Act nearly by acclamation. Under the law, people whose ability to exercise their religion was substantially burdened by a government policy could ask for exemptions. Judges would deny them only if applying the policy to them was the least restrictive means of advancing a compelling governmental interest. Now, however, Democrats generally want to narrow the 1993 law.

Liberal intellectuals have coalesced behind the view that conservatives are now abusing claims of religious liberty. They have advanced two principal arguments. First, they say that the law was originally meant to protect minorities such as Jehovah’s Witnesses and Quakers, not extremely large groups, such as Catholics, that had lost political struggles. Second, they say that religious freedom should be a “shield” and not a “sword”: not, that is, a way religious believers can inflict harms on other people. Letting religious employers refrain from offering their employees insurance plans that cover contraception, or some forms of contraception, is on this theory an abuse. So is letting religious vendors of wedding services — florists, bakers, photographers, wedding-hall owners — decline to be involved in same-sex unions.

Whatever else may be said for them, both arguments are departures from a long American tradition of religious pluralism. Religious exemptions have historically been granted for large religious groups: The Volstead Act implementing Prohibition exempted the sacramental use of alcohol, such as in Catholic Masses. Religious objections to wars have long been honored, too, even though in practice they have meant that other people have had to risk their lives in the dissenters’ place, which is at least as large a harm as having to find a different photographer.

Nevertheless, the American Civil Liberties Union has switched its position on the religious-freedom law and so have many Democrats. Many of their presidential candidates, including Senator Warren and Mayor Buttigieg as well as O’Rourke, have endorsed the Equality Act, which forbids discrimination on the basis of sexual orientation or transgender status and specifically denies religious believers the ability to ask for an exemption; it would be the first law since 1993 to exclude itself explicitly from the religious-freedom law’s protections. Many of the candidates have also endorsed the Do No Harm Act, which would weaken that law by blocking any religious exemptions that would cause “meaningful harm, including dignitary harm,” to anyone.

Around the same time that O’Rourke made his provocative comment, Attorney General Bill Barr was causing a different kind of religio-political controversy. In a speech at the University of Notre Dame, he argued that the Founders thought religious belief was crucial to the health of a free society, that the rise of secularism was causing baleful social consequences, including an increased number of suicides and fatal drug overdoses, and that secularist intolerance was a threat to religious freedom.

Barr painted with a broad brush. Critics could justly have pointed out that his comment that “the Founding generation were Christians” is an overstatement, or noted that some social trends, including a long-running reduction in violent crime, tell against his gloom, or challenged him to be more precise in defining “secularism.”

Instead they went nuts. Barr described the entertainment industry and academia as engaged in “an unremitting assault on religion and traditional values.” Chicago Tribune columnist Steve Chapman said that the line “comes close to denying the rights of nonbelievers to express their disbelief.” That’s true only if Chapman’s criticism of Barr comes close to denying religious people their right to free speech, which is to say it’s false. Liberal legal commentator Dahlia Lithwick went beyond Chapman, taking Barr to have engaged in coded anti-Semitism.

Catherine Rampell wrote in the Washington Post that the speech was “terrifying” and that Barr had come out for a “state establishment of religion.” Jeffrey Toobin, with customary restraint, told readers of The New Yorker that Barr had given “the worst speech by an Attorney General of the United States in modern history.”

Toobin took aim at Barr’s claim that the Framers had thought free government was suitable only for a religious people. False, said Toobin: They thought it was suitable for nonbelievers, too. Toobin cited Justice Hugo Black, who wrote in 1961 that the government could not compel belief or disbelief, or attempt to “aid all religions as against nonbelievers.” Here Toobin has built a Jenga tower of non sequiturs. Barr didn’t say that government should attempt to compel religious belief or that the Constitution allows it; he didn’t contradict Black. Toobin himself didn’t actually contradict the proposition Barr attributed to the Framers: Whether a free society can include nonbelievers is a different question from whether it can be dominated by them.

As it happens, the relationship between an individual’s belief and society is the most important topic Barr got wrong. He said that “Christianity teaches a micro-morality” while the “new secular religion teaches macro-morality.” The former focuses on “our own personal morality and transformation,” the latter on “commitment to political causes and collective action to address social problems.”

This distinction does not hold up. The left-wing mindset Barr has in mind can be extremely prescriptive about individuals’ use of language, consumption choices, and associations. And Christianity asks individuals to participate in fighting social evils. Christians involved in the abolitionist, civil-rights, and pro-life movements were all told that they should pray quietly at home and church, and all rejected that misunderstanding of their faith.

This was a false step in Barr’s argument. He does not really think of Christianity in this overly individualistic way, which is why his speech also defended “laws that reflect traditional moral norms” against abortion and euthanasia. The aggregated effects of individuals’ religious and moral beliefs were more or less his theme.

Which brings us back to O’Rourke. It is not quite right to say that the Supreme Court requires the government to be neutral among religious institutions. It has allowed the revocation of a tax exemption for a university that banned interracial dating on religious grounds. (That’s why Justice Alito asked his question.) If the public comes not only to favor same-sex marriage but to regard opposition to it as akin to racial bigotry, then religious freedom will not long survive for the holdouts. In this discrete matter, Barr is undoubtedly correct to posit that traditional religious belief is a crucial protection for freedom.

He is likely correct in a broader way as well. Our sense that religious freedom is worth protecting at all is based on the understanding that being in the right relation with God, if God exists, is important. Without that understanding, there is no need to specify a freedom to go to synagogue rather than to affirm a freedom to enter and exit buildings generally. A society that has broadly lost this sense of the divine will not grasp the case for religious freedom and will greet its advocates with incomprehension — as O’Rourke, and Barr’s critics, are illustrating.

Monday, October 28, 2019

Jihadist Terrorist Abu Bakr al-Baghdadi Is Dead


By Jim Geraghty
Monday, October 28, 2019

The world is a safer place today than it was just a few days ago. On Saturday night, U.S. special forces killed ISIS leader Abu Bakr al-Baghdadi during a night raid in the northwestern Syrian province of Idlib. Take a moment to appreciate not only this spectacular mission, but how severe the threat of Baghdadi and ISIS was, and how our military and our allies managed to shut down a kingdom of horrors and smash an army of cruelty.

Back in March 2015, Graeme Wood wrote a lengthy article in The Atlantic that, at the time, was one of the most detailed and extensively researched portraits of ISIS, what fueled its rise, what attracted its members, and what its leadership wanted. A key part of Wood’s profile was laying out how this particular group of Islamists differed from the group of Islamists Americans were already most familiar with, al-Qaeda. After the U.S. Navy SEALs took out Osama bin Laden in 2011, al-Qaeda gradually faded from the list of prominent worries of the average American. The last major al-Qaeda attack on Western targets was the Charlie Hebdo shooting on January 7, 2015.

One of the surprising conclusions from Wood — and perhaps one that other terrorism experts might dispute — was that ISIS was less focused on the West than al-Qaeda was.

. . . its threat to the United States is smaller than its all too frequent conflation with al-Qaeda would suggest. Al-Qaeda’s core is rare among jihadist groups for its focus on the “far enemy” (the West); most jihadist groups’ main concerns lie closer to home. That’s especially true of the Islamic State, precisely because of its ideology. It sees enemies everywhere around it, and while its leadership wishes ill on the United States, the application of Sharia in the caliphate and the expansion to contiguous lands are paramount. Baghdadi has said as much directly: in November he told his Saudi agents to “deal with the rafida [Shia] first . . . then al-Sulul [Sunni supporters of the Saudi monarchy] . . . before the crusaders and their bases.”

Nonetheless, ISIS repeatedly demonstrated an ability to inspire jihadist-minded Muslims to attempt deadly attacks wherever they lived, and this inspiration created a pervasive threat to civilian targets around the world. The list of targets is stunning, even after living through it: the Canadian parliament, the train from Paris to Amsterdam, the Bataclan theater, San Bernardino, a Starbucks in Jakarta, a tourist intersection in Istanbul, the Brussels metro and airport, Ataturk International Airport in Istanbul, Bastille Day in Nice, France, the Pulse nightclub in Orlando, a church in Normandy . . . ISIS never launched any attack as deadly as the 9/11 attacks, but it set its sights lower and was arguably more effective: it created a sense that it could hit anywhere, not just prominent landmarks. (Many would argue this approach to terrorism inspires even more fear. You can choose to avoid airplanes or the tallest skyscrapers and government buildings; it’s much more difficult to avoid any public space or public transportation.)

Unlike al-Qaeda, ISIS could point to a territory and a spectacularly cruel and brutal government, expanding its territory and conquering new peoples. ISIS argued it was the fulfillment of an ancient promise to Muslims, and that history and the divine were on its side. It represented a threat unlike any other in American history: a hostile state that was comparatively primitive technologically but repeatedly demonstrated an ability to kill our civilians in unpredictable ways, often by turning our own legal immigrants and citizens against us. (In a reflection of how our political divisions were starting to consume us, a significant portion of the public refused to believe that an ISIS attack was an ISIS attack, insisting it simply had to be driven primarily by homophobia.)

Wood wrote:

If [ISIS] loses its grip on its territory in Syria and Iraq, it will cease to be a caliphate. Caliphates cannot exist as underground movements, because territorial authority is a requirement: take away its command of territory, and all those oaths of allegiance are no longer binding. Former pledges could of course continue to attack the West and behead their enemies, as freelancers. But the propaganda value of the caliphate would disappear, and with it the supposed religious duty to immigrate and serve it. If the United States were to invade, the Islamic State’s obsession with battle at Dabiq suggests that it might send vast resources there, as if in a conventional battle. If the state musters at Dabiq in full force, only to be routed, it might never recover.

The United States did not invade but put together a coalition of most of our NATO allies, Jordan, Morocco, Turkey (although there’s a lot to unpack there) — and perhaps most importantly, the Iraqi Army and the Syrian Democratic Forces who had to do most of the fighting on the ground. (On paper, Russia, Iran, Iraq, and the Syrian government formed their own coalition against ISIS, but somehow their bombs kept landing on rebels fighting against Assad’s regime.)

ISIS isn’t dead and gone, but it’s a shadow of its former self.  Jacob Olidort, special adviser on Middle East policy and Syria country director at the Defense Department in 2016 and 2017, wrote earlier this year that the president and his critics were talking past each other, that while ISIS will have members and followers for a long time to come, it no longer functions as a coherent organization:

New fissures within the group have opened over the past two years, with grievances ranging from issues of authenticity and ideological purity to organizational and bureaucratic failures. The Islamic State’s ideologues have acknowledged its changed circumstances and offered explanations for the defeats and loss of territory since the fall of Mosul. But these defenses haven’t been persuasive for some of the organization’s adepts, who have begun questioning why the Islamic State is experiencing a decline.

Similarly, al-Qaeda isn’t dead and gone, but it’s a shadow of its former self as well. Ayman al-Zawahiri called for new attacks against Americans last month around the anniversary of 9/11. If any al-Qaeda adherents tried, we didn’t notice, and we live in a world where the tools of terror are not difficult to find: vans and steak knives and propane tanks. (By the way, if you ever worry that you’re not aging well, take a look at Zawahiri. Lately he looks older than Si Robertson from Duck Dynasty.) These days Zawahiri is complaining about “backtrackers” not being sufficiently committed to jihad. This is the terrorist equivalent of becoming a grumpy old man.

Depending upon how you want to define the term “major,” the last major jihadist terrorist attack on American soil was the Pulse nightclub shooting in Orlando, June 12, 2016. (Others might point to the Minnesota mall stabbing attacks in September of that year, and the concurrent bombings in New York City and New Jersey that thankfully had no fatalities; that November a Somali refugee tried to run down people on the campus of Ohio State University, injuring 13, but again, thankfully no fatalities.)

The good news — maybe some of the best news for America in a long time — is that the fear of jihadist terrorism on American soil has gradually faded from our cultural landscape and collective consciousness. We no longer feel terrorized by them, and that is the ultimate failure for any terrorist movement.

The bad news is that mass shooters and domestic terrorists appear eager to fill the void.

Revenge of the Public Option


By Robert VerBruggen
Monday, October 28, 2019

A lot of the Democratic presidential candidates are rushing as far to the left on health care as they can. But some in the “center lane” are advocating what they claim is a more moderate approach: finishing what President Obama started and creating a “public option,” meaning an insurance policy provided by the government that purportedly “competes” with private plans. Obama and some of his allies pushed hard to include this in the original Obamacare law, but they got only 58 of the 60 votes they needed in the Senate.

The next time they have a Senate majority, the Democrats will likely try to pass it through a process that requires just 51 votes. This is a lot less alienating to moderate Democrats than Medicare for All ever will be. So even if one of the lefty candidates becomes our next president and enjoys a sympathetic Congress, the public option might prove to be the most politically plausible big-ticket reform.

Let’s take a trip down memory lane and re-explain the problems with this approach. This is a special treat for me, since I first joined National Review in 2008 and spent a lot of time editing articles about Obamacare over the next few years, a period in which we ran plenty of pieces about the public option.

At first glance, the concept may seem harmless enough. Conservatives, after all, are always pointing out how inefficient the government is. One imagines that if a government-run insurer competed on equal terms with private businesses, it wouldn’t be able to keep up. Think the DMV, except facing private competition. But this ignores the economics of the health-care market and the might of the government within it.

More than a quarter of the country’s health-care spending is already covered by the federal government, largely through programs that directly insure patients. And since the government has such powerful control over what will be paid for millions of people’s health care, it can simply underpay providers on a take-it-or-leave-it basis. As the health-care expert Robert Laszewski recently noted, “Medicare pays close to half the price commercial insurers pay hospitals and pays about 20% less than commercial insurance pays doctors.” A study commissioned by the hospital lobby, for what it’s worth, estimates losses in the tens of billions of dollars each year from treating patients covered by Medicare and Medicaid.

This means that people on private insurance “cross-subsidize” people on government plans: To some (much-debated) degree, providers charge the former more so they can afford to take Medicare and Medicaid rates to treat the latter. Some doctors also limit the number of government-insured patients they’ll see.

That’s one of the biggest problems with giving Medicare to all: It drastically cuts payments by eliminating these cross-subsidies. It is, in effect, a huge bet that health-care providers will simply keep operating with tons less revenue. I am sympathetic to the idea that many hospitals charge high rates just because they have market power in their local areas and can get away with it, but that doesn’t mean all hospitals, especially the struggling rural ones, could weather rate cuts so severe.

And the same problem rears its head when you provide a public option “to all who want it.” The only way this product will be attractive to patients is if it forces providers to accept low rates to hold down premiums. That’s why public-option proposals typically require all providers who take Medicare to accept the public option too. (Back in 2009, various estimates held that public-option premiums would be 10 to 30 percent lower than private ones.) And if it goes that route, the public option hammers providers and does not compete with private plans on equal footing.

As Laszewski puts it: “How would you like to run a business and have the government show up with a competing product and use its unilateral power to pay the suppliers you both need half as much as you do?”

Another major problem with a public option is that, unlike a private insurer, it’s unlikely to go bankrupt and disappear if it loses money. As James C. Capretta has written, “there’s no particular reason why a publicly run product couldn’t experience ongoing losses, so long as the law provided for direct or indirect taxpayer subsidization. The Medicare program itself is funded heavily by taxpayer subsidies.”

On the other hand, if a public option really catches on — and the provider lobbies aren’t able to kill its growth — it leads to the single-payer system that moderates claim they don’t want. Not only does it normalize the idea of government health insurance for the masses, but as more people join the public option, providers have to charge the privately insured even higher rates, driving up their premiums and increasing their incentive to join the state-run plan.

This has been an open secret for years and played a role in the original debate over Obamacare. As Rich Lowry recounted in 2009: “A single-payer activist confronted liberal lion Barney Frank with a camera, demanding to know why he didn’t support single-payer. Frank shot back that he favors such a system, only he realizes Obamacare’s public option is the best way to get from here to there.” The same year, The American Prospect ran a piece explaining how the concept bridged what was politically feasible in Congress with the energy on the activist left, which “could live with the public option as a kind of stealth single-payer.”

In 2009 and 2010, there were enough truly moderate Democrats to keep the public option from becoming law. The next time Democrats take over, the dam might break, and then we’ll really be in for a ride.

The Deficit Is a Popularity Problem


National Review Online
Monday, October 28, 2019

It does not take great leadership — or great skill at deal-making — to do things that already are popular. There is nothing easier than giving people what they want when it does not cost you anything. That is one of the basic problems of American politics.

The news that the budget deficit has returned to a point just a hair shy of the trillion-dollar mark is dispiriting. The Trump administration is rightly proud of its economic record of modest but steady growth accompanied by strong employment and very good growth in wages. But if we cannot get government spending under control during the good times, what hope do we have for the more challenging times? And there will be more challenging times.

Congressional Republicans did make some real progress on spending controls during the Obama years, but it is very difficult to resist revenue-hungry special interests — especially when those interest groups represent big blocs of voters.

And budget reform without presidential leadership is more difficult still. The major drivers of federal spending are Social Security, Medicare, Medicaid (along with other health-care subsidies), and national security. President Trump has ruled out Social Security and Medicare reform out of hand. These are very popular entitlements, and particularly popular among some sensitive Republican constituencies. Likewise, military spending is very popular among Republicans, and some conservatives argue, not without reason, that we are not spending enough on the armed services.

We are all for negotiating an extra nickel off every case of pencils the federal bureaucracies order, but the U.S. government is not going to be able to put its fiscal situation on solid footing without addressing the major drivers of spending — meaning entitlement reform. Even if Republicans were willing to countenance the radical tax increases put forward by some leading Democrats, these almost certainly would prove insufficient to cover spending if it continues on its current trajectory. We would need to roughly double federal taxes to make that happen.

Some cynics say that there isn’t any political juice in all this green-eyeshades business, that nobody really cares about the deficit. That is not true, but even if it were true, nobody cared about subprime mortgages, either — until they had to. Washington’s spendthrift ways are setting the U.S. government up for an eventual fiscal crisis, at which point the options for reform will be fewer — and much more painful — than the ones currently before us.

In 2007, the federal deficit was 1.1 percent of GDP. In fiscal year 2019, it was more than four times that. In GDP terms, the deficit has grown larger every year of the Trump administration — and given Republican control of Congress for the first two years of that administration, this is not something that can be blamed exclusively on the Democrats.

While presidential leadership does matter, this ultimately is a congressional issue — it is Congress that authorizes spending and Congress that writes the tax code. And Congress is going to have to act on these and other related issues at some point. The Kyiv circus will pack away its tents, someday — and what will Congress be able to say it has done for the good of the country?

It is time for Washington to sober up and buckle down. Do it before the next $1 trillion in new debt has gone out the door.

Sunday, October 27, 2019

The ‘Global Citizen’ Fraud

By Bruce Bawer
Sunday, October 27, 2019

On September 24, Donald Trump told the United Nations General Assembly that “the future does not belong to the globalists. The future belongs to the patriots.” Four days later, as if in a rebuke to his assertion, the Great Lawn in New York’s Central Park was the site of the “Global Citizen Festival.” This event brought together “top artists, world leaders, and everyday activists to take action” (in the words of its website) and offered free tickets to “Global Citizens who take a series of actions to create lasting change around the world.” Those “actions” included writing tweets and signing petitions affirming their dedication to “changing the world.”

Featuring such entertainers as Alicia Keys and Hugh Jackman, the Global Citizen Festival was organized by a group called Global Citizen in partnership with firms such as Johnson & Johnson, Procter & Gamble, and Cisco. Rarely have so many heavyweight corporations described their activities in such benign language: Verizon stated on the event’s website that “we focus our business and resources to uplift people and protect the planet.” Who knew?

Covering the festival live, MSNBC hosts kept insisting—between interviews with Democratic politicians and recitation of DNC talking points—that it was “not about politics.” Hurricane Sandy, Central American drought, and the fall of Venezuela, we were informed, were all caused by climate change. A Mexican official announced her country’s new “feminist foreign policy.” The head of some activist group took credit for the decline in U.S. poverty. Politicians from Norway, Barbados, and elsewhere waved their globalist credentials, while America’s withdrawal from the Paris accords was cited as a sin against globalism and thus against humanity itself.

At the heart of the whole event were the repeated reassurances by those onstage that everybody present was a “global citizen”—and that this was something for which they deserved endless congratulation. Gesturing at the folks lolling around on the sunny Great Lawn, one reporter enthused over the magnificent “commitment” they were making. Representative Adriano Espaillat (D-N.Y.), calling the audience members a “powerful” image of “global citizenship,” was asked what, exactly, they could do to change the world. Glancing back at them lying on the grass, he enthused: “They’re doing it now!” To quote one MSNBC talking head: “Tonight is about community, connection—the world coming together!”

Welcome to the vapid but dangerous new world of global citizenship. I was introduced to it a decade ago while walking in Amsterdam. A rally was taking place on the Dam, the large cobbled square in front of the Dutch Royal Palace. As I approached, some signs and banners came into view. A person cannot be illegal! read one. There is no such thing as an illegal person! read another. (They were in Dutch, with misspellings.) There were many other signs, communicating the message that the term “illegal alien” should be replaced by “undocumented aliens” and that people should be allowed to live wherever they wished.

I knew some basic statistics. I knew how wonderful the Netherlands was, how small it was, and how crowded it was already with its population of 15 or so million. I also knew how many people were out there, in the not-so-wonderful world beyond the West. India, Indonesia, Brazil, Pakistan, Nigeria, Bangladesh, Ethiopia, and the Philippines: Each of these countries had a population (a fast-growing one, at that) in excess of 100 million, a large percentage of whom would doubtless be thrilled to relocate to this tiny kingdom.

The pronouncement that “a person cannot be illegal” made no sense. What else could be said of a citizen from one country living unlawfully in another? Little did I realize that within a few years, such thinking would be mainstream. Little did I realize that in the view of many Americans, “undocumented persons” would not only deserve all the rights of American citizens but would actually deserve special treatment in matters as significant as health care, schooling, and housing—to which they would be considered entitled without being subject to any of the obligations actual citizens of the United States are required to perform.

In the past decade, the very concept of citizenship has become not only passé but déclassé. We should all be global citizens.

***

It’s not a new concept. The first person to call himself a “citizen of the world” was Diogenes, the founder of Cynicism, who lived in the fourth century B.C.E. and has been cited in support of the idea ever since. He made this pronouncement, however, only after being stripped of his citizenship in his native city of Sinope and moving, in disgrace, to Athens. In ancient Greece, citizenship was deeply prized. It was inextricable from the idea of civilization. Never before had individuals been afforded the protection of an identity beyond that of family or tribe. The Romans borrowed citizenship from the Greeks and made it something of absolute value. To be a Roman citizen conferred protection and prestige throughout the ancient world. Citizenship meant order. It meant, at a bare minimum, a degree of respect and rights and security that was without parallel in the world of the day.

Ironically enough, the contemporary enthusiasm for global citizenship has its roots in the historical moment that marked the triumph of modern national identity and pride—namely, the World War II victory of free countries (plus the Soviet Union) over their unfree enemies. Citizens of small, conquered nations resisted oppression and, in many cases, gave their lives out of sheer patriotism and love of liberty. As Allied tanks rolled into one liberated town after another, people waved flags that had been hidden away during the occupation. Germany and Japan had sought to create empires that erased national borders and turned free citizens into subjects of tyranny; brave patriots destroyed that dream and restored their homelands’ sovereignty and freedom. And yet a major consequence of this victory was the establishment of an organization, the United Nations. Its founding rhetoric, like that of Nazi Germany and Imperial Japan, was all about the erasure of borders, even as it hoisted its own baby-blue flag alongside those of its members.

On December 10, 1948, the UN adopted the Universal Declaration of Human Rights. The rights it enumerates emanate from the DNA of modern Western nation-states; they can be traced to Magna Carta and were articulated in the U.S. Declaration of Independence and Bill of Rights. But the UN Declaration departs from its British and American antecedents in significant ways. While affirming freedom of speech and due process, noted E. Jeffrey Ludwig in an article posted at the American Thinker, it “point[ed] the way towards intervention by the UN in the daily lives of people” by, for example, “assert[ing] the right to food, clothing, medical care, social services, unemployment and disability benefits, child care, and free education,” plus more abstract rights, such as the “right freely to participate in the cultural life of the community…and to enjoy the arts.”

The chief force behind the Declaration was Eleanor Roosevelt, the chair of the UN’s Human Rights Commission. In a 1945 newspaper column, she had had some interesting things to say about patriotism and what we would now call globalism. “Willy-nilly,” she wrote, “everyone [sic] of us cares more for his own country than for any other. That is human nature. We love the bit of land where we have grown to maturity and known the joys and sorrows of life. The time has come however when we must recognize that our mutual [sic] devotion to our own land must never blind us to the good of all lands and of all peoples.”

“Willy-nilly”? “Bit of land”? Didn’t America deserve better than that from its longtime first lady? Didn’t America’s armed forces, who had fought valiantly for their own “bit of land”? One part of Mrs. Roosevelt’s column was ambiguous. When she referred to “the good of all lands and of all peoples,” did she mean that Americans should care about what’s best for other peoples? Or was she saying that all lands and peoples are good? She couldn’t possibly be saying that, could she? Hadn’t the Holocaust just proven otherwise? It’s striking to recognize that Mrs. Roosevelt wrote this only months after the bloody end of the crusade to restore freedom to Western Europe—and at a time when our erstwhile ally Joseph Stalin’s actions in Eastern Europe were underscoring precisely how evil our fellow man could be, and just how precious a gift to the world the United States was.

Although the Universal Declaration passed in the General Assembly, 48–0, eight nations—the USSR, Byelorussia, Czechoslovakia, Poland, Ukraine, Yugoslavia, South Africa, and Saudi Arabia—abstained. This rendered the document essentially pointless, a statement of Western values that much of the Soviet bloc and one of the most powerful countries in the Arab world rejected. Other Muslim nations signed on, and their insincerity in doing so was later reflected in the Organization of Islamic Cooperation’s 1990 Cairo Declaration on Human Rights in Islam, which defines human rights in a way that is founded entirely on sharia law and is utterly at odds with Western values.

Another would-be global citizen was Wendell Willkie, who had challenged FDR for the presidency in 1940. In 1943, Willkie published One World, an account of a round-the-world trip he had made and a plea for the nations of that world to accept a single international order. Willkie wanted more than just a UN: He wanted world government, based on the Atlantic Charter. It is said that his book was the biggest non-fiction bestseller in history up to that time, inspiring an international One World movement to which both Albert Einstein and Mahatma Gandhi belonged. Like Eleanor Roosevelt, Willkie was determined to build a new world founded on specifically American notions of rights and freedoms. Like Mrs. Roosevelt, too, he was convinced that postwar feelings of goodwill toward the U.S. by other governments would lead them to embrace those notions. On his world trip, wrote Willkie, he had discovered that foreigners knew that America had no desire for conquest, and that the U.S. therefore enjoyed their respect and trust—a respect and trust, he argued, that America must use “to unify the peoples of the earth in the human quest for freedom and justice.”

Needless to say, the world didn’t end up with Willkie’s One World. But it got the UN—where, from the outset, there was more talk of peace than of freedom and where the differences between the West and the Soviet bloc were routinely glossed over in order to present a façade of international comity. Behind the Iron Curtain, captive peoples weren’t citizens, global or otherwise, but prisoners. Yet in the West, the UN’s language of what we now call global citizenship started to take hold, and the UN began to be an object of widespread, although hardly universal, veneration. In reality, the UN may be a massive and inert bureaucratic kleptocracy yoked to a debating society, most of whose member states are unfree or partly free; but people in the free world who grow starry-eyed at the thought of global citizenship view it as somehow magically exceeding, in moral terms, the sum of its parts.

***

You can’t discuss the UN and global citizenship without mentioning Maurice Strong. “A very odd thing happened last weekend,” wrote Christopher Booker in the Telegraph in December 2015. “The death was announced of the man who, in the past 40 years, has arguably been more influential on global politics than any other single individual. Yet the world scarcely noticed.” What Strong, an extremely rich Canadian businessman, did—almost single-handedly—was to create, out of the blue, the global-warming panic that is now a cornerstone of left-wing ideology. Although he never was secretary-general of the UN, Strong wielded massive power within that organization and innumerable other international bodies, serving, for instance, as a director of the World Economic Forum and as a senior adviser to the president of the World Bank. He also played pivotal roles in a long list of programs and commissions that were nominally dedicated to the environment—among them the UN Environment Programme and World Resources Institute, the Earth Charter Commission, and the UN’s World Commission on Environment and Development.

But although he was nicknamed “Godfather of Global Warming,” Strong didn’t really care about climate. His real objective was to transform the UN into a world government—a permanent, unelected politburo composed of elders such as himself. At first, indeed, climate played no role in his plans. To fund the all-powerful UN of his dreams, in 1995 he proposed a 0.5 percent tax on every financial transaction on earth—a scheme that would have netted $1.5 trillion annually, roughly the size of the entire U.S. federal budget at the time. When the Security Council vetoed this move, Strong tried to eliminate the Security Council. The failure of such stratagems led Strong to focus increasingly on climate. By promoting the idea that the planet was in existential peril, he was able to argue that a looming disaster on the scale he predicted could be solved only by vesting in the UN an unprecedented degree of authority over the lives of absolutely everyone on earth.

To this end, Strong concocted Agenda 21. Formulated at the 1992 UN Earth Summit (or Rio Conference), of which he served as secretary-general, Agenda 21 proposed a transfer of power from nation-states to the UN. “It is simply not feasible for sovereignty to be exercised unilaterally by individual nation states,” Strong explained. “The global community must be assured of global environmental security.” What kind of regime did Strong wish to establish? Suffice it to say that he disdained the U.S. but admired Communist China, where he maintained a flat—to which, incidentally, he relocated after being implicated in the UN “oil for food” scandal in 2005. Another one of the many financial scandals in which he was implicated (but for which he repeatedly managed to get himself off the hook) involved funneling massive sums to North Korea, of whose regime he was also fond.

Strong was the spiritual father of all those global citizens who today fly thousands of miles in private jets to swanky conferences at which they give speeches chiding their inferiors for not recycling. One such personage is Al Gore, whose house is known to have one of the largest carbon footprints in Tennessee. Another is Nicholas Kristof, the New York Times columnist who promotes an initiative, Global Citizen Year, which seeks to “engag[e] young Americans in global issues.” With his wife, Sheryl WuDunn, Kristof wrote the 2014 book A Path Appears, described by its publisher as “a roadmap to becoming a conscientious global citizen.” Kristof has argued that Americans should contribute to foreign rather than domestic causes, because “an aid group abroad can save more lives more cheaply than an organization in the United States, and generally can do more good with less money.” Never mind the ample proof that foreign aid more often than not does more harm than good—encouraging dependency, fostering resentment, crushing initiative, lining the pockets of dictators and their cronies, and preventing poor countries from developing healthy economies.

After the UN came the European Union. As a free-trade zone gradually morphed into a would-be superstate, the EU’s supposed raison d’être was that nationalism had almost destroyed Europe in World War II. But this was wrong. Europe had been torn apart because of two totalitarian ideologies, one based on racial identity and the other on a utopian universalist vision. Communism’s end goal was, indeed, nothing more or less than a kind of global citizenship under which everyone except for a handful of elites would be equally controlled, spied on, and oppressed.

The global-citizenship mentality ramped up with the 1960s. No one expressed it more memorably than John Lennon in “Imagine,” a 1971 song whose influence has been immeasurable.

“Imagine there’s no countries,” Lennon wrote, going on to imply that without countries there would be “nothing to kill or die for,” so that “all the people” on earth would be “living life in peace” and, indeed, the world would “be as one.” The song, which to this day remains a ubiquitous protest anthem, has led millions of starry-eyed idealists to equate nationhood with war and patriotism with killing and to believe that a borderless planet would be a peaceful one. The song has also helped spread the view that simply imagining a perfect world is equivalent to, or even better than, doing the hard work of creating a better, if still imperfect, world—hence the inane comments at this year’s Global Citizen Festival to the effect that the attendees, just by being there, were actually accomplishing something.

***

The concept of global citizenship now pervades our politics. During her 2016 campaign, Hillary Clinton envisioned a Western hemisphere, and ultimately a world, without borders. Barack Obama, in reply to a question about American exceptionalism, said that, yes, he saw America as exceptional, but that people in other countries, too, saw their countries as exceptional. The last sentence of his Nobel Peace Prize citation contained the word “global” not once but twice: “The Committee endorses Obama’s appeal that ‘Now is the time for all of us to take our share of responsibility for a global response to global challenges.’” What U.S. president had ever been more global? A Kenyan father, an Indonesian boyhood: his bestselling autobiography conveyed his affection for both of those countries; it was the U.S. about which his feelings were ambivalent.

The concept of global citizenship also dominates our popular culture. In a 2018 book, Hollywood Heyday, David Fantle and Tom Johnson write about attending a 1981 church service with film director Frank Capra, then 83. To honor the recently released Tehran hostages, the recessional hymn was “America (My Country ’Tis of Thee).” All four verses, three of them obscure, were sung. Congregants were handed lyric sheets. Capra didn’t give his sheet so much as a glance. He knew every word of every verse by heart, and sang with emotion. What member of today’s Hollywood elite could do that? More typical of the attitude of movie people nowadays was a remark made during an onstage interview at the 2016 PEN World Voices Festival by screenwriter Richard Price. Asked about American identity, he replied: “I always feel like I live in the country of New York.” The interviewer replied: “Whenever I’m traveling and people ask if I’m American, I say I’m a New Yorker.” Price replied: “I always say I’m Canadian because I don’t know who I’m talking to.”

One of the conceits of American popular culture is the idea that the human race would come together in a trice—the ultimate pipe dream of global citizens—if confronted by a common enemy. In Independence Day (1996), the world responds as one to an attack by space aliens and the U.S. president gives a pep talk to American participants in the common defense:

In less than an hour, aircraft from here will join others from around the world. And you will be launching the largest aerial battle in the history of mankind.

Mankind. That word should have new meaning for all of us today. We can’t be consumed by our petty differences any more….Perhaps it’s fate that today is the Fourth of July….Should we win the day, the Fourth of July will no longer be known as an American holiday, but as the day when the world declared in one voice, “We will not go quietly into the night!”

In Independence Day, as is almost invariably the case in such films, international cooperation is premised on American values—just like the founding of the UN. Routinely, people call themselves global citizens without recognizing in the slightest the extent to which their sense of the global is rooted in uniquely American ways of thinking.

Global citizenship is also big at America’s most prestigious colleges. “Global engagement” is a featured category on the main page of the Brown University website. Type in dartmouth.edu and you’ll find the category “Global” alongside “Admissions,” “Schools,” “Centers,” “Arts,” and “Athletics.” On the main page of Columbia University’s site, “Global” is right up there with “Libraries,” “Arts,” and “Athletics.” On Duke’s main page, the categories are “Admissions,” “Academics,” “Research,” “Arts,” “Schools & Institutes,” and—yes—“Global.” The same is true of the websites of any number of other major U.S. colleges.

What do you get when you click on “Global” on these sites? Well, at Columbia’s site you’ll encounter a comment by its president, Lee C. Bollinger: “We all need to be explorers again, rediscovering what the world is like and what it means to think globally.” (Recall that Bollinger’s own most prominent contributions to “thinking globally” were his speaking invitations, in 2007, to Iran’s Mahmoud Ahmadinejad and, earlier this year, to Mahathir Mohamad of Malaysia—both virulent Jew-haters.) Bollinger’s bemusing rhetoric typifies the way in which these institutions discuss global citizenship. When I checked the Yale website recently, front and center on its main page was the statement that Yale “engages with people and institutions across the globe in the quest to promote cultural understanding, improve the human condition, delve deeper into the secrets of the universe, and train the next generation of world leaders.” Oh, is that all? The site quotes Yale “partner” Vincent Biruta, Rwanda’s minister for the environment: “Partnerships like the ones we have forged today are especially critical when addressing complex global challenges.”

The words complex and challenges get a real workout on these sites. The pitch for Columbia’s M.A. in Global Thought calls it “an interdisciplinary academic course of study that challenges students to explore new concepts and categories intended to encompass and explain the complexities of our interconnected and changing world.” M.A. students will come to understand “global thinking as a process rather than a product” and be supported “in their development of insights about the changing world.” One course, “Global Governance Regimes,” “explores the challenges of thinking about and effectuating governance in a global era.” Globalization, you see, “poses new challenges for thinking about the concept of governance.” Meanwhile, on Dartmouth’s site, you can read an item entitled “How Can Students Be Good Global Citizens?” The Dartmouth campus, we learn, features the “Global Village,” a “residential community” that “holistically equips students to thrive as ethical, engaged, and responsible world citizens and scholars” and enables them “to explore complex international issues” and engage in “focused reflection.”

Decades ago, American curricula included a subject called “civics.” Students learned about responsible citizenship—understanding how government worked, knowing one’s constitutional rights, following current affairs, and voting intelligently in elections. Describing these courses was not problematic; students weren’t “invited” or “challenged” to “figure out” what citizenship means. They were told. They were given specifics. They experienced something known as education. Alas, those civics courses have long since disappeared. The contemplation of global citizenship has filled that vacuum. Its apparent purpose is to undo any sense of responsible citizenship that a young person might have acquired and to replace it with a higher loyalty.

I began this article by mentioning the Global Citizen Festival. One of its two co-founders is Hugh Evans, described on his Wikipedia page as “an Australian humanitarian.” He gave a TED talk in 2016 titled “What does it mean to be a citizen of the world?” Evans praised this “growing movement” of “global citizens” who identify “first and foremost not as a member of a state, a tribe, or a nation, but as a member of the human race.” Saying that “the world’s future depends on global citizens,” Evans maintained that if we were all global citizens, we “could solve every major problem in the world,” because those problems are all “global issues” and can therefore “only be solved by global citizens.”

How did Evans become a global citizen? It happened, he recounts, during a brief stay in a Philippines slum whose residents wore rags and slept on garbage heaps. Why, he wondered, was his life so much better than theirs? The answer he came up with was this: Their poverty was the result of colonialism. International economics, he concluded, is a zero-sum game: If some countries are rich, it’s because they’ve exploited countries that are poor. Granted, this belief hasn’t led Evans to give up his wealth. But he’s certainly made a great show of guilt about it. It’s barely an exaggeration to say that he makes a career out of traveling from place to place, standing at lecterns and expressing solidarity with people who sleep on rubbish heaps. Note, however, that you’re not likely to hear those slum dwellers describing themselves as global citizens. They’re tied by poverty to the places where they were born.

One wonders: Would any Brit who went through the Blitz ever have called himself a “global citizen”? Would any American whose father died in a Nazi POW camp ever have called himself a “global citizen”? I doubt it. Global citizenship is a luxury of those who’ve reaped rewards earned by the blood of patriots. Global citizens pretend to possess, or sincerely think they possess, a loyalty that transcends borders. It sounds pretty. But it’s not. By the same token, to some ears a straightforward declaration of patriotism can sound exclusionary, bigoted, racist. It isn’t. To assert a national identity is to make a moral statement and to take on a responsibility. To call yourself a global citizen is to do the equivalent of wearing a peace button—you’re making a meaningless statement because you think it makes you look virtuous.

Think of love. To say that you care first and foremost about your own family doesn’t mean that you hate other families; it’s merely a question of being honest about something that, in the real world, entails commitment and sacrifice. In matters of loyalty, as in matters of love, there are hierarchies. To love everyone is to love no one. To say that you love all humanity is a pretty lie. As former British Prime Minister Theresa May said in 2016, in one of her rare deviations into sense, “if you believe you are a citizen of the world, you are a citizen of nowhere.”

To be American is to partake in the benefits that flow from American freedom, power, wealth, and world leadership. Very few Americans who call themselves global citizens ever actually back up their proclamation by relinquishing any of these benefits—that might be worthy of respect. No, they gladly embrace the benefits of being an American; they’re just too virtuous, in their minds, to embrace the label itself. They’re like young people living off a generous trust fund while sporting an “Eat the Rich” button.

One way of looking at the aftermath of 9/11 is to recognize that many Americans who were simply unable (for very long, anyhow) to dedicate themselves to country were thrust by that jihadist assault into the arms of the only alternative they could imagine—namely, global citizenship. Instead of being usefully dedicated to the liberty and security of their own country in a time of grave threat, they have bailed on America and have found, in global citizenship, a noble-sounding illusion of freedom from patriotic obligation. And in fact they are floating free, hovering above the earthly struggle between good and evil and refusing to take sides—and, moreover, presenting this hands-off attitude as a mark not of cowardice but of cultural sophistication and moral superiority.

To a large extent, the project of global citizenship is about trying to replace the concrete with the abstract, about exchanging the real for the idealistic. It’s a matter of trying to talk Americans into rejecting the pragmatic and industrious patriotism that, yes, made America great, and pushing on them, instead, yet another pernicious utopian ideology of the sort that almost destroyed Europe in the 20th century. It’s a matter of endlessly talking up ideas for radical change on every level of society—from ecological measures that would bring down the world economy to a neurotic obsession with hierarchies of group identity that threatens to destroy America’s social fabric—instead of implementing practical reforms that enjoy popular support and would improve everyone’s life. It’s a matter of trying to persuade ordinary citizens, in the name of some higher good—whether world peace or world health or protection of the planet’s environment—to relinquish their freedom and obey a small technocratic elite. In the final analysis, global citizenship is a dangerous dream, a threat to individual liberty, and an assault on American sovereignty—a menace not only to Americans but to all humanity, and one that should therefore be rejected unambiguously by all men and women of goodwill and at least a modicum of common sense.