By Jonah Goldberg
Friday, December 31, 2010
One of the best things I read all year came in a magazine I don't read, about a subject I don't follow, by an author I don't agree with on nearly anything. But Matt Taibbi's "The Jock's Guide to Getting Arrested" in the August 2010 Men's Journal was simply a great piece of writing.
I bring it up because Michael Vick is in the news, and Taibbi's rule No. 1 for athletes who get arrested is "Don't Suck."
"Before you go out and start committing crimes," Taibbi writes, "it's important to first make sure you're at least slightly better than the 30 or 40 guys the team's assistant GM could instantly pull off some practice squad to replace you. Otherwise you will become fodder for the team's zero-tolerance discipline policy. Conversely, if you're awesome, the line will be, 'There've been some bumps in the road, but hopefully he's learned from that.'"
Enter Vick, a star quarterback for the Atlanta Falcons until he was caught, tried and convicted of dogfighting and animal cruelty. Vick had been warned to abandon the hobby he loved, but he just couldn't resist brutalizing dogs for sport and drowning the losers.
He went to jail for 21 months, lost a vast amount of money, and was publicly shamed for his misdeeds.
As a dog lover of the first order, I can sympathize with the sentiment behind pundit Tucker Carlson's hyperbole about wanting Vick executed for his crimes. But at the end of the day, nearly two years in jail, personal bankruptcy and the loss of some prime playing years is a reasonable punishment, though I think lifetime bans should be more common in professional sports. (Why is betting on, say, basketball more of a reason to ban someone than betting on dogfights? No one drowns the Celtics when they lose.)
Anyway, because Vick is a close student of Taibbi's First Rule, he doesn't suck. Which is why he was picked up by the Philadelphia Eagles. Fine, one more reason not to root for the Eagles.
But the story leapt from the sports pages to the editorial pages because the president called the team's owner, Jeffrey Lurie, to congratulate him for giving Vick another shot in the NFL.
"So many people who serve time never get a fair second chance." He reportedly told Lurie. "It's never a level playing field for prisoners when they get out of jail."
Obama is surely right that ex-cons face a lot of hurdles in life. But is Vick really a good example?
Michael Vick had it all. He received a $37 million signing bonus when he joined the Falcons. Washington Post columnist Richard Cohen, another liberal writer I rarely agree with, is exactly right when he says: "You would think from the commentary that Vick was some poor kid who got caught swiping something so he could get something to eat. You would think he had on the spur of the moment stolen a car, gone for a joy ride -- and collided with a police car at an intersection."
No, this guy was such a self-involved creep that he couldn't stop himself from running a gladiatorial canine torture mill, even though he knew he was risking everything.
And why did Obama wait until Vick's second year with the Eagles to congratulate Lurie for his "brave" decision? It was only after Vick had spent an entire year as a backup quarterback and emerged as a superstar again, taking his team to the playoffs and coming in second in Pro Bowl voting, that Obama made the call. If Lurie's decision was so virtuous, why wait until it panned out?
The answer, again: Vick doesn't suck. At football.
But what of the millions of men released from prison who have little education, few skills and a criminal record that would make any reasonable employer think twice, and then twice again, about hiring them?
If our prisons were releasing top-flight software engineers, physicists and biologists, they'd all get second chances too. Ironically, it's the folks who need government licenses -- doctors, stockbrokers, et al. -- who often can't get second chances in their vocations. Obama could actually do something about that.
How to deal with the ex-con population is a very hard knot. But neither Michael Vick's example nor the self-flattering preening that has come with it helps cut through it.
Friday, December 31, 2010
Could France Go Even Further Right?
By Rachel Marsden
Friday, December 31, 2010
With less than a year and a half to go before the next French presidential election, and current President Nicolas Sarkozy sitting at 35% popularity, could France end up shifting further to the right? The possibility is a good one. Sarkozy won the 2007 race largely by borrowing from the right-wing Front National party’s platform on everything from immigration reform to national security. French presidential elections usually go to two rounds of voting (unless one party wins 50% of the vote in the first round), with the eliminated parties throwing their weight behind one of the two remaining candidates in exchange for concessions and government positions. Sarkozy beat Socialist Segolene Royal in 2007 because he nabbed right-wing and centrist votes after the first round, which put him over the top in the second and final vote.
The advantage Sarkozy had in 2007 was that he had never been given the chance to be in charge. He always had to dodge the long shadow of his party’s leader, Jacques Chirac, whom few outside of France (or even in France) could ever legitimately label right-leaning or laissez-faire. Chirac spent his mandate importing his beloved Africa into France one Muslim immigrant at a time, appreciating the various resulting cultural manifestations such as barbecuing Citroens as a form of public debate, and taking advantage of various sanctions imposed by the world community on dodgy regimes like Saddam Hussein’s to enjoy a market monopoly free of any legitimate competition.
Sarkozy was supposed to be a break from all that. But then something got in the way: France. The Economist, having hailed Sarkozy as the new Napoleon, recently called him the “incredible shrinking president”, and criticized him for not following through on his good ideas. The criticism came from the right, not from the left. The result of gridlock or inaction is maintenance of the status quo, which in France is nanny-state left. And this isn’t what a majority of French voted for when they elected Sarkozy.
A summary of the disappointments:
* Sarkozy pushed DNA-testing legislation through parliament to verify that immigrants arriving under family reunification were actually related to the family members they claimed. But when the bill had passed and was ready to be signed, he told his minister not to sign it.
* After running a campaign emphasizing cultural integration and the secular nature of the French state, Sarkozy sent his prime minister -- the head of the French government -- to open a new mosque, where the prime minister made warm declarations about Islam.
* While the value of the euro is on a roller-coaster ride because some member countries can’t sort out their own messes, to the point of failing outright and requiring a bailout from the others, Sarkozy has been spending his time strong-arming German Chancellor Angela Merkel into coughing up German productivity to help out the deadbeats. Some French don’t understand why they should be forced to pay for Greeks to riot.
* A presidential campaign emphasizing meritocracy has given way to parachuting friends and relatives into positions, and firing critics in both the private and public sectors. The fact that Sarkozy has referred to himself, in various contexts, as the “head of human resources” probably doesn’t help.
* French companies and factories are disappearing from the landscape and setting up shop overseas. Rather than reducing taxes and gold-plated benefits all around and explaining to people that not doing so will choke off their livelihood entirely, Sarkozy just offered the companies goodie-bags* (*cash). Most normal people can’t fathom handing over a wad of money to their significant other without being considered crass, so imagine the reaction of an entire population when money is given to big business – and the money doesn’t come from the giver but rather from other people’s paychecks. “The French should be used to that,” you might say. But no, the French aren’t used to seeing it done that overtly. When their paychecks are stolen from them by the State, it is done underhandedly in the interest of “benefits”, making them think they’re getting something back, or at the very least that it’s going to their employer: the government. In this case, the money just shoots down a tube directly into other people’s hands.
This is not to say that Sarkozy hasn’t made a dent. He’s planted the seed of change in France, such as getting the French used to the idea of working two years longer – a reform that in itself shut down the country. But the change promised in 2007 hasn’t yet become a reality to the extent voters perhaps anticipated.
So what are these voters to do? Well, this time they have an alternative that’s as right-leaning as Sarkozy’s UMP party in theory, and perhaps more so in practice: therein lies the real difference between the two. The Front National party, led by Jean-Marie Le Pen, is set to be taken over in the coming weeks by his 42-year-old daughter, Marine: a smart, tough, articulate lawyer and mother with a knack for leading debate and appealing directly to the people over the heads of the elites. In a recent example, she described Muslims who flood into the streets and jam entire blocks of public space to conduct their prayers to Allah as “occupying” that space. A flood of denunciations came from all the other parties, along with threats from various minority and ethnic interest groups to take legal action over the remark. A Socialist Party leader conceded that the phenomenon is indeed a problem – as everyone else living in the real world and having to manoeuvre around these sessions can plainly see. French people are now supporting Marine Le Pen in record numbers (27% versus Sarkozy’s 35%).
So let’s imagine a scenario that could very well occur if Sarkozy’s popularity continues to decline or remains low. If, in the first round of 2012 presidential voting, Sarkozy’s right-leaning base votes against him and in favour of the Front National (in protest or otherwise), the centrists vote for their own various candidates, dividing the center-right, and the left rallies around the Socialist Party, the result could be a second-round race between the Front National and the Socialist Party. And while centrists may side with the Socialists, the right and traditional UMP voters would rally around the Front National.
In a country where a presidential candidate can go from 65% popularity to tanking at 18% in the first round of voting (Prime Minister Edouard Balladur in 1995), it’s not far-fetched to imagine France now moving further right.
Big Labor’s Snowmageddon Snit Fit
Politicized and abusive unions were deadly during the blizzard.
Michelle Malkin
Friday, December 31, 2010
Diligent English farmers of old once shared a motto about the blessings of work: “Industry produces wealth, God speed the plow.” Indolent New York City union officials who oversee snow-removal apparently live by a different creed: Sloth enhances political power, Da Boss slow the plow.
Come rain or shine, wind, sleet, or blizzard, Big Labor leaders always demonstrate perfect power-grabby timing when it comes to shafting taxpayers. Public-sector unions are all-weather vultures ready, willing, and able to put special-interest politics above the citizenry’s health, wealth, and safety. Confirming rumors that have fired up the frozen metropolis, the New York Post reported Thursday that government sanitation and transportation workers were ordered by union supervisors to oversee a deliberate slowdown of the city’s blizzard cleanup — and to boost their overtime paychecks in the process.
Why such vindictiveness? It’s a cold-blooded temper tantrum against the city’s long-overdue efforts to trim layers of union fat and move toward a more efficient, cost-effective privatized workforce.
Welcome to the Great Snowmageddon Snit Fit of 2010.
New York City councilman Dan Halloran (R., Queens) told the Post that several brave whistleblowers confessed to him that they “were told (by supervisors) to take off routes (and) not do the plowing of some of the major arteries in a timely manner. They were told to make the mayor pay for the layoffs, the reductions in rank for the supervisors, shrinking the rolls of the rank-and-file.”
Denials and recriminations are flying like snowballs. But even as they scoff at reports of this outrageous organized job action, the city sanitation-managers’ unions openly acknowledge their grievances and “resentment” over job cuts. Stunningly, sanitation workers spilled the beans on how city plowers raised blades “unusually high” (which requires extra passes to get their work done) and refused to plow anything other than assigned streets (even if it meant leaving behind clogged routes to get to their blocks).
When they weren’t sitting on their backsides, city plowers were caught on videotape maniacally destroying parked vehicles in a futile display of Kabuki Emergency Theater. It would be laugh-out-loud comedy if not for the death of at least one newborn whose parents waited for an ambulance that never came because of snowed-in streets.
This isn’t a triumphant victory for social justice and workers’ dignity. This is terrifying criminal negligence.
And it isn’t the first time New York City sanitation workers have endangered residents’ well-being. In the 1960s, a Teamsters-affiliated sanitation workers’ strike led to trash fires, typhoid warnings, and rat infestations, as 100,000 tons of rotting garbage piled up. Three decades later, a coordinated job action by city building-service workers and sanitation workers caused another public trash nuisance declared “dangerous to life and health” in the Big Apple.
New Yorkers could learn a thing or two from those of us who call Colorado Springs home. We have no fear of being held hostage to a politically driven sanitation department — because we have no sanitation department. We have no sanitation department because enlightened advocates of limited government in our town realized that competitive bidders in the private sector could provide better service at lower cost.
And we’re not alone. As the Mackinac Center for Public Policy in Michigan reported: “The largest study ever conducted on outsourced garbage collection, conducted by the federal government in the 1970s, reported 29 to 37 percent savings in cities with populations over 50,000. A 1994 study by the Reason Foundation discovered that the city of Los Angeles was paying about 30 percent more for garbage collection than its surrounding suburbs, in which private waste haulers were employed. A 1982 study of city garbage collection in Canada discovered an astonishing 50 percent average savings as a result of privatization.”
Completely privatized trash collection means city residents don’t get socked with the bill for fraudulently engineered overtime pay, inflated pensions, and gold-plated health benefits in perpetuity — not to mention the capital and operating costs of vehicles and equipment. The Colorado Springs model, as city councilman Sean Paige calls it, is a blueprint for how every city can cope with budget adversity while freeing itself from thuggish union threats when contracts expire or cuts are made. Those who dawdled on privatization efforts in better times are suffering dire, deadly consequences now.
Let the snow-choked streets of New York be a lesson for the rest of the nation: It’s time to put the Big Chill on Big Labor–run municipal services.
Thursday, December 30, 2010
The American 21st Century
America’s rivals lack the culture necessary to sustain greatness.
Victor Davis Hanson
Thursday, December 30, 2010
The current debt, recession, wars, and political infighting have depressed Americans into thinking their country soon will be overtaken by more vigorous rivals abroad. Yet this is an American fear as old as it is improbable.
In the 1930s, the Great Depression supposedly marked the end of freewheeling American capitalism. The 1950s were caricatured as a period of mindless American conformity, McCarthyism, and obsequious company men.
By the late 1960s, the assassinations of John and Robert Kennedy and Martin Luther King Jr., along with the Vietnam War, had fueled a hippie counterculture that purportedly was going to replace a toxic American establishment. In the 1970s, oil shocks, gas lines, Watergate, and new rustbelts were said to be symptomatic of a post-industrial, has-been America.
At the same time, other nations, we were typically told, were doing far better.
In the late 1940s, with the rise of a postwar Soviet Union that had crushed Hitler’s Wehrmacht on the eastern front during World War II, Communism promised a New Man as it swept through Eastern Europe.
Mao Zedong took power in China and inspired Communist revolutions from North Korea to Cuba. Statist central planning was going to replace the unfairness and inefficiency of Western-style capitalism. Yet just a half-century later, Communism had either imploded or been superseded in most of the world.
By the early 1980s, Japan’s state capitalism, along with its emphasis on the group rather than the individual, was being touted as the ideal balance between the public and private sectors. Japan Inc. continually outpaced the growth of the American economy. Then, in the 1990s, a real-estate bubble and a lack of fiscal transparency led to a collapse of property prices and a general recession. A shrinking and aging Japanese population, led by a secretive government, has been struggling ever since to recover the old magic.
At the beginning of the 21st century, the European Union was hailed as the proper Western paradigm of the future. The euro soared over the dollar. Europe practiced a sophisticated “soft power,” while American cowboyism was derided for getting us into wars in Afghanistan and Iraq. Civilized cradle-to-grave benefits were contrasted with the frontier, every-man-for-himself American system.
Now Europe limps from crisis to crisis. Its undemocratic union, coupled with socialist entitlements, is proving unsustainable. Symptoms of the ossified European system appear in everything from a shrinking population and a growing atheism to an inability to integrate Muslim immigrants or field a credible military.
As we enter this new decade, we are being lectured that China is soon to be the global colossus. Its economy is now second only to America’s, but with a far faster rate of growth and with budget surpluses rather than debt. Few seem to mention that China’s mounting social tensions, mercantilism, environmental degradation, and state bosses belong more to a 19th- than a 21st-century nation.
Amid all this doom and gloom, two factors are constant over the decades. First, America goes through periodic bouts of neurotic self-doubt, only to wake up and snap out of it. Indeed, indebted Americans are already bracing for fiscal restraint and parsimony as an antidote to past profligacy.
Second, decline is relative and does not occur in a vacuum. As Western economic and scientific values ripple out from Europe and the United States, it is understandable that developing countries like China, India, and Brazil can catapult right into the 21st century. But that said, national strength is still measured by the underlying hardiness of the patient — its demography, culture, and institutions — rather than by occasional symptoms of ill health.
In that regard, America integrates immigrants and assimilates races and ethnicities in a way Europe cannot. Russia, China, and Japan are simply not culturally equipped to deal with millions who do not look Slavic, Chinese, or Japanese. The Islamic world cannot ensure religious parity to Christians, Jews, or Hindus — or political equality to women.
The American Constitution has been tested over 223 years. In contrast, China, the European Union, India, Japan, Russia, and South Korea have constitutional pedigrees of not much more than 60 years. The last time Americans killed each other in large numbers was nearly a century and a half ago; most of our rivals have seen millions of their own destroyed in civil strife and internecine warring just this past century.
In short, a nation’s health is gauged not by bouts of recession and self-doubt, but by the durability of its political, economic, military, and social foundations. A temporarily ill-seeming America is nevertheless still growing, stable, multiethnic, transparent, individualistic, self-critical, and meritocratic; almost all of its apparently healthy rivals, by contrast, are not.
She Told Us So
By Cal Thomas
Thursday, December 30, 2010
Sarah Palin deserves an apology. When she said that the new health-care law would lead to "death panels" deciding who gets life-saving treatment and who does not, she was roundly denounced and ridiculed.
Now we learn, courtesy of one of the ridiculers -- The New York Times -- that she was right. Under a new policy not included in the law for fear the administration's real end-of-life game would be exposed, a rule issued by the recess-appointed Dr. Donald M. Berwick, administrator of the Centers for Medicare and Medicaid Services, calls for the government to pay doctors to advise patients on options for ending their lives. These could include directives to forgo aggressive treatment that could extend their lives.
This rule will inevitably lead to bureaucrats deciding who is "fit" to live and who is not. The effect this might have on public opinion, which by a solid majority opposes Obamacare, is clear from an e-mail obtained by the Times. It is from Rep. Earl Blumenauer (D-Ore.), who sent it to people working with him on the issue. Oregon and Washington are the only states with assisted-suicide laws, a preview of what is to come at the federal level if this new regulation is allowed to stand. Blumenauer wrote in his November e-mail: "While we are very happy with the result, we won't be shouting it from the rooftops because we aren't out of the woods yet. This regulation could be modified or reversed, especially if Republican leaders try to use this small provision to perpetuate the 'death panel' myth."
Ah, but it's not a myth, and that's where Palin nailed it. All inhumanities begin with small steps; otherwise the public might rebel against a policy that went straight to the "final solution." All human life was once regarded as having value, because even government saw it as "endowed by our Creator." This doctrine separates us from plants, microorganisms and animals.
Doctors once swore an oath, which reads in part: "I will not give a lethal drug to anyone if I am asked, nor will I advise such a plan; and similarly I will not give a woman a pessary to cause an abortion." Did Dr. Berwick, a fan of rationed care and the British National Health Service, ever take that oath? If he did, it appears he no longer believes it.
Do you see where this leads? First the prohibition against abortion is removed and "doctors" now perform them. Then the assault on the infirm and elderly begins. Once the definition of human life changes, all human lives become potentially expendable if they don't measure up to constantly "evolving" government standards.
It will all be dressed up with the best possible motives behind it and sold to the public as the ultimate benefit. The killings, uh, terminations, will take place out of sight so as not to disturb the masses who might have a few embers of a past morality still burning in their souls. People will sign documents testifying to their desire to die, and the government will see it as a means of "reducing the surplus population," to quote Charles Dickens.
When life is seen as having ultimate value, individuals and their doctors can make decisions about treatment that are in the best interests of patients. But when government is looking to cut costs as the highest good and offers to pay doctors to tell patients during their annual visits that they can choose to end their lives rather than continue treatment, that is more than the proverbial camel's nose under the tent. That is the next step on the way to physician-assisted suicide and, if not stopped, government-mandated euthanasia.
It can't happen here? Based on what standard? Yes it can happen in America, and it will if the new Republican class in Congress doesn't stop it.
Wednesday, December 29, 2010
As Gay Becomes Bourgeois
An ironic progressive victory
Jonah Goldberg
Wednesday, December 29, 2010
So now openly gay soldiers get to fight and die in neocon-imperialist wars too?
David Brooks saw such ironic progressive victories coming. In his book Bobos in Paradise, he wrote that everything “transgressive” gets “digested by the mainstream bourgeois order, and all the cultural weapons that once were used to undermine middle-class morality . . . are drained of their subversive content.”
Two decades ago, the gay Left wanted to smash the bourgeois prisons of monogamy, capitalistic enterprise, and patriotic values and bask in the warm sun of bohemian “free love” and avant-garde values. In this, they were simply picking up the torch from the straight Left of the 1960s and 1970s, who had sought to throw off the sexual hang-ups of their parents’ generation along with their gray flannel suits.
As a sexual-lifestyle experiment, they failed pretty miserably, the greatest proof being that the affluent and educated children (and grandchildren) of the baby boomers have re-embraced the bourgeois notion of marriage as an essential part of a successful life. Sadly, it’s the lower-middle class that increasingly sees marriage as an out-of-reach luxury. The irony is that such bourgeois values — monogamy, hard work, etc. — are the best guarantors of success and happiness.
Of course, the lunacy of the bohemian free-love shtick should have been obvious from the get-go. For instance, when Michael Lerner, a member of the anti–Vietnam War “Seattle Seven,” did marry, in 1971, the couple exchanged rings made from the fuselage of a U.S. aircraft downed over Vietnam and cut into a cake inscribed in icing with a Weatherman catchphrase, “Smash Monogamy.”
Today Lerner is a (divorced and remarried) somewhat preposterous, prosperous progressive rabbi who officiates at all kinds of marriages — gay and straight — and, like pretty much the entire Left, loves the idea of open gays becoming cogs in the military-industrial complex.
The gay experiment with open bohemianism was arguably shorter. Of course, AIDS played an obvious and tragic role in focusing attention on the downside of promiscuity. But even so, the sweeping embrace of bourgeois lifestyles by the gay community has been stunning.
Nowhere is this more evident — and perhaps exaggerated — than in popular culture. Watch ABC’s Modern Family. The sitcom is supposed to be “subversive” in part because it features a gay couple with an adopted daughter from Asia. And you can see why both liberal proponents and conservative opponents of gay marriage see it that way. But imagine you hate the institution of marriage and then watch Modern Family’s hardworking bourgeois gay couple through those eyes. What’s being subverted? Traditional marriage, or some bohemian identity-politics fantasy of homosexuality?
By the way, according to a recent study, Modern Family is the No. 1 sitcom among Republicans (and the third show overall behind Glenn Beck and The Amazing Race) but not even in the top 15 among Democrats, who prefer darker shows like Showtime’s Dexter, about a serial killer trying to balance work and family between murders.
Or look at the decision to let gays openly serve in the military through the eyes of a principled hater of all things military. From that perspective, gays have just been co-opted by The Man. Meanwhile, the folks who used Don’t Ask, Don’t Tell as an excuse to keep the military from recruiting on campuses just saw their argument go up in flames.
Personally, I have always felt that gay marriage was an inevitability, for good or ill (most likely both). I do not think that the arguments against gay marriage are all grounded in bigotry, and I find some of the arguments persuasive. But I also find it cruel and absurd to tell gays that living the free-love lifestyle is abominable while at the same time telling them that their committed relationships are illegitimate too.
Many of my conservative friends — who oppose both civil unions and gay marriage and object to rampant promiscuity — often act as if there’s some grand alternative lifestyle for gays. But there isn’t. And given that open homosexuality is simply a fact of life, the rise of the HoBos — the homosexual bourgeoisie — strikes me as good news.
Tuesday, December 28, 2010
Back to Declinism
The world will be worse without a liberal hegemon.
Michael Auslin
Tuesday, December 28, 2010
In the judgment of Yale historian Paul Kennedy, a world in which a shrunken America is just primus inter pares, “one of the most prominent players in the small club of great powers,” is all but inevitable, a natural turning of the seasons. While many on the left would welcome such an egalitarian future — it is not “a bad thing,” Kennedy claims — the rest of the world, especially liberal-democratic nations, may quibble just a bit with this rather prosaic and utilitarian view of global power.
Kennedy — who was my colleague at Yale for most of the last decade — made his reputation by limning the “rise and fall of great powers,” and his most recent article in The New Republic, “Back to Normalcy,” is but a variant of this oft-played theme. Galloping widely through the last half-millennium of Western history, he purports to show how America’s global position rests on an increasingly unstable three-legged stool of “soft power,” economic power, and military power. Each is eroding as other nations rise. In Kennedy’s telling, the ability to challenge America for regional or possibly global leadership is merely a matter of aping American models and asserting the national will.
This view may not be incorrect in tracing current trends and perhaps even in sketching the rough contours of the near future. Yet this argument lacks any moral component and is overly dismissive of the sources of both domestic power and global stability.
In the clinical view that Kennedy takes, the United States (and before it, Great Britain) is, in the end, simply interchangeable with all previous and future great powers, and its unique domestic society and global behavior are but epiphenomenal. The best that Kennedy can do is a grudging acknowledgment that “we should all be careful about wishing away a reasonably benign American hegemony; we might regret its going.” Such are the wages paid to nearly two centuries of liberal growth and international stewardship.
Nowhere in Kennedy’s ruminations do the words “liberty” or “freedom” appear, and he mentions democracy only once, when recounting Harvard political scientist Joseph Nye’s analysis of the global effects of American power. The indivisible links among a given society’s domestic structure, its national strength, and its international position are thus dismissed.
There is no recognition either that national greatness over the long term must come from the character of a given society or that the nature of the hegemonic country will determine to a great degree the nature of the international system. It is no accident that Great Britain and the United States became so powerful and stable. Surely, the unique combination of ever-more-efficient capitalist financing and production along with slowly expanding democracy should be accounted the prime source of Anglo-American greatness.
To this potent mix were accreted generations of evolution in social trust, transparency in economic interactions, and the firming of the bedrock of law. All of these accelerated industrial development, and thereby human wellbeing, more evenly and for a longer period of time than did any other system in history. To treat them as but an incidental factor in the rise to power of Great Britain and the United States is to eviscerate one’s explanatory scheme and make any predictions a risky gamble. China’s attempt to continue growing without developing mechanisms of self-expression, trust, and impartial law is an experiment the world will watch keenly over the coming decades, but only a reckless prognosticator would assume that such growth is inevitable. It is not so simple as playing “catch-up,” in Paul Kennedy’s words, even if America’s relative economic strength will indeed decline over time.
Kennedy’s thesis also glides over the sources of international norms, stability, and the provision of global public goods. On the global stage, countries ruled by law and in which the people are sovereign act very differently from those ruled by an oligarchy. Kennedy’s realist vision is one in which states are but indistinguishable billiard balls, knocking into each other on a cosmic pool table. Yet cultural preferences must surely explain the post-1945 international behavior of Washington, London, Ottawa, and Canberra, to name a few. Providing public goods through such actions as ensuring freedom of the seas, hosting and nurturing international organizations, and supporting democracy worldwide marked the second half of the 20th century as one of the most progressive in history. No one can seriously think that today’s liberal international order, as flawed and often ineffective as it is, is less desirable than a 19th-century colonialist balance-of-power system, or than a future dominated by China, Russia, Iran, and Venezuela.
Kennedy’s argument further presumes that America’s hegemonic desires alone are the cause of the spread of democracy around the globe. He is dead wrong when he writes that “American hopes of reshaping Asia sometimes look curiously like former British hopes of reshaping the Middle East. Don’t go there.” Democratic triumphs in Taiwan, South Korea, Mongolia, and the Philippines (to say nothing of Eastern Europe nearly a generation ago) did not originate in Washington, D.C., though American moral and material aid certainly helped stabilize new representative regimes. To ignore the liberal striving of millions is both to cruelly dismiss their collective courage and to lay the groundwork for stripping American foreign policy of its moral wellspring. By continuing to acknowledge and aid the genuine aspirations for greater democracy in many countries, America and other liberal nations not only live up to their ideals, but also more effectively marshal their resources and help swell the tide of freedom.
The “normal” world envisioned by Kennedy also apparently eschews such distinctions as “authoritarianism” and “totalitarianism,” which get no mention in his argument. Should the world accept aggressive behavior in the global commons simply because that reflects the nature of the most powerful actors? Is such a world preferable to one in which the United States promotes liberal values even if it has outsized capabilities and influence? A world in which America is but one leading power among others will very likely witness reduced levels of social and economic development and political stability. History should make us very wary of assuming that a world without a liberal hegemon will be one in which liberal values and benign stewardship continue to shape the international system, especially when many of the most powerful actors are authoritarian.
In the end, Paul Kennedy may be right about the current trajectory of global politics. But we should all recognize that this would be a terrible state of affairs. It is one thing to see that such a world might be emerging, quite another to welcome it. Kennedy’s deep ambivalence about the positive role of the United States, both now and in the future, is matched by a realpolitik he appears to regard as not merely normal but perhaps even preferable to a system in which an imperfect America attempts to nourish the liberal values that have made our world a far more humane place than at any other time in history.
Monday, December 27, 2010
A Very "Bad Day" For an American
By Bruce Bialosky
Monday, December 27, 2010
Complaining can be the bane of a columnist. Most of what we do is observe other people and tell them what they’re doing wrong. Despite that, I want to share with you a “horrible” day.
Recently, my wife and I went to a UCLA basketball game. I was already in a grumpy mood, having to get dressed and go out on a rainy Sunday night. UCLA had just returned from a heartbreaking loss at Kansas during which they showed signs that this year’s team was on the rebound from last year’s disappointing season. But they just blew this game amidst a truly poor outing. The 9th ranked women’s team would have likely fared better. Like the rest of the crowd, we left the arena early in a very dejected frame of mind.
When we arrived home, we found that the power was out in our neighborhood. In Los Angeles, a little rain wreaks havoc. We always enter our house through the garage and never use a house key, but without any power our garage door wouldn’t open, and we had neglected to take our house key with us. We spent a few moments in our driveway, wistfully looking at our lovely but powerless house that we couldn’t enter. Ultimately, we were able to contact our house-sitter, who thankfully had a key. The 30-minute round-trip drive proved to be a waste, as we returned home to find the power restored. On top of that, I found that my article had not been posted by my editor, meaning it wouldn’t be available as usual on Monday. With that, I wondered which shoe would drop next.
For most Americans, this is what constitutes a bad day. Before someone spouts off about how insensitive I am about the struggles that people face, let me be clear that I am well aware of the suffering in this country. There are thousands of Americans who have lost their jobs and homes in this economic downturn, or are fighting horrible diseases, or were born into either severe economic conditions or to irresponsible parents.
But let’s face it: the worst day for most Americans would be a killer day for most of the rest of the world. Tom Friedman, author and New York Times columnist, loves to write about the Chinese surge and how America could learn from Chinese methods. To put things in perspective, the vast majority of mainland Chinese would love to be living at what we call our poverty line. China has more people living in unimaginable poverty than twice America’s total population. And when China does surpass us in total Gross Domestic Product, its population of more than four times ours will mean that the average Chinese citizen is living at roughly 25 percent of our standard. So really, how bad do we have it?
America faces a lot of challenges, but what’s new? We have confronted challenges before and have overcome them. Whenever we’ve been told that our country was falling down a sinkhole, we have woken up to say “This is America – we have to fix things because we owe it to the next generation.”
Isn’t that what happened over the last two years, as Americans realized that their government cannot and should not do everything? We don’t want to be like Europe! So on November 2nd, we stood up and said “Halt!”
Even while 63% of Americans believe that this country is going in the wrong direction, we cannot depend on our elected officials to change that course. We must stop and declare that this country means something, something different than any country ever created, and that we demand to keep it that way.
I am never pessimistic about this country. I am always thankful for the right to live here and my birthright as an American. If you ever forget what this country is about, do something to connect to our basic values. You don’t need to recall the heroes of our past or focus on the miracles of our creative economy or applaud our Nobel Prize winners. Next fall spend your Friday night at a high school football game or a Saturday at a college football game. You will reconnect with the sense of community and core values of America and be reminded that we are different and that we have something special.
During this holiday season, be thankful that we have created this exceptional country and way of life. Now this Jewish boy is going to listen to some wonderful Christmas music, enjoy my family, and play with my puppies. And remember, it’s only a little over a month until pitchers and catchers report for spring training.
Friday, December 24, 2010
Enumerated Powers Make A Comeback
Judge Hudson believes there is such a thing as limited government.
Mona Charen
Friday, December 24, 2010
It was always for our own good. It was always for a very good reason. It was always within the American tradition of this, that, or the other.
That’s what they’ve told us; that’s how they’ve patronized us, for generations, as the long tendrils of the federal government have spread and multiplied into every realm of American life. It had become so utterly unremarkable, this robotic and seemingly inexorable aggrandizement of federal power, that when Speaker Nancy Pelosi was asked, in 2009, where in the Constitution Congress was granted the authority to force people to buy health insurance, she didn’t even seem to understand the question. “Are you serious?” she asked. “Are you serious?”
But Judge Henry Hudson (don’t you love the historically resonant name?) was very serious when he ruled that the Constitution created a federal government of “enumerated powers,” and that limits on those powers have continuing force. He’s not only serious, he’s cautious and learned. And he represents something we wouldn’t necessarily have predicted back in 2008 when it seemed a new liberal hegemony would unfold over the next 25 years: a principled backlash against federal overreach. Those Tea Party protesters in their Founders costumes may have looked ridiculous to Nancy Pelosi and Harry Reid, but their interest in seemingly antique concepts such as limited government is showing up more and more. In just one month, a federal judge has ruled that the Commerce Clause cannot be stretched to cover absolutely everything the Congress wishes to do, and a chorus of limited-government voices has noisily protested the Federal Communications Commission’s attempt to assert control over the Internet.
It isn’t possible for the former speaker or her allies in the federal juggernaut to dismiss Judge Hudson as “astroturf.” In a carefully reasoned decision, he took note of Congress’s power to regulate under the Commerce Clause. “But these regulatory powers,” he ruled, “are triggered by some type of self-initiated action. Neither the Supreme Court nor any federal circuit court of appeals has extended Commerce Clause powers to compel an individual to involuntarily enter the stream of commerce by purchasing a commodity in the private market.” Judge Hudson continued, “The unchecked expansion of congressional power to the limits suggested would invite unbridled exercise of federal police powers” whereas “Article I, Section 8 of the Constitution confers upon Congress only discrete enumerated governmental powers.”
Together with challenges to the health-care law mounted by 19 states, as well as differing judgments in other jurisdictions, the stage is now set for the Supreme Court to decide the issue. If the court decides, contra Judge Hudson, that the Commerce Clause can indeed be stretched to cover anything, it won’t be the first time. In 1942, the Court (just a few years after FDR’s court-packing plan) ruled that the Commerce Clause could justify the regulation even of intrastate commerce. “The marketing of intrastate milk,” wrote the Court in the 1942 Wrightwood Dairy case, “which competes with that shipped interstate would tend seriously to break down price regulation of the latter.”
An even more far-fetched bit of Court reasoning followed in Wickard v. Filburn (1942), in which the federal government fined a farmer who raised wheat for his own consumption. The rationale: By eating his own wheat, the farmer did not buy wheat, and this non-participation in the market for wheat affected interstate commerce. Those who cannot imagine the Court upholding a requirement that individuals buy a particular product (health insurance) should think again.
Since the New Deal, and particularly during the civil-rights era, the Commerce Clause has been interpreted capaciously to permit the government to do good (actual good in the civil-rights cases, perceived good in the New Deal cases). But no matter what the motive, the effect was to vitiate the Constitution’s principle of enumerated powers. A more limited understanding of the Commerce Clause emerged in the 1990s when the Supreme Court struck down the Gun-Free School Zones Act. The Court’s composition has changed since then.
But the mood of the country is changing too. Everywhere you look, assertions of power are being questioned. When the FCC announced plans to regulate the Internet in the name of so-called “net neutrality,” dozens of congressmen protested that the agency was exceeding its authority. Dissenting commissioner Robert McDowell dubbed it “jaw dropping interventionist chutzpah.” But the comment that captured the new mood of respect for limited powers came from Sen. Mitch McConnell. “The Internet,” he said, “should be left alone.” Yes, for starters.
Islamists’ War against ‘the Other’
Under Islamist pressures, Christians, Jews, and Zoroastrians are vanishing from their ancient homelands.
Nina Shea
Thursday, December 23, 2010
The enduring symbol of Christmas, spanning the world’s diverse Christian cultures and the history of two millennia, is the nativity scene inspired by the gospels of Matthew and Luke. Artistically synthesizing the two gospel stories, the nativity scene is infused with profound Christian meaning and symbolism.
John the Baptist, whose own birth is linked to Jesus’s in Luke’s account, exhorts Christians to “prepare the way of the Lord,” and traditionally many do so during the Christmas season by meditating on these tender devotional scenes. One of the earliest surviving is a 5th-century bas relief from Naxos, Greece. Whether modern nativity scenes are modeled on the famous “live crèches” staged by St. Francis of Assisi in the 13th century, those painted by Renaissance artists, the Baroque Neapolitan crèches (one is displayed in the White House), or simple folk versions, they remain popular worldwide.
This year, one aspect of the nativity scene deserves special reflection. Gathered around the manger that serves as the Christ Child’s cradle are representatives of three ancient religious groups indigenous to the region: Mary and Joseph, the first Christians; the shepherds of Bethlehem, the Jewish “city of David”; and the Magi, the name for Zoroastrian priests, who followed a celestial sign from their home in the East looking for the “King of the Jews.” (Though not depicted in the nativity art, John the Baptist himself attracted many followers, some of whom never converted to Christianity and became known as Sabean Mandeans.)
These figures in the Christmas story represent the principal monotheistic religions of Middle Eastern antiquity. It would not be until six centuries later that Islam arose in the Arabian peninsula. Even today, the Christians, Jews, Zoroastrians, and a group the Zoroastrians inspired, the Yezidis, as well as the Sabean Mandeans, constitute the main non-Islamic religions in the Greater Middle East.
But this is coming to an end. Since 2004, a relentless wave of Islamist terrorist attacks targeting Iraq’s indigenous Christians has prompted that group to flee en masse. At the time of Saddam Hussein’s fall, the number of Chaldean Catholics, Assyrian Orthodox, Armenians, Syriacs, and other Christians in Iraq was estimated at 1.4 million. Half of these Christians have since fled, and some observers speculate that this may well be the last Christmas in Iraq for the half remaining. In fact, it’s not just the Christian community that faces existential threats, and it is not just in Iraq. Every one of the indigenous religious communities evoked by the nativity story is disappearing from the region’s Muslim-majority countries.
Religious demographics are kept as state secrets in the Muslim Middle East, and most of those countries’ governments have not conducted a census in decades. Still, while the data are soft, it is established that Christians are by far the largest remaining non-Muslim group, and that they are clustered principally in Egypt, Iraq, and the Levant. It is estimated that they number no more than 15 million, a minute fraction of the region’s overall population. Lebanese scholar Habib Malik writes that these Christians are in a state of “terminal regional decline.”
The majority are Egypt’s Copts, numbering between 8 and 12 million. A year ago, Coptic worshippers were massacred during a Christmas Eve attack on their church in Naga Hammadi in southern Egypt, and several Coptic villages have been targeted by pogrom-like mob violence. In recent decades, Lebanon’s Christians have seen a sharp drop in their numbers, down from the majority there to one-third of the population, about 1.5 million. Syria has about 1 million; Jordan, about 185,000. The West Bank has about 50,000, and Gaza, 1,000 to 3,000. In Turkey, the site of Constantinople, which was the center of Byzantine Christianity from the 4th to the 15th century, some 100,000 Christians remain, less than 0.2 percent of the population. Iran counts about 300,000 Christians. Not all those who have fled from Iraq have left the region. About 60,000 have found refuge in Syria, for example. However, their presence is tenuous: They are barred from working and aid from abroad is scarce; some of the women have turned to prostitution, according to the Chaldean Catholic bishop of Aleppo, Antoine Audo, SJ.
The Persian Gulf region and northern Africa have long since been “cleansed” of their indigenous Christian churches. Native Christians — mostly evangelicals, probably numbering in the thousands — worship largely in secret; Saudi Arabia has only one publicly known native Christian, the oft-imprisoned and extremely courageous Hamoud Saleh Al-Amri. Foreign workers, including over a million Christians, now living in Saudi Arabia and the Gulf are denied rights of citizenship and, in the former, even the right to have churches. Morocco summarily deported scores of foreign Christian educators and social workers last spring.
The other religions have contracted even more sharply in the Muslim Middle East. Since the establishment of the state of Israel, some of the region’s Jews voluntarily left Muslim-majority countries; but as many as 850,000 of them, such as the Jews of Baghdad sixty years ago, were driven out, forced to leave land and possessions behind, by freelance terror and government policies. The parts of Iraq, Egypt, and Yemen that had been great Jewish centers since Old Testament times now have Jewish populations numbering in single, double, and triple digits, respectively. Estimates of Morocco’s native Jewish community, now the largest in the Arab Middle East, range from 2,000 to 6,000. Iran is home to 20,000 or so Jews. Turkey has 25,000.
Zoroastrians, based on the plains of Iran since their religion’s founding somewhere between 1800 and 1500 B.C. by the devotional poet Zarathustra, are estimated to number between 45,000 and 90,000. Iran scholar Jamsheed Choksy has documented (see “Religious Cleansing in Iran,” by Nina Shea and Jamsheed K. Choksy, July 22, 2009) a “steady decline through emigration away from Iran since the Islamic Republic’s intolerance toward minorities began in 1979.” Iran’s largest non-Muslim community is the Baha’i, a faith founded after Islam in Shiraz, in southwestern Iran, and severely repressed as a heresy; Baha’is in Iran number about 350,000. Non-Muslim communities collectively have diminished to no more than 2 percent of Iran’s 71 million people.
Yezidis, who draw upon Zoroastrian beliefs, are found in northern Iraq; hundreds of thousands of them have fled in recent years, leaving half a million still in their native land. Sabean Mandeans, mostly based in Baghdad and Basra, are down to one-tenth of their pre-2003 population of 50,000.
In past centuries, Islamic conversion by the sword and pressures under the grossly discriminatory dhimmi system took their toll on the Middle East’s “People of the Book” (Jews, Christians, and Zoroastrians). Now, factors such as lower birth rates and emigration, driven by conflict at home and economic opportunities abroad, are commonly offered to explain these communities’ accelerating decline. Less plausibly, the region’s rulers, along with the Western academics many of them fund or otherwise induce and the religious leaders they essentially hold hostage, also blame Zionism.
But the leading and most obvious factor, one on full display during the Baghdad church massacre this October, is rarely openly acknowledged or discussed: the rise of extremist Islamist movements and the fact that most of the region’s governments finance, sympathize with, or appease them, or are too weak to keep them under control.
The fact that within the Muslim Middle East indigenous non-Muslim religious communities across the spectrum — Christians of every denomination, Jews, Zoroastrians, Sabean Mandeans, Yezidis, Baha’is — are all rapidly heading toward extinction, while Muslim sects flourish in the same areas, points to this underlying phenomenon of Islamic radicalism.
Writing on this issue, Fouad Ajami eloquently put it this way: “The Islamists are doubtless a minority in the world of Islam. But they are a determined breed. Their world is the Islamic emirate, led by self-styled ‘emirs and mujahedeen in the path of God’ and legitimized by the pursuit of the caliphate that collapsed with the end of the Ottoman Empire in 1924. These masters of terror and their foot soldiers have made it increasingly difficult to integrate the world of Islam into modernity. . . . But the borders these warriors of the faith have erected between Islam and ‘the other’ are particularly forbidding. The lands of Islam were the lands of a crossroads civilization, trading routes and mixed populations. The Islamists have waged war, and a brutally effective one it has to be conceded, against that civilizational inheritance.”
We in the free West have a duty toward these endangered communities, especially Iraq’s besieged and abandoned Christians. Donations can be made to the Catholic Chaldean Federation; St. George’s ecumenical congregation in Baghdad, led by Anglican canon Andrew White; and the Assyrian-aid organization associated with Archdeacon Emanuel Youkhana of the Assyrian Church of the East. Gazing on the crèche this Christmas, let us prayerfully reflect on what each of us can do to help.
Merry Christmas to America’s Top 1 Percent
The rich are more like Santa than Scrooge.
Deroy Murdock
Thursday, December 23, 2010
In this season of giving, the words hurled at America’s wealthiest citizens have been far from generous.
The recent debate over the Obama-GOP tax-cut compromise featured language best described as “affluphobic.”
Sen. Bernie Sanders of Vermont, a self-styled socialist, spent nearly nine hours on December 10 excoriating affluent Americans. Sanders complained to colleagues that “when the rich get richer…they say: ‘I am not rich enough. I need to be richer.’ What motivates some of these people is greed and greed and more greed.” Sanders further filibustered: “Greed is, in my view, like a sickness. It’s like an addiction. We know people on heroin. They can’t stop. They need more and more.”
Sanders wailed that the top 1 percent of taxpayers (who made more than $380,354 apiece) earned 20 percent of America’s Adjusted Gross Income (AGI) in 2008, according to IRS data analyzed by the Tax Foundation. True. They also paid 38 percent of all federal income taxes. The top 5 percent (with incomes exceeding $159,619) earned 34.7 percent of AGI and paid 58.7 percent of taxes. The top 10 percent (with incomes above $113,799) earned 45.8 percent of AGI and paid 69.9 percent of federal income taxes.
So, do these rich people pay their “fair share”? If not, should the top 10 percent finance 75 percent of income taxes? Eighty percent?
In contrast, the bottom 50 percent of taxpayers generated 12.8 percent of AGI and paid 2.7 percent of all federal income taxes.
High-income taxpayers also cough up state and local levies and often pay taxes on sales, property, capital gains, dividends, partnerships, and corporate income. Their wealth floods public coffers and flows into government programs, many targeted at low-income Americans.
So what? Generosity is a snap when tax authorities demand tribute. How do the rich behave absent government coercion?
“These people who are worth hundreds of millions of dollars,” Sanders stated on the Senate floor, “Maybe they’ve got to go back to the Bible or whatever they believe in understanding that there is virtue in sharing, in reaching out, that you can’t get it all.”
Sanders should appreciate these IRS data:
To be surgically precise, as Ryan Ellis of Americans for Tax Reform notes, an IRS review of Returns with Itemized Deductions (columns CI and CJ) indicates that in tax year 2008, Americans who earned at least $200,000 filed 3,912,225 tax returns, or 9.96 percent of that year’s 39,250,369 itemized returns. This group deducted $72,336,640,000 in charity, or 41.83 percent of the $172,936,002,000 in such deductions that all filers claimed. In short, roughly the top 10 percent of itemizing taxpayers claimed 42 percent of all charitable deductions, worth $72 billion in 2008 alone.
To understand wealthy Americans’ “virtue in sharing,” consider The 2010 Bank of America Merrill Lynch Study of High Net Worth Philanthropy. Conducted by Indiana University’s Center on Philanthropy and released November 9, this fascinating document (recommended by the National Taxpayers Union’s Andrew Moylan) finds rich people doing what Senator Sanders asked.
This survey included 801 respondents who made at least $200,000 and/or enjoyed at least $1 million in net worth, excluding housing. The average respondent was worth $10.7 million.
Among these multi-millionaires, 98.2 percent contributed to charity, versus just 64.6 percent of the general population. The wealthy typically gave away about 8 percent of their incomes in 2009. This figure has slipped as the economy has slid. In 2007’s survey, the rich donated between 9.3 percent and 16.1 percent of income.
In 2009, 26.8 percent of Americans volunteered with charitable organizations. However, 78.7 percent of wealthy people volunteered — nearly triple the national figure. The average rich respondent volunteered 307 hours. Rather than merely write checks, the average wealthy American last year gave to charity the equivalent of 38 eight-hour shifts.
The Center on Philanthropy’s researchers valued each hour of voluntarism at $20.85. So, the average rich American’s 307 volunteer hours equaled $6,400.95.
“High net worth households play an important role in the philanthropic landscape,” the Bank of America study concluded. “They give between 65 and 70 percent of all individual giving and between 49 and 53 percent of giving from all sources, which includes giving from corporations, foundations, and both living and deceased individuals.”
Some highly wealthy individuals give enough to rename entire institutions after themselves. New York University’s Medical Center was rechristened the NYU Langone Medical Center after venture capitalist (and plumber’s son) Ken Langone donated $200 million without restrictions in 2008.
That same year, Lincoln Center’s New York State Theater was redubbed the David H. Koch Theater after the businessman and free-market activist contributed $100 million to renovate the New York City Ballet’s home stage.
Nonetheless, some remain utterly unimpressed with America’s wealthy. According to Cape Cod cops and fire investigators, on November 24, an arsonist torched a $500,000 house under construction in Sandwich, Mass. On December 2, an arson attempt almost destroyed a Marston’s Mills home. At both crime scenes, someone graffitied “F— the rich.”
On December 14, Clay Duke opened fire on a Panama City, Fla., school board meeting before fatally shooting himself. His online “last testament” linked to left-wing websites, including WikiLeaks and mediamatters.org, and echoed today’s anti-rich themes.
“I was just born poor in a country where the Wealthy manipulate, use, abuse, and economically enslave 95% of the population,” Duke wrote. “Our Masters, the Wealthy, do as they like to us.”
While most wealthy people acquire their money legally, greedy crooks like Bernie Madoff exist, alas, and should be imprisoned and impoverished. Also, capitalism should be cleansed of the bailouts, subsidies, and special favors that perversely find roofers and waitresses underwriting financiers and speculators.
But these are exceptions, not the rule. Despite today’s destructive anti-rich slogans, the data demonstrate that wealthy Americans are much less like Scrooge and much more like Santa.
Deroy Murdock
Thursday, December 23, 2010
In this season of giving, the words hurled at America’s wealthiest citizens have been far from generous.
The recent debate over the Obama-GOP tax-cut compromise featured language best described as “affluphobic.”
Sen. Bernie Sanders of Vermont, a self-styled socialist, spent nearly nine hours on December 10 excoriating affluent Americans. Sanders complained to colleagues that “when the rich get richer…they say: ‘I am not rich enough. I need to be richer.’ What motivates some of these people is greed and greed and more greed.” Sanders further filibustered: “Greed is, in my view, like a sickness. It’s like an addiction. We know people on heroin. They can’t stop. They need more and more.”
Sanders wailed that the top 1 percent of taxpayers (who made more than $380,354 apiece) earned 20 percent of America’s Adjusted Gross Income (AGI) in 2008, according to IRS data analyzed by the Tax Foundation. True. They also paid 38 percent of all federal income taxes. The top 5 percent (with incomes exceeding $159,619) earned 34.7 percent of AGI and paid 58.7 percent of taxes. The top 10 percent (with incomes above $113,799) earned 45.8 percent of AGI and paid 69.9 percent of federal income taxes.
So, do these rich people pay their “fair share?” If not, should the top 10 percent finance 75 percent of income taxes? Eighty percent?
In contrast, the bottom 50 percent of taxpayers generated 12.8 percent of AGI and paid 2.7 percent of all federal income taxes.
High-income taxpayers also cough up state and local levies and often pay taxes on sales, property, capital gains, dividends, partnerships, and corporate income. Their wealth floods public coffers and flows into government programs, many targeted at low-income Americans.
So what? Generosity is a snap when tax authorities demand tribute. How do the rich behave absent government coercion?
“These people who are worth hundreds of millions of dollars,” Sanders stated on the Senate floor, “Maybe they’ve got to go back to the Bible or whatever they believe in understanding that there is virtue in sharing, in reaching out, that you can’t get it all.”
Sanders should appreciate these IRS data:
To be surgically precise, as Ryan Ellis of Americans for Tax Reform notes, an IRS review of Returns with Itemized Deductions (columns CI and CJ) indicates that in tax year 2008, Americans who earned at least $200,000 filed 3,912,225 tax returns or 9.96 percent of that year’s 39,250,369 total returns. This group deducted $72,336,640,000 in charity, or 41.83 percent of the $172,936,002,000 for such deductions that all filers claimed. In short, the top 10 percent of taxpayers paid 42 percent of all charitable deductions, worth $72 billion in 2008 alone.
To understand wealthy Americans’ “virtue in sharing,” consider The 2010 Bank of America Merrill Lynch Study of High Net Worth Philanthropy. Conducted by Indiana University’s Center on Philanthropy and released November 9, this fascinating document (recommended by the National Taxpayers Union’s Andrew Moylan) finds rich people doing what Senator Sanders asked.
This survey included 801 respondents who made at least $200,000 and/or enjoyed at least $1 million in net worth, excluding housing. The average respondent was worth $10.7 million.
Among these multi-millionaires, 98.2 percent contributed to charity, versus just 64.6 percent of the general population. The wealthy typically gave away about 8 percent of their incomes in 2009. This figure has slipped as the economy has slid. In 2007’s survey, the rich donated between 9.3 percent and 16.1 percent of income.
In 2009, 26.8 percent of Americans volunteered with charitable organizations. However, 78.7 percent of wealthy people volunteered — nearly triple the national figure. The average rich respondent volunteered 307 hours. Rather than merely write checks, the average wealthy American last year gave to charity the equivalent of 38 eight-hour shifts.
The Center on Philanthropy’s researchers valued each hour of voluntarism at $20.85. So, the average rich American’s 307 volunteer hours equaled $6,400.95.
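To make the arithmetic easy to check, here is a minimal Python sketch (mine, not Murdock’s or the IRS’s) that recomputes the deduction shares and the volunteer-hour valuation from the raw figures quoted above; the variable names are illustrative only.

```python
# A minimal sketch that recomputes the figures cited above
# from the raw numbers quoted in the column.

# IRS itemized-deduction data for tax year 2008, as quoted above
returns_200k_plus = 3_912_225         # returns reporting income of at least $200,000
total_itemized_returns = 39_250_369   # all returns with itemized deductions
charity_200k_plus = 72_336_640_000    # charitable deductions claimed by the $200,000+ group
charity_all_filers = 172_936_002_000  # charitable deductions claimed by all itemizers

share_of_returns = returns_200k_plus / total_itemized_returns
share_of_charity = charity_200k_plus / charity_all_filers
print(f"Share of itemized returns: {share_of_returns:.2%}")       # about 9.97% (the column cites 9.96)
print(f"Share of charitable deductions: {share_of_charity:.2%}")  # 41.83%

# Center on Philanthropy volunteering figures, as quoted above
hours = 307    # average volunteer hours per wealthy respondent
rate = 20.85   # dollar value assigned to each volunteer hour
print(f"Value of volunteered time: ${hours * rate:,.2f}")  # $6,400.95
print(f"Equivalent eight-hour shifts: {hours / 8:.0f}")    # about 38
```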
“High net worth households play an important role in the philanthropic landscape,” the Bank of America study concluded. “They give between 65 and 70 percent of all individual giving and between 49 and 53 percent of giving from all sources, which includes giving from corporations, foundations, and both living and deceased individuals.”
Some highly wealthy individuals give enough to rename entire institutions after themselves. New York University’s Medical Center was rechristened the NYU Langone Medical Center after venture capitalist (and plumber’s son) Ken Langone donated $200 million without restrictions in 2008.
That same year, Lincoln Center’s New York State Theater was redubbed the David H. Koch Theater after the businessman and free-market activist contributed $100 million to renovate the New York City Ballet’s home stage.
Nonetheless, some remain utterly unimpressed with America’s wealthy. According to Cape Cod cops and fire investigators, on November 24, an arsonist torched a $500,000 house under construction in Sandwich, Mass. On December 2, an arson attempt almost destroyed a Marston’s Mills home. At both crime scenes, someone graffitied “F— the rich.”
On December 14, Clay Duke opened fire on a Panama City, Fla., school board meeting before fatally shooting himself. His online “last testament” linked to left-wing websites, including WikiLeaks and mediamatters.org, and echoed today’s anti-rich themes.
“I was just born poor in a country where the Wealthy manipulate, use, abuse, and economically enslave 95% of the population,” Duke wrote. “Our Masters, the Wealthy, do as they like to us.”
While most wealthy people acquire their money legally, greedy crooks like Bernie Madoff exist, alas, and should be imprisoned and impoverished. Also, capitalism should be cleansed of the bailouts, subsidies, and special favors that perversely find roofers and waitresses underwriting financiers and speculators.
But these are exceptions, not the rule. Despite today’s destructive anti-rich slogans, the data demonstrate that wealthy Americans are much less like Scrooge and much more like Santa.
The Obamaites’ About-Face
Like Orwell’s farm animals, we have awakened to see the new commandments on the barnyard wall.
Victor Davis Hanson
Wednesday, December 22, 2010
Californians have been experiencing ten days of the wettest, snowiest weather in recent memory. In the usually arid San Joaquin Valley, flooding is ubiquitous. The high Sierra passes are locked in snow well before the first of the year. If the United Kingdom is dealing with the irony of its elites’ recently warning of an end to snow on a now snowy island, out here our version of that embarrassment is water everywhere after Energy Secretary Chu warned us that our farms would blow away and that he could envision an end altogether of California agriculture — logically, he asserted, given that 90 percent of the annual Sierra snowpack would soon disappear.
While the state struggles with flooding and blizzards, Governor Schwarzenegger is advertising himself to the Obama administration as a possible post–Van Jones green czar, to regulate energy for the country as he has done for a now insolvent California. But then, once global warming morphed into climate change, too much rain, snow, and cold could become as symptomatic of too much man-made carbon being released as too little rain, snow, and cold once were. Start that engine, and thou shalt both burn and freeze in hell.
This week Attorney General Holder was warning about the threat of terrorism — but not terrorism in the usual liberal Timothy McVeigh, “even Christians can be terrorists” sort of gobbledy-gook. Rather, Eric Holder, as this Christmas’s new Dick Cheney, is warning about U.S. citizens who are stealthy radical Islamists. Holder fears that they wish to succeed where the would-be Times Square, subway, Portland, and Christmas airliner bombers all failed. He assumes that the terrorists among us for some reason did not read the Al-Arabiya interview, fully appreciate the Cairo speech, see the famous bow to the Saudi king, or hear of administration pressure on Israel. In short, “All religions produce terrorists,” is now followed by “But some religions produce more terrorists than others.”
So, gone for the moment at least are we “cowards” who racially stereotype, oppose the Ground Zero mosque in Neanderthal fashion, and fail to appreciate Holder’s own commitment to shutting down Guantanamo and trying KSM in a New York federal court. Much like his colleague Harold Koh (who, as an Obama State Department justice official rather than a Yale law dean, is no longer suing to put an end to waterboarding at Guantanamo, but is instead opposing those who are suing to stop Predator assassination missions), Holder in a blink of an eye went from trashing the Bush-Cheney anti-terrorism protocols to sighing that it is almost a matter of when, not if, home-grown Islamic radicals will kill lots of us. Holder’s road to Damascus is eerily reminiscent of the sudden conversion in 1938 of British intellectuals, who, as Czechoslovakia was swallowed, abruptly went from 15 years of trumpeting League of Nations pacifism to calling for British military deterrence against fascism. Unlike Holder, however, they at least explained why they had made their about-faces.
To be fair, the Obamaites are simply channeling their commander-in-chief, who spent a near decade, from 2001 to 2009, pontificating on the illegality or superfluousness of the Patriot Act, renditions, tribunals, Predators, Guantanamo, and overseas wars, and then as president embraced or even expanded all of them — with not a word of remorse that his earlier demagoguing might have done great harm both to the efficacy of the programs and to the reputations of those involved in them, as well as to his country’s image abroad. I suppose we are all Orwell’s farm animals now, mystified but quiet as we wake to see the commandments on the barnyard wall crossed out and written over.
On the matter of taxes, two years after borrowing $2 trillion to expand, stimulate, change, and reset the economy, and after talking of limb-lopping doctors, fat-cat bankers, junketeering CEOs on their way to Vegas and the Super Bowl, and how “at a certain point you’ve made enough money,” Obama has now embraced the abhorrent Bush-era tax rates, and he is courting CEOs. The latter are apparently sitting catatonic on the sidelines, with hundreds of billions in cash but unwilling to expand operations for fear that Obama will adopt EU-like socialism. At exactly what point did the caterpillar of massive borrowing and government spending emerge from his chrysalis as the butterfly of not raising taxes on anyone in a recession and of balancing budgets? At 8, 9, or 10 percent unemployment? At 50, 48, or 43 percent approval? Or at the idea of 30, 40, 50, or 63 lost House seats?
Obama reset-button diplomacy rested on two assumptions: (1) all problems abroad either started with or failed to transcend George W. Bush; and (2) multicultural non-judgmentalism must replace neo-conservative promotion of human rights across cultures and nations. Now? I don’t think anyone argues that China, Russia, or Venezuela has become a little softer on dissidents, or that North Korea is a little more quiet, or that Iran is a little less likely to press ahead with its nuclear proliferation, or that Japan, Britain, or Eastern Europe is a little more confident in American leadership, or that Iraq is lost or Afghanistan saved because we put “our eye back on the ball.” As a result, in six months, U.S. foreign policy will probably be indistinguishable from that of the second George W. Bush term — even as we continue to hear sermons about bold new multicultural initiatives delivered in Nobel-laureate rhetoric. When did “Bush did it” become “And we did too!”?
When one loses one’s faith, the aftermath can be startling. As gas hits $3 a gallon at Christmas, with fears of $4 by summer-vacation time, expect suddenly to hear of plans to tap more natural gas, build more nuclear reactors, and lift the suppression of offshore drilling — all beneath a loudly trumpeted but very thin wind and solar veneer. What are we to expect next — a few windmills fastened atop a drilling rig in the Arctic National Wildlife Refuge, some solar panels on the domes of new nuclear-power plants, a supercharger as an upgrade on the Chevy Volt?
An unrepresentative but quite influential intellectual elite — in the media, the universities, the arts, and government — is vested in Barack Obama, in his unpopular doctrinaire agenda, but even more so in the symbolism of his person. The result is paradoxical. For his political survival, Obama now accepts that his faith-based ideas about the environment, radical Islam, taxes, stimulus, the economy, national security, and foreign policy are not supported by any evidence in the real world. Yet he knows as well that the more he must become empirical, the more he must assure his flock of believers outside the farmhouse window that he still walks on four rather than two legs.
The wonder is not that politicians change as politics dictate, but that the most vehement leftism now accepts nonchalantly what it not long ago so ardently demonized. The oddity is not that Obama must back up after driving his country into a brick wall at the end of a dead-end street, but that as he backs up, turns around, and heads in the other direction, he can still be praised as if he had dematerialized and gone ahead right on through the wall.
Internet Access Is Not a Civil Right
Net neutrality is the Obamacare of the Web.
Michelle Malkin
Wednesday, December 22, 2010
When bureaucrats talk about increasing our “access” to X, Y, or Z, what they’re really talking about is increasing their control over our lives. As it is with the government health-care takeover, so it is with the newly approved government plan to “increase” Internet “access.” Call it Webcare.
By a vote of 3–2, the Federal Communications Commission on Tuesday adopted a controversial scheme to ensure “net neutrality” by turning unaccountable Democratic appointees into meddling online traffic cops. The panel will devise convoluted rules governing Internet-service providers, bandwidth use, content, prices, and even disclosure details on Internet speeds. The “neutrality” is brazenly undermined by preferential treatment toward wireless broadband networks. Moreover, the FCC’s scheme is widely opposed by Congress — and has already been rejected once in the courts. Demonized industry critics have warned that the regulations will stifle innovation and result in less access, not more.
Sound familiar? The parallels with health care are striking. The architects of Obamacare promised to provide Americans more access to health insurance — and cast their agenda as a fundamental universal entitlement.
In fact, it was a pretext for creating a gargantuan federal bureaucracy with the power to tax, redistribute, and regulate the private health-insurance market to death — and replace it with a centrally planned government system overseen by politically driven code enforcers dictating everything from annual coverage limits to administrative expenditures to the makeup of the medical workforce. The costly, onerous, and selectively applied law has resulted in less access, not more.
Undaunted, promoters of Obama FCC chairman Julius Genachowski’s “open Internet” plan have couched their online power grab in the rhetoric of civil rights. On Monday, FCC commissioner Michael Copps proclaimed: “Universal access to broadband needs to be seen as a civil right . . . [though] not many people have talked about it that way.” Opposing the government Internet takeover blueprint, in other words, is tantamount to supporting segregation. Cunning propaganda, that.
“Broadband is becoming a basic necessity,” civil-rights activist Benjamin Hooks added. And earlier this month, fellow FCC panelist Mignon Clyburn, daughter of Congressional Black Caucus leader and number three House Democrat James Clyburn of South Carolina, declared that free (read: taxpayer-subsidized) access to the Internet is not only a civil right for every “nappy-headed child” in America, but is essential to their self-esteem. Every minority child, she said, “deserves to be not only connected, but to be proud of who he or she is.”
Calling them “nappy-headed” is a rather questionable way of boosting their pride, but never mind that.
Face it: A high-speed connection is no more an essential civil right than 3G cell-phone service or a Netflix account. Increasing competition and restoring academic excellence in abysmal public schools are far more imperative for minority children than handing them iPads. Once again, Democrats are using children as human shields to provide useful cover for not-so-noble political goals.
The “net neutrality” mob — funded by billionaire George Soros and an array of left-wing think tanks and nonprofits — has openly advertised its radical, speech-squelching agenda in its crusade for “media justice.” Social justice is the redistribution of wealth and economic “rights.” Media justice is the redistribution of free speech and other First Amendment rights.
The meetings of the universal-broadband set are littered with Marxist-tinged rants about “disenfranchisement” and “empowerment.” They’ve targeted conservative opponents on talk radio, cable TV, and the Internet as purveyors of “hate” who need to be managed or censored. Democratic FCC panelists have dutifully echoed their concerns about concentration of corporate media power.
As the Ford Foundation–funded Media Justice Fund, which lobbied for universal broadband, put it: This is a movement “grounded in the belief that social and economic justice will not be realized without the equitable redistribution and control of media and communication technologies.”
For progressives who cloak their ambitions in the mantle of “fairness,” it’s all about control. It’s always about control.
Obama, Pragmatic Socialist
A response to Slate’s David Weigel.
Stanley Kurtz
Tuesday, December 21, 2010
“Is the president still a Marxist if he cuts taxes?” That’s the question posed by Slate’s David Weigel in his critical review of my new book, Radical-in-Chief: Barack Obama and the Untold Story of American Socialism.
I’ll answer with a question of my own: Is Lawrence O’Donnell still a socialist? O’Donnell confessed his socialism last month in the course of arguing that, in the face of electoral pressure, intelligent radicalism puts on a moderate face. This week, he took Alan Grayson to the woodshed for opposing the president’s tax deal. That looks like intelligent radicalism to me, not a retreat from socialism.
Weigel has approached Radical-in-Chief with a skeptical eye, but also with an appreciation for the quality of its reporting. He put the book at the top of his Christmas gift list, and his review is thoughtful and respectful. He agrees that Obama has “palled around with socialists more than [he] has admitted,” and that Obama’s campaign “fact-checking site,” FightTheSmears, was wrong to dismiss various reports of Obama’s radical ties. (“Alinskyite character-assassination site” would be a more accurate term for FightTheSmears.)
Weigel quickly moves past these concessions, as though they have nothing to do with his larger argument. Yet the fact that Obama has consistently disguised, suppressed, and lied about his socialist ties has everything to do with the question of whether the president actually is a socialist. In view of the many revelations in Radical-in-Chief, Obama’s credibility on the matter of his radical past is just about nil. The president is clearly hiding a great deal.
That stealth creates a presumption that the past reveals something of significance about the present. Obama could easily have confessed his youthful radicalism and explained how he’d come to abandon it. As a man with national ambitions, we might have expected him to do so. Instead, Obama has consistently identified himself with a profession, community organizing, he knew to be quietly socialist.
At various points, Weigel seems reluctant to plainly state my argument. The socialist conferences Obama attended were filled with talk of community organizing as the key to socialist strategy. Every socialist faction at those conferences was entranced with the idea that African-American political leaders could emerge out of socialist-run community organizations and move the country to the left. You needn’t be certain of Obama’s behavior at those conferences to see how profoundly they could have shaped him.
Also, Weigel mangles my account of the subprime crisis pretty badly. I don’t call ACORN solely responsible for the debacle, but I do point to its underappreciated role in laying the foundations of the crisis. I focus less on what Cloward and Piven did in welfare offices in the Sixties than on what socialists such as Peter Dreier (an Obama adviser in 2008) did with ACORN during the banking campaign itself.
Most of all, Weigel gets me wrong on Obama’s presidency. I make a full-spectrum case, and it doesn’t depend on seeing a Soviet-style hammer and sickle everywhere. A great deal of Radical-in-Chief is devoted to showing how American socialism has changed since the Sixties. The “democratic socialism” that stands behind community organizing is not a carbon-copy of Soviet-dominated Marxism. On the other hand, Obama’s socialist colleagues and mentors did retain a soft spot for Cuba, Nicaragua, and other Third World Marxist regimes. The book grapples with that complexity in both the past and the present. I use socialism not as a scare word, but rather to illuminate Obama-administration policy across the board. Weigel doesn’t begin to deal with my actual analysis of the Obama administration.
Do I see socialism as “the only way” to explain Obama’s determination to press the health-care issue during his first year? Not at all. I do think it’s the best way to explain it, however. Obama overrode the advice of his key political advisers on that issue, so something more than liberal-Democratic business-as-usual was likely at work.
Yet my argument rests not on any one decision, but on the total picture: from Obama’s suppression of his socialist past, to his lifetime spent in a socialist political world, to the connections between Obama’s socialist past and his administration’s present. Once you understand that Obama embraced the stealthy, pragmatic, and gradualist socialism of his mentors, my explanation makes a lot more sense than Weigel’s glib claim that Obama is just a “liberal political hack” who fooled Chicago’s socialists into thinking he was one of them. That story can’t stand up to the facts of Obama’s life.
As I show in Radical-in-Chief, the young Obama was no hack, but a committed, hard-core Marxist. He didn’t fool his organizing mentors into believing that he shared their views; rather, those mentors taught him how to do socialism the pragmatic way, the sellable way, the Lawrence O’Donnell way.
At the time, these organizers were slammed by their less secretive socialist pals for their gradualism and stealth. So if a chunk of America’s Left insists on slamming Obama the same way — as Alan Grayson naïvely does — neither O’Donnell nor Obama is a whit less socialist for it.
Understanding the Left's Intolerance for Intolerance
By David Limbaugh
Tuesday, December 21, 2010
It's that time of year when we are reminded just how indebted we are to the left's mega-tolerant cultural warriors. Annually, they jolt us out of our complacency to notice how imposing, intolerant and dangerous Christmas and Christianity are.
If it weren't for these valiant soldiers, this disturbing proliferation of Christmas celebrations and other Christian symbols would proceed unabated.
Each year, the examples are too voluminous to document exhaustively, but permit me to share a few highlights, which will enhance your appreciation for the sheer magnitude of the effort being undertaken by these selfless watchdogs committed to liberating our culture(s) from the oppressive chains of Christmas and Christianity. The noble work of these secular saints is global in scope because the threat they confront recognizes no geographic or national boundaries.
Researchers at Simon Fraser University in Canada have discovered that "non-Christians feel less self-assured and have fewer positive feelings if a Christmas tree (is) in the room." (Meanwhile, American sociologists have concluded that tea partyers and members of MoveOn.org experience discomfort in each other's presence.) Finally, academics are making judicious use of their precious time and resources to address compelling societal issues.
One of Simon Fraser University's researchers, Michael Schmitt, was motivated to look into the matter concerning "controversy over whether Christmas should be celebrated in public in case it offends non-Christians." Don't let it escape your notice that this example reveals that these researchers were not limiting their scrutiny to actions by any government. As dedicated soldiers, they obviously understand that their reach has to transcend constitutional questions and permeate private relationships around the globe. Bless their wisdom.
Schmitt concluded that the presence of Christmas trees -- or, by extension, other Christmas or Christian symbols -- "might feel threatening to people." They militate against "a more multicultural or inclusive society." Duh. Of course Christmas trees are threatening and noninclusive. We all know how unloving and intolerant the Gospel accounts of Christ's sacrificial death on the cross are, let alone the other New Testament writings detailing Christ's offer of unmerited redemption.
Back in the bigoted land of the United States, NPR's zealously patriotic and scrupulously tolerant liberal reporter Nina Totenberg committed an uncharacteristic yet unforgivable verbal infraction by incidentally alluding to "Christmas" in the context of her commentary. She said: "Well, these agencies, including the Defense Department, don't know how much money they've got and for what. And I was at -- forgive the expression -- a Christmas party at the Department of Justice, and people were actually really worried about this."
Excuse me? How can she expect us to forgive her for that, notwithstanding her lifetime record of devotion to worthy liberal causes? We simply must enforce a no-tolerance policy for such insensitivity, lest we risk making a small minority uncomfortable by mentioning an objectively offensive holiday instead of properly alienating the callous majority by dissing a celebration about their Savior.
Also on this side of the border, federal bank regulators ordered an Oklahoma bank to remove a daily Bible verse from its website, crosses from its teller counter, and buttons reading "Merry Christmas, God With Us" from public display. The gallant bureaucrats rightly understood these symbols "could be offensive." They warned that the bank's failure to comply would result in referral to the Justice Department for enforcement action. (At the time of the story, it could not be determined whether the referral could be made in time for Eric Holder's deputies to review it at Justice's own Christmas party -- the one insensitively referred to by Totenberg.)
Under what authority did the shrewd regulators purport to act? Not to worry; there's always a federal regulation to fall back on when the intolerant rubes among us need policing.
Indeed, such was the case here. "Specifically, the feds believed, these symbols violated the discouragement clause of Regulation B of the bank regulations," KOCO reported. "According to the clause, '...the use of words, symbols, models and other forms of communication ... express, imply or suggest a discriminatory preference or policy of exclusion.'"
Yeah, that makes sense, and I'm a little embarrassed I needed to read the reg myself to be reminded. Of course the presence of Christmas symbols and the loving images they convey shout exclusion and discrimination. I'm a bit red-faced that we have to rely on government saints to perform such good works that ought to be ushering forth from private citizens and businesses as a matter of self-help.
That said, I applaud the regulators for tending to this urgent business when they could have easily allowed themselves to be distracted by less pressing problems.
Perhaps the next time you hear some conservatives complaining about these mythical assaults on Christmas and Christianity, you'll understand the back story and give thanks.
The Age of Uncertainty
Our entrepreneurs have lost confidence in the federal government.
Michael G. Franc
Tuesday, December 21, 2010
Entrepreneurs fret daily over economic uncertainty. Case in point: Even with passage of the lame-duck tax deal, they still don’t know what their tax burden will be two years from now.
Approval of that deal lifted what the Wall Street Journal dubs the “world of the temporary tax code” to unprecedented heights. The Journal explains:
The U.S. will have no permanent regime governing levies on salaries, capital gains and dividends, the Social Security tax, as well as a slew of targeted breaks for families, students and other groups. This on top of dozens of corporate-tax provisions that already were subject to annual renewal.

All this uncertainty “complicates planning and discourages hiring and investment” because “businesses tend to be more reluctant to invest when they perceive high levels of uncertainty about various things, including taxes.”
One bitter fruit of all this uncertainty can be gleaned from a recent Federal Reserve study. The Fed calculated that skittish companies are now sitting on nearly $2 trillion of cash reserves rather than deploying those resources to expand payrolls, build new plants, or purchase new equipment. This is not only $130 billion higher than it was at the end of June but, as a percentage of total assets, the highest cash-reserve level in over half a century.
Two recent court decisions exemplify the full extent of the uncertainty created by the current administration. On December 10, the U.S. Court of Appeals for the District of Columbia Circuit rejected a plea from the nation’s manufacturers to scuttle a regulatory initiative by the Environmental Protection Agency that would subject them to new regulatory burdens in a quixotic effort to reduce carbon-dioxide emissions. “The EPA’s agenda,” the National Association of Manufacturers said in a statement, “places unnecessary burdens on manufacturers, drives up energy costs and imposes even more uncertainty on the nation’s job creators.”
The second court decision concerned the new health-care law. A federal judge in Virginia ruled that a crucial provision in Obamacare — the mandate that individuals purchase governmentally approved health insurance — violates the Constitution. “An individual’s personal decision to [purchase or decline to purchase] health insurance from a private provider,” District Court Judge Henry Hudson wrote, “is beyond the historical reach of the U.S. Constitution.” The fate of the policy now depends on the Supreme Court.
And regardless of the ultimate fate of the mandate, the statutory language of Obamacare bestows unprecedented discretionary power upon the federal bureaucrats charged with its implementation. John Hoff, a former assistant secretary at the Department of Health and Human Services, explained:
While it is detailed in some instances, [the new health law] is largely aspirational; it directs the Administration to achieve various universally desired goals — better quality of health care, improved access to care, and increased efficiency of delivery. It constructs the scaffolding of federal control and gives the Administration very broad authority to achieve these aspirations. Each of the many actions taken to implement it will determine the shape of that control. Implementation will be technically difficult and politically charged.

This high degree of bureaucratic discretion, it is important to point out, affects the entire health-care sector, which now constitutes fully one-sixth of the economy.
At this stage, business executives or ordinary citizens trying to comprehend the implications of the new law might as well flip a coin or hire a fortune teller. How will the bureaucrats interpret this “aspirational” language? Will Judge Hudson’s decision be upheld on appeal? And what consequences will flow from all these unknowns? Will insurance rates skyrocket, encouraging consumers to forgo coverage until they need it, which in turn will cause insurers to increase premiums further in a never-ending insurance death spiral? Should employers maintain their current health plans under the law’s “grandfather” clause, or just dump their employees into the new health exchanges where the cost of coverage might be prohibitive? And if the cost of coverage skyrockets, what about all those rosy projections of manageable subsidy costs from the Congressional Budget Office? Those dollar figures might jump by a few hundred billion — or more. Where will that money come from?
Then there’s the president’s on-again, off-again offshore-drilling policy. And what about employers who may face stacked union elections if the Department of Labor opts to circumvent Congress and implement card check (the number one item on Big Labor’s wish list) administratively?
This layered uncertainty hangs like a sword of Damocles over every business, every investor, and every head of household in America. It has suffocated the risk-taking, entrepreneurial spirit that has made America the exceptional nation in human history. For entrepreneurship hinges on intelligent risk-taking, not on closing your eyes and plunging headfirst off the foggy cliff of government intervention and manipulation.
Our current and ongoing economic malaise arises from something we have not seen in America since the days of FDR’s failed New Deal. Our entrepreneurs — society’s economic risk-takers — have lost confidence — $2 trillion worth of confidence — in the federal government’s willingness to let them operate in a way that makes economic sense.
This is why the recent tax deal ultimately does nothing to improve our long-term economic outlook. True, the compromise was better than one potential option: a catastrophic increase in the tax burden that would have destroyed jobs, businesses, and lives. But the goals of this legislative exercise should have been more ambitious: One, create breathing room for entrepreneurs, families, and investors in the form of a reasonable tax and regulatory burden; and two, guarantee that Congress will remain faithful to these policies for the long haul. This would give our most productive citizens the confidence that if they make an investment that requires a long time horizon, they can count on a stable policy environment.
This would mean, among other things, a permanent extension of the Bush-era tax rates for everyone, putting an end to the most egregious regulatory initiatives now underway, consigning Obamacare to the dustbin of history, and allowing energy companies to identify, recover, and generate as much domestic energy as possible.
Our wealth creators will reengage with our free-enterprise system only when these good policies are in place and stable. And we will know we have succeeded only when investors and businesses move that sidelined $2 trillion into new plants, equipment, and jobs.
Labels: Capitalism, Economy, Hypocrisy, Ignorance, Liberals, Obama, Policy, Recommended Reading
Obama’s Mitterrand Moment
The president could learn from Francois Mitterrand's U-turn.
Rich Lowry
Tuesday, December 21, 2010
In 1981, Francois Mitterrand swept to power in France in a watershed election. He united the Left and fired the imagination of the country’s youth, who danced in the streets on election night in a frenzy of revolutionary anticipation.
Mitterrand embarked on a stimulus program that would have satisfied Paul Krugman. He increased the wages of government workers and hired more of them. He boosted the minimum wage and reduced working hours. He tripled the budget deficit. In a year, he nationalized no fewer than 36 banks, along with the country’s largest industrial corporations.
The late historian Tony Judt wrote in his book Postwar that the nationalizations were meant “to symbolize the anti-capitalist intent of the new regime; to confirm that the elections of 1981 had really changed something more than just the personnel of government.” This was “change we could believe in,” taken to Gallic extremes.
Then, the unraveling. With inflation and unemployment at double digits, with the business community terrified, and with currency and people fleeing the country, Mitterrand’s “revolution” foundered on the shoals of economic and social reality. As a matter of sheer survival, he announced a “U-turn” and embraced a program of austerity, or la rigueur, reversing course on nearly everything.
Pres. Barack Obama’s “U-turn” is upon us. It is much more muted. He wasn’t as explicitly left-wing in his campaign or in his initial burst of activism as Mitterrand, and he’ll never go as far in his reversal as the flamboyantly cynical Frenchman. There’s nonetheless a whiff of Mitterrand in the air when Obama marks the extension of all the Bush tax cuts at a White House signing ceremony with Senate Minority Leader Mitch McConnell present, but not House Speaker Nancy Pelosi.
Like Mitterrand’s supporters, Obama’s boosters overinterpreted his election as the dawn of a new age, and his youthful fans invested him with unrealizable millennial expectations. His economic program hasn’t collapsed, but it has badly underperformed and opened up an unsettling vista on a future debt crisis. Even Obama acknowledges his facile assurances of “shovel-ready” stimulus projects were misbegotten. In a remarkable turnabout, his economic team sold the extension of the Bush tax rates as protection against a double-dip recession.
It’s not economic fundamentals that are breaking Obama’s leftward momentum so much as political ones. A center-right country can only take so much hope-and-change. Prior to the arrival of any tea partiers, Harry Reid’s Senate couldn’t pass a $1.1 trillion business-as-usual spending bill, and Nancy Pelosi’s House ratified the Bush tax cuts in a bipartisan vote. Obama is adjusting to this new political reality rather than raging against it.
Does that prove he’s a pragmatist, not an ideologue? Obama is obviously both. He pushes the country as far left as circumstances will allow. He got the left-most plausible health-care bill through Congress, dropping the public option only when it didn’t have the votes. Then, he got the left-most tax bargain he could wring from Republicans, which wasn’t very left, given the new correlation of political forces.
Obama’s task is to position himself as the reasonable advocate for a more responsible and austere version of the status quo and paint Republicans, who want to overturn Obamacare and reconceive entitlements, as the champions of risky, unsettling structural change. Obama is in a long game. In a prescient Fortune article calling on Obama to make a Mitterrand U-turn after Scott Brown’s victory last January, Shawn Tully noted that after the French president pulled back, “government spending as a share of GDP fell from 52 percent in the mid-1980s to 48 percent by 1990.” Now, spending in France is back to 54 percent of GDP.
If he’s to succeed on his own terms as a pragmatic ideologue, Obama will be as wily and flexible as it takes to get reelected, then protect as much of his state aggrandizement as is feasible. Francois Mitterrand would understand, even if Obama’s disappointed acolytes don’t.
Monday, December 20, 2010
U.S. Military Primacy: Worth Sacrificing For
The cost of defense cuts may be measured in lives.
Jack David
Monday, December 20, 2010
Advocates of cuts in U.S. military programs, including President Obama’s Deficit Reduction Commission, argue that the defense budget must be reduced along with all other U.S. programs because of the dangerously increasing government debt and the current bleak economic picture. The Commission recommends eliminating $100 billion from the defense budget, with more than 25 percent of those cuts falling on acquisitions and research.
But such demands duck a fundamental question: whether the consequences of the U.S.’s no longer being the world’s preeminent military force are acceptable. If the answer is “yes,” we will have one kind of future as a nation among nations. If the answer is “no,” we will have another. In my view, programs designed to maintain military primacy should be exempt from cuts even when other items in the budget are not.
What do I mean by U.S. “military primacy”? A good working definition is “a situation in which U.S. capabilities are so superior that they discourage or deter adversaries from taking action they might otherwise take to the detriment of U.S. interests.” Ideally, the deterrent effect would be so strong that the U.S. would never have to actually deploy its military might. Primacy also means that, if an adversary took the risk and acted anyway, the U.S. military could defeat it. The clearer it is at the outset that U.S. military capability is more than a match for the adversary’s force, the greater the likelihood of discouraging or deterring the adversary’s action.
How can we see into a future without U.S. military primacy? One place that points the way is the past. The absence of U.S. military primacy played a large role in North Korea’s invasion of South Korea on June 25, 1950. In 1945, the U.S. had 40,000 soldiers in South Korea. By 1950, there were a mere 472 there. Consistent with the drawdown, Secretary of State Dean Acheson did not include Korea in a January 1950 speech in which he enumerated countries that the U.S. would defend.
Kim Il Sung concluded that the U.S. would not interfere with his plan to unify the peninsula by force. He persuaded Stalin and Mao of that view, secured their promises of support, and invaded the South. Hundreds of thousands died in the conflict. The U.S. suffered 33,746 combat deaths and 128,650 total dead and wounded. In economic terms, the war cost $67 billion in 1953 dollars, equal to $535 billion in 2008 dollars. Once the Korean War started, the U.S. defense budget was quadrupled.
In evaluating whether U.S. military programs can be eliminated without imperiling military primacy, it is necessary first to consider what potential adversaries are saying and doing and how their actions will affect the U.S. Two countries of enormous importance in this regard are Russia and China.
While our relations with Russia today are not as hostile as they were with the Soviet Union (thankfully), Russia’s reassertion of rights in territories the Soviet Union once occupied is worrisome. Russian air-force fighters already are comparable to the U.S. mainstay, the F-15. Russia is developing fighter aircraft comparable to our now-incomparable F-22 (production of which has been terminated to save money), and it is continuing to develop nuclear-weapon and other military capabilities explicitly intended to be superior to ours and to defeat us in any conflict.
China long has made territorial claims on the regions surrounding it. Some of these are in areas in the western Pacific claimed by other countries. Others are in what the U.S. regards as international waters. It is no secret that China is aggressively building a blue-water navy, has F-15-comparable fighters in its own air force, and already is testing an F-22-comparable aircraft that will be deployed within a few years. Moreover, the ships, aircraft, missiles, and space and cyber capabilities China is developing, like those of the Russians, are explicitly being designed to defeat U.S. air, naval, and space military capabilities.
These facts are significant. They demonstrate elements of U.S. primacy from the perspective of Russia and China, showing what U.S. military resources they regard as impeding their plans. They also show that Russia and China believe there is a significant possibility that they will want to use military force to achieve an objective contrary to U.S. interests.
As was the case in Korea in 1950, U.S. military weakness in the late 1930s eased the way for Nazi aggression and invited Japan’s attack on Pearl Harbor. At that time, the U.S. military was not remotely prepared for the war. Had the U.S. not been as isolationist and had it spent what was necessary in the 1920s and 1930s to assure itself of military primacy, perhaps Japan and Germany would not have started what became World War II, a war in which 70 million people, including 405,399 Americans, died, and which cost us $337 billion in early-1940s dollars.
There is no way to predict with confidence whether Russia or China will use the military power it is developing to resolve differences with other countries, although there is ample evidence that each may. But we can be sure that, if a U.S. interest is involved — Japan or Taiwan, for example — both would consider the U.S.’s military capability before initiating a major military operation. If that happens, what would we do? Would we capitulate to our adversary’s demands, whatever they may be? Would we deploy our military forces in the hope of prevailing? If our military forces prevail, how would we feel about the human and economic costs we suffered in the conflict? Perhaps this scenario was what Defense Secretary Robert Gates had in mind last month when he told the co-chairs of the Deficit Reduction Commission that a 10 percent cut in the defense budget would be a “catastrophe.”
“If you want peace, prepare for war.” That advice, which dates back to the time of the Roman Empire, applies today. The U.S. has preserved its political and economic freedom, and the political and economic freedom of its friends, by maintaining military primacy since the 1950s. We must continue to do so.
Labels: America's Role, Ignorance, National Defense, Policy, Recommended Reading