Tuesday, July 31, 2012

Learning to Be an American

By Charles C. W. Cooke
Tuesday, July 31, 2012

Recently, John Sununu apologized for saying that he wished that “this president would learn how to be an American.” Whether he should have walked back his statement is up for debate. But, that particular incident aside, the notion that there is such a thing as “an American” and that one can be good or bad at being one is not self-evidently a ridiculous idea, as some have made it out to be.

I am not an American but a British subject living in America. I could, however, become an American. If I did, what would that mean? To some, perhaps, it would merely mean that I had conformed to the laws dictating how long I had to be in the country before I could be naturalized, and then that I had asked the United States Citizenship and Immigration Services to do its thing and issue me an official piece of paper. Certainly, this is how it works in most countries: There might be some basic rules that applicants must follow, but, beyond the strict legal meaning of the transition, there is little else. Were I, for example, to move to India, I am sure that I could become a citizen of that country if I so wished. But I would not become an Indian. This is not so in America, and to observe the distinction is relatively uncontroversial. “Being an American,” it seems reasonable to suggest, is much more than getting hold of the right paperwork and being physically present or — in the case of most Americans — being born into it.

So what is it? Well, it’s certainly nothing to do with race. The American doctrine that “all men are created equal,” as laid out so elegantly in the Declaration of Independence, quickly puts paid to that. It is this that made the evils of slavery, segregation, and other forms of racism so acutely intolerable in the United States, for it is one thing to be racist in a country defined only by its borders but quite another to be so in a country defined by its principles. “All men are created equal” is a fact of nature, but it is also a proposition that many still reject; and the degree to which one subscribes to it is closely related to how good one is at being an American. There are terrible Americans who were born in the United States and great Americans who were born abroad. Paul Johnson, who wrote a wonderful History of the American People, was born in England, but he understands the country perfectly; Howard Zinn, who was born in Brooklyn, does not.

There are a host of similar American propositions, and most of them are fully testable. This is why America has a citizenship test. Would it not be “un-American,” for example, to oppose free speech? One has to understand the axiom and vow to uphold it in order to be naturalized not simply because it is the law of the land, but because it is a foundational principle without which the American idea ultimately cannot operate. This and the other core principles are neatly outlined in the national guidebooks, which include the Declaration of Independence, the Constitution, the Federalist Papers, the Gettysburg Address, and so forth. Such works have made the world intimately familiar with the propositions of the American project and have acted as a magnet to immigrants from all over the globe. In contradistinction, ask somebody what Belgium is for and they will be hard-pressed to answer you — there is no such thing as the Belgian “promise” or the Belgian “dream,” and those who spoke of such things would be looked at with reasonable suspicion.

So prominent are ideas in America that they are put on the money: “In God We Trust,” “E Pluribus Unum,” and “Liberty” are — literally — forged into the currency of the nation. In Britain, by way of contrast — a nation that helped write America’s values and then largely abandoned them at home — the money features a picture of the Queen, some functional words, and a few decorations. This difference is important.

In an episode of Da Ali G Show, the fictional character Borat interviews an American and asks her why America is the best country in the world:

Borat: “Which country is the number one in the world?”
American: “I think, right now the US.”
Borat: “Don’t you think maybe Kazakhstan is the number one?”
American: “No.”
Borat: “But we have a man with the biggest amount of fingers. He has eight fingers. Do you have it?”
American: “Does he have the right to vote? The freedom to speak?”
Borat: “Weeell . . . Not so much. But we have the biggest goat in the world. Oh no. Hungary has number one. But US has number five. Are we number one country now?”

True to Sacha Baron Cohen’s style, this is heavy-handed. But it strikes at something important. Borat proudly lists many of the commendable (albeit fictional) virtues of his country — “Kazakhstan number one exporter of potassium!” — and the American calmly reminds him that American ideas are why the country transcends all others. Is it too radical to propose that national greatness thus relies upon people following these very ideas?

Reflexive, frivolous, and opportunistic charges of “racism” aside, the reason that Sununu stirred such controversy with his comment about Obama’s learning to be American was that it dealt with something not explicitly articulated in any of the founding documents. As I understand it, the outcry against Sununu derives, at least in part, from the fact that he was criticizing Obama for not being a very good capitalist — and that, per Oliver Wendell Holmes, “capitalism” and “America” are not interchangeable. I’m not at all convinced of that. Capitalism is the only economic system compatible with the form of government laid out in the Constitution. And, even if capitalism is not enumerated in that document, the role of government is. You really cannot have American constitutional government with a different economic system. Progressives ultimately know this, which is why they disdain the charter and seek fundamentally to transform it.

So uncontroversial is this notion that the citizenship test explicitly asks which system of economics the United States enjoys: The correct answer is “capitalism.” I would argue that, if it is reasonable to potentially deny people citizenship based on their failure to understand this tenet of the republic, then it is also reasonable to judge someone’s capacity to be a good or a bad American by the same token.

Abraham Lincoln started his Gettysburg Address with these words:

Four score and seven years ago our fathers brought forth, on this continent, a new nation, conceived in Liberty, and dedicated to the proposition that all men are created equal.

Now we are engaged in a great civil war, testing whether that nation, or any nation so conceived, and so dedicated, can long endure.

Lincoln was fighting both to keep the union intact and to rid the nation of slavery. But he also understood acutely that, if America disappeared, so did its underlying ideas, which is why he finished his short oration with the earnest hope that “government of the people, by the people, for the people, shall not perish from the earth.” He was correct to so worry. If the Union had lost the war, then others could well have interpreted the Civil War as living proof that a republic built on presumptions of liberty simply could not persist. As such, America’s survival was important not only to Americans but to all free people.

F. Scott Fitzgerald put it this way in “The Swimmers”:

France was a land, England was a people, but America, having about it still that quality of the idea, was harder to utter — it was the graves at Shiloh and the tired, drawn, nervous faces of its great men, and the country boys dying in the Argonne for a phrase that was empty before their bodies withered. It was a willingness of the heart.

Ideas require willing, and some hearts are more willing than others.

Romney’s Truth Telling

By Rich Lowry
Tuesday, July 31, 2012

Sometimes the world seems a little smaller.

Mitt Romney’s trip to Israel was such a moment. Demonstrating rhetorical instincts that should get him booked routinely on MSNBC, Palestinian official Saeb Erekat denounced the Republican candidate as a racist. It was a heartening display of the commonality of Romney critics. Whatever their varied backgrounds or interests, they all speak in one voice when it comes to attributing racial hatred to the former Massachusetts governor.

In Jerusalem, Romney’s offense was noting at a fundraiser the starkly different economic performances of Israel and the Palestinian Authority. Israel has a per capita GDP of $31,000, the West Bank and Gaza just $1,500. “As I come here and I look out over this city and consider the accomplishments of the people of this nation,” Romney said, “I recognize the power of at least culture and a few other things.”

At that, Erekat pounced. “It is a racist statement,” he huffed, “and this man doesn’t realize that the Palestinian economy cannot reach its potential because there is an Israeli occupation.” Judging by his performance, Erekat is almost as good at calling Republicans racist as people in the United States who do it for a living. His understanding of the fundamentals of economic growth and the Palestinian predicament isn’t as impressive, though. Otherwise he wouldn’t get the vapors at the mention of the word “culture,” or the suggestion that contemporary Palestinian culture is lacking.

Erekat evidently hasn’t read much Tocqueville — and that’s a cultural deficiency right there. “I am convinced,” Tocqueville wrote, “that the luckiest of geographic circumstances and the best of laws cannot maintain a constitution in despite of mores, whereas the latter can turn even the most unfavorable circumstances and the worst laws to advantage.”

Tocqueville’s classic Democracy in America is in part a study of how the cultural inheritance of this country shaped it. Even before they came here, even before they were Americans, emigrants from England were used to an active civic life, trial by jury, freedom of speech and the press, individual rights and their assertion. “They carried these free institutions and virile mores with them to America,” Tocqueville writes, “and these characteristics sustained them against the encroachments of the state.” They made themselves, as a result, wealthy and free.

This has nothing to do with race. The author Lawrence Harrison wrote a book called The Central Liberal Truth on the interplay of culture and development. In it, he notes the difference between the basket case of Haiti and relatively well-governed, well-off Barbados. Both are populated by the descendants of slaves from West Africa. Both were European sugar colonies. The difference is that Haiti won its independence from France early in the 19th century, while Barbados steadily absorbed British values and institutions until it eventually gained its independence in 1966.

Israel is part of the culture of the West, as can be seen in its commitment to democracy, the rule of law, and individual achievement. Romney adviser Dan Senor and Saul Singer co-wrote a book on Israeli entrepreneurship called Start-up Nation. “It is a story,” they write, “not just of talent but of tenacity, of insatiable questioning of authority, of determined informality, combined with a unique attitude toward failure, teamwork, mission, risk, and cross-disciplinary creativity.” In short, it is a story of a culture of entrepreneurship.

Yes, the Palestinians are hobbled by Israeli roadblocks and the like. But they are crippled by the fact that they live in an illiberal society obsessed with perpetuating the conflict with Israel over almost all else. Lawrence Harrison cites many examples of countries that have undergone cultural change — from South Korea to Ireland — under farsighted leadership and the pressure of events. Change is particularly difficult, though, when the need for it is “brought home by the strengths of other cultures that have achieved higher levels of progress.”

In Jerusalem, Romney only said what was obvious. Meaningful change won’t come for the Palestinians until they admit it.

Monday, July 30, 2012

Britain’s NHS: No Fun and Games

By John Fund
Sunday, July 29, 2012

The International Olympic Committee decided not to include in the opening ceremony a moment of silence to honor the eleven Israeli athletes killed by Palestinian gunmen during the 1972 games in Munich. That move drew the ire of NBC’s Bob Costas. During Friday’s ceremony, he commented that, although a private moment of silence was held before a mere 100 people this week at the Athletes’ Village, “for many, tonight, with the world watching, is the true time and place to remember those who were lost and how and why they died.”

Instead, the Olympic ceremony featured a weird, politically correct extravaganza by film director Danny Boyle (Slumdog Millionaire). It was hailed by the sports website The Roar with the headline “London 2012: Most political Olympics opening ceremony since Berlin 1936.” The 1936 games, of course, were an infamous propaganda exercise for Adolf Hitler.

For The Roar’s Spiro Zanos, “the political message at London was that Britain could recover its greatness and become Great Britain once again if . . . [it] re-embraced the radical politics that unleashed the industrial revolution and the welfare state. . . . If this means having the most political opening ceremony since the Berlin Olympics in 1936, then so be it.” The state-worship so ably skewered by Jonah Goldberg in Liberal Fascism is alive and well.

The Boyle ceremony got underway with images of a bucolic Britain being swept away by a cigar-chomping elite that built satanic mills filled with oppressed workers as steeplejacks hung from the towering chimneys. Later, 600 doctors and patients recruited from National Health Service hospitals were featured in a bizarre tribute to socialized medicine, with children bouncing up and down on 320 hospital beds arrayed in front of a giant Franken-baby wrapped in bandages. Villains from British children’s literature, ranging from Cruella de Vil to Lord Voldemort, swept in on the children, in an apparent reference to conservative forces seeking to reform the tottering NHS. The 15-minute sequence ended with a series of red lights triumphantly spelling out “NHS.”

Left-wingers were thrilled. “Brilliant that we got a socialist to do the opening ceremony,” tweeted Alastair Campbell, former communications chief for the Labour party. Boyle denied he was promoting a political agenda. “The sensibility of the show is very personal,” he told reporters. “We had no agenda other than . . . values that we feel are true.” At a news conference beforehand, he explained that one of the reasons he “put the NHS in the show is that everyone is aware of how important NHS is to everybody in the country. One of the core values of our society is that it doesn’t matter who you are, you will get treated the same in terms of health care.”

Can anyone seriously believe that? Sunday’s British papers report that a study by Lloyds TSB Premier Banking found that nearly two-thirds of Britons earning more than $78,700 a year have taken out private health insurance because they don’t trust the NHS. A survey by the British health-care organization Bupa found that two-thirds of its customers cited the risk of infection from superbugs as a top reason for buying private insurance. Shaun Matisonn, the chief executive of PruHealth, says that “patients today are sophisticated consumers of health care. They research the treatments they want, but cannot always get them through the NHS.”

Horror stories about the NHS abound. A 2007 survey of almost 1,000 physicians by Doctors’ Magazine found that two-thirds said they had been told by their local NHS trust not to prescribe certain drugs, and one in five doctors knew patients who had suffered as a result of treatment rationing. The study cited one physician who characterized the NHS as “a lottery.” A new study this year by GP magazine supports that conclusion. Through Freedom of Information Act records, it found that 90 percent of NHS trusts were rationing care.

Rick Dewsbury of the Daily Mail was aghast at the worship of the NHS during Friday’s Olympic ceremony. The columnist noted the sheer hypocrisy of the spectacle, as “the majority of the athletes taking part in the Games will have access to the most expensive cutting-edge private treatment available in the world for even the slightest graze on their bodies.”

Dewsbury recounted the 2009 case of Kane Gorny, a 22-year-old NHS patient. Gorny was admitted to the hospital for a hip replacement. A series of hospital employees refused his request for a glass of water and failed to give him diabetes medication. He went so far as to call the emergency operator for help. When the police arrived, nurses assured them that Gorny was confused and needed no outside help. A day later, he was dead of dehydration. The official inquest into his death was published this month. It found that neglect by hospital staff — “a cascade of individual failures” — contributed to his death. Here’s hoping that not everyone is “treated the same” in Britain’s NHS hospitals.

In Britain, we have seen what could be our future, and it’s not a pretty sight.

Busted: Mr. Pfeiffer and the White House Blog

By Charles Krauthammer
Sunday, July 29, 2012

Shortly after 9/11, President George W. Bush received from Prime Minister Tony Blair a bust of Winston Churchill as an expression of British-American solidarity. Bush gave it pride of place in the Oval Office.

In my Friday column about Mitt Romney’s trip abroad and U.S. foreign policy, I wrote that Barack Obama “started his presidency by returning to the British Embassy the bust of Winston Churchill that had graced the Oval Office.”

Within hours, White House Communications Director Dan Pfeiffer had created something of a bonfire. Citing my statement, he posted a furious blog post on the White House website, saying: “Normally we wouldn’t address a rumor that’s so patently false, but just this morning the Washington Post’s Charles Krauthammer repeated this ridiculous claim in his column. . . . This is 100% false. The bust [is] still in the White House. In the Residence. Outside the Treaty Room.”

Except that it isn’t. As the British Embassy said in a statement issued just a few hours later, “The bust now resides in the British ambassador’s residence in Washington, D.C.”

As the British Embassy explained in 2009: “[The bust] was lent for the first term of office of President Bush. When the president was elected for his second and final term, the loan was extended until January 2009. The new president has decided not to continue this loan and the bust has now been returned.”

QED.

At which point, one would expect Pfeiffer to say: Sorry, I made a mistake. End of story.

But Pfeiffer had an additional problem. In his original post, he had provided photographic proof of his claim that the Oval Office Churchill had never been returned, indeed had never left the White House at all, but had simply been moved from the Oval Office to the residence.

“Here’s a picture of the president showing off the Churchill bust to Prime Minister Cameron when he visited the White House residence in 2010,” he wrote. “Hopefully this clears things up a bit and prevents folks from making this ridiculous claim again.”

Except that the photo does nothing of the sort. The Churchill sculpture shown in the photograph is a different copy — given to President Lyndon Johnson, kept in the White House collection for half a century and displayed in the White House residence. The Oval Office Churchill — the one in question, the one Pfeiffer says never left the White House — did leave the White House, was returned to the British government, and sits proudly at this very moment in the British ambassador’s residence.

Was that little photographic switcheroo an honest mistake on Pfeiffer’s part? Or was it deliberate deception? I have no idea. But in either case, the effect was to deceive Pfeiffer’s readers into believing that my assertion about the removal of the Oval Office Churchill was “patently false . . . ridiculous . . . 100% false.”

The decent thing to do, therefore, would be to acknowledge the (inadvertent?) deception and apologize for it. He could send the retraction to the New York Times editorial-page editor Andrew Rosenthal, who at first repeated Pfeiffer’s denunciation of the Churchill bust “falsehood” and then later honorably corrected himself, admitting, “I got some facts wrong, because I made the mistake of relying on a White House blog post by the communications director Dan Pfeiffer.” Rosenthal then chided Pfeiffer because, after the facts became clear, he posted “a weaselly follow-up comment” that “fails to acknowledge that his post . . . was false.”

In my view, this whole affair was entirely unnecessary. Pfeiffer devoted an entire post (with accompanying photograph) on the White House blog to a single sentence in a larger argument about foreign policy, and he blew it up into an indignant defense of truth itself and a handy club with which to attack the credibility of a persistent critic of his boss. (After all, why now? Why this column? Since the return of the Oval Office Churchill in 2009, that fact had been asserted in at least half a dozen major news outlets, including Newsweek, CBS News, ABC News, the Telegraph, and the Washington Post.)

So I suggest Mr. Pfeiffer bring this to a short, painless, and honorable conclusion: a simple admission that he got it wrong and that my assertion was correct. An apology would be nice, but given this White House’s arm’s-length relationship with truth — and given Ryan Zimmerman’s hot hitting — I reckon the Nationals will win the World Series before I receive Pfeiffer’s mea culpa.

The Drought, and GMO Crops

By Jeff Carter
Monday, July 30, 2012

As everyone knows, we have had very little rain this year. At the end of this week, the market will get the official USDA estimates of what the corn crop looks like. Private estimates will be available during the week.

Right now, the numbers I am hearing are around 122 bushels per acre. That’s woefully short.

A thought occurred to me, though. Some seed companies ($MON) have been working on genetically modified, or GMO, seed. Seed companies have genetically modified seed to grow differently, ripen at different times, and carry pesticide and bug resistance. They have also been creating seed that can grow with very little water.

Droughts were at the top of their list when they created this type of seed. The fact is, fresh water is a precious resource in the world, and the less we need for crops, the more we can have for animals and people.

I checked on Twitter and tried to find tweets about GMO corn and the drought. The engineered seed won’t be released until 2013; it is only in experimental fields this year. Frankly, any seed, heirloom or engineered, would have a tough time growing when there isn’t a drop of rain!

The folks against GMO seed released an article, though, and tweeted it out. They are pushing the story that breeding and better farm practices are what’s needed to protect crops from drought and keep yields up.

How is breeding any different from GMO?

I don’t have a dog in this fight. I support both kinds of farming and think everything ought to be clearly labeled. Through bad regulation, the FDA prevents huge industries from being formed. Clearly, it is limiting competition and job creation.

I don’t think GMO crops cause cancer or anything like that. I doubt seriously that any big commodity crop we have today in any field resembles the crops we had at the turn of the century. Cross-pollination, nature, forced breeding, and science have taken care of that.

When the new Monsanto seed is planted broadly in 2013, I hope we don’t have to see a drought like this year’s to find out whether it works.

Friday, July 27, 2012

Death Penalty Foes Won’t Take a Stand in Colorado

By Jonah Goldberg
Friday, July 27, 2012

In the aftermath of the Aurora, Colo., slaughter, the question went forth on all of the political chatter shows: "Will this reopen the debate over gun control?"

That's the script. When heinous monsters kill people with guns, we tend to talk about the problem of guns. Or rather, people in Washington, New York and other big cities tend to talk about the problem of guns, because they think guns are the problem. There's an irony there, of course, given that such cities tend to have the worst gun-related murder rates -- Chicago these days has the equivalent of an Aurora every month -- and they are the places where guns are hardest to come by, legally.

Regardless, the gun debate flashed for the briefest of moments, like a round of heat lightning that fails to herald a storm, and then disappeared.

Instead, the conversation has moved to other familiar topics. What to do about the mentally ill? How much blame does our violent popular culture deserve? Etc.

These are good questions. But you know what debate seems conspicuously absent? Should we execute James Holmes?

Death penalty opponents are fairly mercenary about when to express their outrage. When questions of guilt can be muddied in the media; when the facts are old and hard to look up; when the witnesses are dead; when statistics can be deployed to buttress the charge of institutional racism: These are just a few of the times when opponents loudly insist the death penalty must go.

But when the murderer is white or racist or his crimes so incomprehensibly ugly, the anti-death-penalty crowd stays silent. It's the smart play. If your long-term goal is to abolish the death penalty, you want to pick your cases carefully.

But the simple fact is, if the death penalty is always wrong, it's wrong in the politically inconvenient cases too.

The standards of newspaper writing and civic discourse require that we call Holmes the "alleged" culprit in this horrific slaughter. That's fine, but if the facts are what we've been told they are, then we know this man is guilty and the jury will not have a hard time saying so.

We don't know whether or not he's mentally ill, but odds are he isn't. Indeed, criminologists and psychiatrists will tell you that most mass murderers aren't insane. But the public debate is already caught up in a familiar tautology. What Holmes did was an act of madness; therefore, he must be a madman. And if he's a madman, we can't execute him because he's not responsible for his actions. And if he's not responsible, then "society" must be. And we can't execute a man for society's sins. So: Cue the debate about guns, and funding for mental health, and the popular culture.

Well, I say enough. I favor the death penalty. I don't support killing insane or mentally disabled people who are truly not responsible for their actions, but I don't believe that committing an "act of madness" necessarily makes you a madman. But committing an act of wanton evil makes you an evil man.

Evil and madness are not synonyms. Societies that cannot distinguish between the two are destined to get more of both.

If the death penalty is always wrong, let us have an argument about James Holmes, a man many Americans are aware of, informed about and interested in. Let us hear why the inequities of the criminal justice system require his life be spared. Fight the death penalty battle on this battlefield.

That won't happen. It won't happen in part because nobody on the Sunday talk shows wants to debate the death penalty when the case for it is strong. They like cases that "raise troubling questions about the legitimacy of the death penalty," not cases that affirm the legitimacy of the death penalty.

But it also won't happen because death penalty opponents understand that when the murderer is unsympathetic, the wise course is to hold your tongue until the climate improves.

It remains an open question whether Colorado will seek the death penalty. Prosecutors know that doing so would add years and millions of dollars in extra costs because opponents have so gummed up the legal works. Opponents can then complain about the outrageous costs of a mechanism they themselves have worked to make prohibitively expensive.

I say, let us give Holmes a fair trial. If convicted, execute him swiftly. If you disagree, explain why this man deserves to live.

‘Military-Style Weapons’

By John R. Lott Jr.
Friday, July 27, 2012

“AK-47s belong in the hands of soldiers, not on the streets of our cities,” President Obama told the National Urban League on Wednesday. After the deadly attack in Colorado last Friday, the president’s concern is understandable. However, even — or perhaps especially — at such a time, distinctions need to be made.

The police in Aurora, Colo., reported that the killer used a Smith & Wesson M&P 15. This weapon bears a cosmetic resemblance to the M-16, which has been used by the U.S. military since the Vietnam War. The call has frequently been made that there is “no reason” for such “military-style weapons” to be available to civilians.

Yes, the M&P 15 and the AK-47 are “military-style weapons.” But the key word is “style” — they are similar to military guns in their aesthetics, not in the way they actually operate. The guns covered by the federal assault-weapons ban (which was enacted in 1994 and expired ten years later) were not the fully automatic machine guns used by the military but semi-automatic versions of those guns.

The civilian version of the AK-47 uses essentially the same sorts of bullets as deer-hunting rifles, fires at the same rapidity (one bullet per pull of the trigger), and does the same damage. The M&P 15 is similar, though it fires a much smaller bullet — .223 inches in diameter, as opposed to the .30-inch rounds used by the AK-47.

The Aurora killer’s large-capacity ammunition magazines are also misunderstood. The common perception that so-called “assault weapons” can hold larger magazines than hunting rifles is simply wrong. Any gun that can hold a magazine can hold one of any size. That is true for handguns as well as rifles. A magazine, which is basically a metal box with a spring, is also trivially easy to make and virtually impossible to stop criminals from obtaining.

Further, the guns in a couple of recent mass shootings (including the one in Aurora) have jammed because of the large magazines that were used. The reason is simple physics. Large magazines require very strong springs, but the springs cannot be too strong, or it becomes impossible to load the magazines. Over time, the springs wear out, and when a spring loses its ability to push bullets into the chamber properly, the gun jams. With large springs, even a small amount of fatigue can cause jams.

If Obama wants to campaign against semi-automatic guns based on their function, he should go after all semi-automatic guns. After all, in 1998, as an Illinois state senator, he supported just such a ban — a ban that would eliminate most of the guns in the United States.

But despite Obama’s frightening image of military weapons on America’s streets, it is pretty hard to seriously argue that a new ban on “assault weapons” would reduce crime in the United States. Even research done for the Clinton administration didn’t find that the federal assault-weapons ban reduced crime.

Indeed, banning guns on the basis of how they look, and not how they operate, shouldn’t be expected to make any difference. And there are no published academic studies by economists or criminologists that find the original federal assault-weapons ban to have reduced murder or violent crime generally. There is no evidence that the state assault-weapons bans reduced murder or violent-crime rates either. Since the federal ban expired in September 2004, murder and overall violent-crime rates have actually fallen. In 2003, the last full year before the law expired, the U.S. murder rate was 5.7 per 100,000 people. Preliminary numbers for 2011 show that the murder rate has fallen to 4.7 per 100,000 people.

In fact, murder rates fell immediately after September 2004, and they fell more in the states without assault-weapons bans than in the states with them.

Nevertheless, the fears at the time were significant. An Associated Press headline warned, “Gun shops and police officers brace for end of assault weapons ban.” It was even part of the presidential campaign that year: “Kerry blasts lapse of assault weapons ban.” An Internet search turned up more than 560 news stories in the first two weeks of September 2004 that expressed fear about ending the ban. Perhaps unsurprisingly, the fact that murder and other violent crime declined after the ban ended was hardly covered in the media.

If we finally want to deal seriously with multiple-victim public shootings, it is about time that we acknowledge a common feature of these attacks: With just a single exception, the attack in Tucson last year, every public shooting in the U.S. in which more than three people have been killed since at least 1950 has occurred in a place where citizens are not allowed to carry their own firearms. The Cinemark movie theater in Aurora, like others run by the chain around the country, displayed warning signs that it was prohibited to carry guns into the theater.

So President Obama wants to keep guns like the AK-47 “in the hands of soldiers.” But these are not military weapons. No self-respecting military in the world would use them, and it is time for Obama to stop scaring the American people.

Romney Abroad

By Charles Krauthammer
Thursday, July 26, 2012

A generation ago, it was the three I’s. A presidential challenger’s obligatory foreign trip meant Ireland, Italy, and Israel. Mitt Romney’s itinerary is slightly different: Britain, Poland, and Israel.

Not quite the naked ethnic appeal of yore. Each destination suggests a somewhat more subtle affinity: Britain, playing to our cultural connectedness with the Downton Abbey folks who’ve been at our side in practically every fight for the last hundred years; Poland, representing the “new Europe,” the Central Europeans so unashamedly pro-American; Israel, appealing to most American Jews but also to an infinitely greater number of passionately sympathetic Evangelical Christians.

Unlike Barack Obama, Romney abroad will not be admonishing his country, criticizing his president, or declaring himself a citizen of the world. Indeed, Romney should say nothing of substance, just offer effusive expressions of affection for his hosts — and avoid needless contretemps, like his inexplicably dumb and gratuitous critique of Britain’s handling of the Olympic Games. The whole point is to show appreciation for close allies, something the current president has conspicuously failed to do.

On the contrary. Obama started his presidency by returning to the British Embassy the bust of Winston Churchill that had graced the Oval Office. Then came the State Department official who denied the very existence of a U.S.-British special relationship, saying: “There’s nothing special about Britain. You’re just the same as the other 190 countries in the world.”

To be topped off by the slap they received over the Falkland Islands, an issue the Brits had considered closed since they repelled the Argentine invasion there 30 years ago. They were not amused by the Obama administration’s studied neutrality between Britain and Argentina, with both a State Department spokesman and the president ostentatiously employing “Malvinas,” the politically charged Argentine name, interchangeably with “Falklands.” (Although the president flubbed it, calling them the “Maldives,” an Indian Ocean island chain 8,000 miles away.)

As for Poland, it was stunned by Obama’s unilateral cancellation of a missile-defense agreement signed with the Bush 43 administration. Having defied vociferous Russian threats, the Poles expected better treatment than to wake up one morning — the 70th anniversary of the Soviet invasion of Poland, no less — to find themselves the victim of Obama’s “reset” policy of accommodation with Russia. So much for protection from Russian bullying, something they thought they had finally gained with the end of the Cold War.

And then there is Israel, the most egregious example of Obama’s disregard for traditional allies. Obama came into office explicitly intent on creating “daylight” between himself and Israel, believing that if he tilted toward the Arabs, they would be more accommodating.

The opposite happened. (Surprise!) When Obama insisted on a building freeze in Jerusalem that no U.S. government had ever demanded and no Israeli government would ever accept, the Palestinian Authority saw its way clear to becoming utterly recalcitrant. Palestinian president Mahmoud Abbas openly told the Washington Post that he would just sit on his hands and wait for America to deliver Israel.

Result? Abbas refused to negotiate. Worse, he tried to undermine the fundamental principle of U.S. Middle East diplomacy — a negotiated two-state solution — by seeking unilateral U.N. recognition of Palestinian statehood, without talks or bilateral agreements.

In Israel, Romney will undoubtedly say nothing new. He’ll just reiterate his tough talk on Iran’s nuclear program. But I suspect he’ll let the Israelis know privately that contrary to the conventional wisdom that his hawkishness signals his readiness to attack Iranian nuclear facilities, his real intent is to signal that, unlike Obama, he is truly committed to permitting Israel to do what it needs to defend itself. This will be welcome news to a nation that has never asked anyone to fight on its behalf, just a green light to defend itself without impediments or veiled threats from its friends.

Most important, however, is to just show up. That’s 80 percent of life, Woody Allen once noted. No need to say much. Romney’s very presence will make the statement.

To the Israelis: “We understand your unique plight. If and when you do as you must, we will stand by you.” To the Poles: “You can count on the American umbrella. I will never leave you out in the cold.” And to the British: “We are grateful for your steadfast solidarity in awful places like Iraq and Afghanistan. The relationship truly is special.”

“And one more thing. Still have that bust of Churchill?”

Righting his ship late Thursday in London, Romney did say he wants Winnie back in the Oval Office. 

Thursday, July 26, 2012

Iraqi Ironies

By Victor Davis Hanson
Thursday, July 26, 2012

Amid all the stories about the ongoing violence in Syria, the most disturbing is the possibility that Syrian President Bashar Assad could either deploy the arsenal of chemical and biological weapons that his government claims it has, or provide it to terrorists.

There are suggestions that at least some of Assad's supposed stockpile may have come from Saddam Hussein's frantic, 11th-hour efforts in 2002 to hide his own weapons of mass destruction arsenals in nearby Syria. Various retired Iraqi military officers have alleged as much. Although the story was met with general neglect or scorn from the U.S. media, the present director of national intelligence, James Clapper, long ago asserted his belief in such a weapons transfer.

The Bush administration fixated on WMD in justifying the invasion of Iraq while largely ignoring more than 20 other writs to remove Saddam, as authorized by Congress in October 2002. That obsession would come back to haunt George W. Bush when stockpiles of deployable WMD failed to turn up in postwar Iraq. By 2006, "Bush lied; thousands died" was the serial charge of the antiwar left. But before long, such depots may finally turn up in Syria.

Another staple story of the last decade was the inept management of the Iraq reconstruction. Many Americans understandably questioned how civilian and military leaders allowed a brilliant three-week victory over Saddam to degenerate into a disastrous five-year insurgency before the surge finally salvaged Iraq. That fighting and reconstruction anywhere in the Middle East are difficult under any circumstances was forgotten. The press preferred instead to charge that the singular incompetence or malfeasance of Bush, Dick Cheney and Donald Rumsfeld led to the unnecessary costs in American blood and treasure.

But perhaps that scenario needs an update as well. Journalist Rajiv Chandrasekaran's new book, "Little America: The War Within the War for Afghanistan," is a blistering critique of the Obama administration's three-year conduct of the Afghanistan war and its decision to surge troops, chronicling stupid decisions, petty infighting, arrogance and naiveté. In an earlier book on Iraq, Chandrasekaran had alleged that America's Iraq dilemmas were the result of a similarly bungling Bush administration.

So was the know-it-all reporter right then about Iraq, or is he right now about Afghanistan, or neither, or both? And will the media revise their earlier criticism and concede that America's problems in conducting difficult wars in the Middle East are inherent in the vast differences between cultures -- fault lines that likewise have baffled even Barack Hussein Obama, the acclaimed internationalist and Nobel laureate who was supposed to be singularly sensitive to customs in that part of the world?

In 2008, we were told that predator drone attacks, renditions, preventative detentions, military tribunals, the Guantanamo detention center and the surging of troops into difficult wars were all emblematic of Bush's disdain for the Constitution and his overall ineptness as a commander in chief. In 2012, these same continuing protocols are no such thing, but instead valuable antiterrorism tools, and are seen as such by President Obama.

For all the biases and incompetence of Nouri al-Maliki's elected government in Iraq, the Middle East's worst dictatorship now seems to have become the region's most stable constitutional government. Given Iraq's elections, the country was relatively untouched by the mass "Arab Spring" uprisings. And despite sometimes deadly Sunni-Shiite terrorist violence and the resurgence of al Qaeda, Iraq's economy, compared with some of the other nations in the Middle East, is stable and expanding.

The overthrow of Saddam was also supposed to be a blunder in terms of grand strategy, empowering our enemies Iran and Syria. True, Saddam's ouster and the subsequent violence may have done that in the short term. But how about long-term, nine years later?

The Assad dynasty seems about to go the way of Egypt's Hosni Mubarak, Tunisia's Zine El Abidine Ben Ali and Libya's Muammar Gadhafi. Syria's grand ally, Iran -- which barely put down popular demonstrations in 2009 -- has never been more isolated and beleaguered as it deals with sanctions, international ostracism and growing unpopularity at home.

Who knows whether Saddam's fall, trial and execution, coupled with the creation of an Iraqi constitutional government, triggered a slow chain reaction against similar Arab tyrannies.

The moral of the story is that history cannot be written as it unfolds. In the case of Iraq, we still don't know the full story of Saddam's WMD, the grand strategic effects of the Iraq war, the ripples from the creation of the Iraq republic, or the relative degree of incompetence of any American administration at war in the Middle East -- and we won't for many years to come.

Fact-Checking the Latest Bain Hysteria

By Avik Roy
Wednesday, July 25, 2012

Journalists have been eager to find something scandalous in Mitt Romney’s private-equity career. As a result, there’s been a fair share of confused reporting about Romney’s Bain Capital days. Such is the case with a set of breathless articles from the Associated Press and Mother Jones, regarding investments made by two of Bain Capital’s subsidiaries, Sankaty Advisors and Brookside Capital. As I used to work at Brookside, I thought it would be worth bringing some perspective to this discussion.

Much of the case against Romney’s business career involves whether or not Bain or its subsidiaries were involved in outsourcing. Now, I happen to think that free trade makes low-income Americans more prosperous by making goods and services less costly. I also think it’s great that people in developing countries can lift themselves up from poverty by selling stuff to us. My friends on the left oppose these things. Fine by me. That debate is outside the scope of this article. What I want to straighten out is another issue: Which of Bain Capital’s investments is it fair to hold Mitt Romney accountable for?

The answer: He is accountable for the investments in which he actually made the decisions. If I have my 401(k) invested in the Fidelity Select Health Care Fund, am I responsible for every decision made by the portfolio manager at Fidelity? Obviously not. The same goes for Mitt Romney.

Much hay has been made of the fact that Romney was the “sole shareholder” of a number of Bain Capital entities. But investment partnerships don’t work like normal corporations. The majority of the returns from Bain’s successful investments went not to Romney but to Bain’s investors, and also to other Bain partners and employees. In addition, as Bain grew, Romney was involved in less and less of the decision making regarding Bain’s investments.

Although Bain Capital is best known for its private-equity work, the firm has created several subsidiaries that work on other types of investments. These include Brookside, a public-equity hedge fund; Sankaty, a fund focused on credit (debt) securities; Bain Capital Ventures, a venture-capital fund (venture capital is, technically, a subtype of private equity); and Absolute Return Capital, a “macro” fund that invests based on global economic trends.

Investment decisions in these other funds were made by the people running those funds. In order to incentivize those individuals to make good decisions, they and their subordinates retained most of the internal compensation that accrued to those funds. Other Bain Capital employees retained a smaller “carried interest” in their returns. For example, gains in the Brookside funds that didn’t go to the fund’s investors were distributed among Brookside partners and employees, and, to a lesser degree, among other Bain partners. This structure is entirely unexceptional within the world of asset management, as anyone in the field will tell you if you bother to ask.

It’s certainly fair to hold Romney accountable for private-equity investments made by Bain prior to Romney’s 1999 departure. And there isn’t anything wrong with what Bain did after he left. But facts are facts.

David Corn’s “exclusive” report for Mother Jones details a 1998 investment that Brookside made in Global-Tech Appliances, a Hong Kong–based appliance manufacturer. This proves, according to Corn, that Mitt Romney was an outsourcer. But Mitt Romney didn’t run Brookside. Indeed, Brookside’s investment decisions were made without Romney’s participation. In the 1990s, Brookside was jointly run by two portfolio managers, Domenic Ferrante and Ed Brakeman. It would be appropriate to assign credit or blame to Ferrante and Brakeman for Brookside’s decisions, but not to Romney.

Similarly, Stephen Braun’s report for the AP and a related article by Adam Serwer et al. for Mother Jones express concern about Sankaty investments. The Mother Jones article harrumphs that a Sankaty fund is based in Bermuda, a “notorious tax haven.” But the desire of U.S. citizens such as Romney to avoid taxes is not the reason that these offshore entities exist. Indeed, U.S. citizens must and do pay taxes on income they receive, regardless of where the income is earned.

The reason asset managers use these offshore entities is that it allows tax-exempt institutions, such as universities and foundations, to avoid paying extra taxes on their investments. The same logic applies to foreign institutions, which have to pay capital-gains taxes in their home countries: They would get taxed twice in certain ways if they invested in a U.S.-domiciled fund.

Similarly, if the Ford Foundation makes a mint by directly owning shares of Apple, the foundation doesn’t pay taxes on its capital gains. But if the foundation invests in a U.S. fund, the foundation will be forced to pay “unrelated business income taxes” on its portfolio. For this reason, any investment fund that seeks the business of tax-exempt institutions must set up an offshore entity.

Sankaty also had a very small stake in Hong Kong’s Global-Tech Appliances — only 48,000 shares, compared with Brookside’s 1.05 million. But Sankaty’s chief investment officer was not Mitt Romney; it was Jonathan Lavine. Whether you agree or disagree with Lavine’s investment in Global-Tech — nothing wrong with it from my standpoint — it was Lavine who was ultimately responsible for Sankaty’s investment decision. And Lavine happens to be a top fundraiser for . . . President Obama.

That relevant fact, of course, was not considered newsworthy by either the Associated Press or Mother Jones. If they think Sankaty’s work is so terrible, they should demand that the Obama campaign return Lavine’s donations. To date, they have not.

In sum, here is what you need to know about Bain Capital. Bain sought to invest not only its capital but also its people in turning important American companies around and making them competitive in the global arena. Bain made lots of investments. Some jobs were created, and some were lost. Mitt Romney was involved with some, and not others. Some failed, but many more succeeded. Most important, as dozens of Democrats have averred, it was honorable work.

Yes, Guns Kill, But How Often Are They Used in Self-Defense?

By Larry Elder
Thursday, July 26, 2012

About the tragedy in Aurora, Colo., rapper/actor Ice-T made more sense -- and has a better understanding of the Second Amendment -- than gun-control proponents.

Asked by a London news anchor about America's gun culture, Ice-T said: "Well, I'd give up my gun when everybody does. Doesn't that make sense? ... If there were guns here, would you want to be the only person without one?"

Anchor Krishnan Guru-Murthy, Channel 4 News: "So do you carry guns routinely at home?"

Ice-T: "Yeah, it's legal in the United States. It's part of our Constitution. You know, the right to bear arms is because that's the last form of defense against tyranny. Not to hunt. It's to protect yourself from the police."

Anchor: "And do you see any link between that and these sorts of (Aurora-type) incidents?"

Ice-T: "No. Nah. Not really. You know what I'm saying, if somebody wants to kill people, you know, they don't need a gun to do it."

Anchor: "It makes it easier, though, doesn't it?"

Ice-T: "Not really. You can strap explosives on your body. They do that all the time."

Anchor: "So when there's the inevitable backlash of the anti-gun lobby, as a result of this instance, as there always is--"

Ice-T: "Well, that's not going to change anything. ... The United States is based on guns."

Security experts say a determined killer, willing to give up his own life, cannot be stopped. The odds, however, can be shifted in favor of the victims and would-be victims. How?

In Pearl, Miss., a gunman who killed two students and wounded seven at a high school was stopped by an assistant principal, who rushed to his car and got his gun. The assistant principal, running back with his .45, spotted the rifle-carrying shooter in the parking lot. Ordering the teen to stop, he held his gun to the shooter's neck until police arrived.

In Salt Lake City, a man purchased a knife in a grocery store, walked outside and stabbed and critically injured two men. He was threatening others, when a store patron with a concealed weapons permit drew his gun, forced the attacker to the ground and held him until police arrived.

In Grundy, Va., a disgruntled student on the verge of his second suspension at Appalachian School of Law shot and killed the dean, a professor and a fellow student. Two students, both off-duty peace officers, ran to their cars, retrieved their guns and used them to halt the attack.

No one knows whether Aurora would have turned out differently had there been an armed patron or two inside the theater. But at the 2007 Virginia Tech shooting, where 32 people died, there was a no-guns policy -- just as, apparently, at the movie theater in Aurora.

For a guaranteed blank stare, ask gun-control proponents how often Americans use guns to defend themselves. They can't tell you, because they don't ask.

Suppose a guy goes to a baseball game. "Honey," his wife asks afterward, "who won the game?" The husband says, "The Dodgers scored four runs." What's missing? Obviously, the wife still knows nothing about the outcome because she knows only one-half of the equation. Well, how can one responsibly discuss "how many people die because of guns" without discussing the other half of the equation -- how many people would not be alive without their defensive use of a gun?

So, how often do Americans use firearms for self-defense?

Criminologist Gary Kleck estimates that 2.5 million Americans use guns to defend themselves each year. Out of that number, 400,000 believe that but for their firearms, they would have been dead.

Professor Emeritus James Q. Wilson, the UCLA public policy expert, says: "We know from Census Bureau surveys that something beyond 100,000 uses of guns for self-defense occur every year. We know from smaller surveys of a commercial nature that the number may be as high as 2 1/2 or 3 million. We don't know what the right number is, but whatever the right number is, it's not a trivial number."

Former Manhattan Assistant District Attorney David B. Kopel studied gun control for the Cato Institute. Citing 1979-1985 data from the National Crime Victimization Survey, Kopel found: "When a robbery victim does not defend himself, the robber succeeds 88 percent of the time, and the victim is injured 25 percent of the time. When a victim resists with a gun, the robbery success rate falls to 30 percent, and the victim injury rate falls to 17 percent. No other response to a robbery -- from drawing a knife to shouting for help to fleeing -- produces such low rates of victim injury and robbery success."

When asked if additional gun laws would be beneficial or have no effect, most Americans, like Ice-T, get it. They oppose shifting power to the criminal. And they don't need the National Rifle Association to tell them: The only people willing to abide by additional gun laws are the law-abiding.

Wednesday, July 25, 2012

Outrage Is Not an Argument

By Jacob Sullum
Wednesday, July 25, 2012

Hours after last Friday's massacre in Aurora, Colo., New York Mayor Michael Bloomberg demanded that the two major parties' presidential candidates explain how they plan to prevent such senseless outbursts of violence.

"No matter where you stand on the Second Amendment, no matter where you stand on guns, we have a right to hear from both of them concretely," Bloomberg said in a radio interview. "What are they going to do about guns?"

Whether you accept the premise that something must be done about guns, of course, might be influenced by where you stand on the Second Amendment and where you stand on guns. But according to Bloomberg, even people who object to gun control on practical or constitutional grounds are morally obliged to support it. Such arrogant illogic may help explain why public support for new gun restrictions has been falling for two decades.

Consider how the Brady Campaign to Prevent Gun Violence reacted to news that a man had shot 70 people, 12 of them fatally, at a midnight showing of "The Dark Knight Rises." "This tragedy is another grim reminder that guns are the enablers of mass killers and that our nation pays an unacceptable price for our failure to keep guns out of the hands of dangerous people," said the group's president, Dan Gross. "We are outraged."

But outrage is no substitute for rational argument, and the response urged by the Brady Campaign -- a petition demanding that Congress keep guns away from "convicted felons," "convicted domestic abusers," "terrorists" and "people found to be dangerously mentally ill" -- had nothing to do with what happened in Aurora. As far as we know, James Holmes, the 24-year-old former neuroscience graduate student arrested for the murders, has no criminal record, no links to terrorist groups and no psychiatric history that would have disqualified him from owning guns.

Similarly, a New York Times story regretted that Holmes was "unhindered by federal background checks" when he bought ammunition online. Since he passed background checks to buy his pistols, shotgun and rifle, why would a background check for ammunition have stopped him?

Other gun-control advocates focused on the AR-15 rifle used by Holmes, a civilian, semi-automatic version of the M-16. Depending on the details of its design, it might have been covered by the federal "assault weapon" ban that expired in 2004. But such legislation targets guns based mainly on their military appearance, as opposed to features that make a practical difference in the commission of crimes (a purpose for which they are rarely used). It is hard to see how the presence or absence of a bayonet mount, a threaded barrel or a collapsible stock, for instance, matters much for a man shooting unarmed moviegoers in a darkened theater.

Holmes also had large-capacity magazines: one holding 100 rounds for the rifle (which reportedly jammed) and one holding 40 rounds for his .40-caliber Glock pistol. But reinstating the federal ban on magazines holding more than 10 rounds, as recommended by Sen. Frank Lautenberg, D-N.J., would have no impact on a determined killer, since millions of larger magazines are already in circulation. Even if all of them disappeared tomorrow, switching magazines (or weapons) takes just a few seconds -- probably not a crucial consideration when no one is shooting back.

Instead of restricting guns, magazines or ammunition for everyone, why not focus on the tiny percentage of buyers who will use them to commit mass murder? Because there is no reliable way to identify those people before the fact. As Vasilis Pozios, a Detroit psychiatrist who specializes in risk assessment, recently conceded to USA Today, "We're just not good at predicting who does this."

Peter Ahearn, a former FBI agent, made the same point in an interview with The Associated Press. "There's nothing you can do to predict that type of crime," he said. "There's no way you can prevent it."

That message is not reassuring, popular or politically useful. It just happens to be true.

Stubborn Ignorance

By Walter E. Williams
Wednesday, July 25, 2012

Academic intelligentsia, their media, government and corporate enthusiasts worship at the altar of diversity. Despite budget squeezes, universities have created diversity positions, such as director of diversity and inclusion, manager of diversity recruitment, associate dean for diversity, vice president of diversity and perhaps minister of diversity. This is all part of a quest to get college campuses, corporate offices and government agencies to "look like America."

For them, part of looking like America means race proportionality. For example, if blacks are 13 percent of the population, they should be 13 percent of college students and professors, corporate managers and government employees. Law professors, courts and social scientists have long held that gross statistical disparities are evidence of a pattern and practice of discrimination. Behind this vision is the stupid notion that but for the fact of discrimination, we'd be distributed proportionately by race across incomes, education, occupations and other outcomes. There's no evidence from anywhere on earth or any time in human history that shows that but for discrimination, there would be proportional representation and an absence of gross statistical disparities, by race, sex, height or any other human characteristic. Nonetheless, much of our thinking, legislation and public policy is based upon proportionality being the norm. Let's run a few gross disparities by you, and you decide whether they represent what the courts call a pattern and practice of discrimination and, if so, what corrective action you would propose.

Jews are not even 1 percent of the world's population and only 3 percent of the U.S. population, but they are 20 percent of the world's Nobel Prize winners and 39 percent of U.S. Nobel laureates. That's a gross statistical disparity, but are the Nobel committees discriminating against the rest of us? By the way, in the Weimar Republic, Jews were only 1 percent of the German population, but they were 10 percent of the country's doctors and dentists, 17 percent of its lawyers and a large share of its scientific community, and they accounted for 27 percent of the Nobel Prizes awarded to Germans.

Nearly 80 percent of the players in the National Basketball Association in 2011 were black and 17 percent were white; if that disparity is disconcerting, consider that Asians were only 1 percent. Compounding the racial disparity, the highest-paid NBA players are black. The gross disparity works the other way in the National Hockey League, in which less than 3 percent of the players are black. Blacks are 66 percent of NFL and AFL professional football players, yet among the other 34 percent there is not a single Japanese player. Though the percentage of black professional baseball players has fallen to 9 percent, there are gross disparities in achievement: four of the five highest career home run totals belong to black players, and each of the eight seasons in which a player stole more than 100 bases was recorded by a black player.

How does one explain these gross sports disparities? Might it be that the owners of these multibillion-dollar professional basketball, football and baseball teams are pro-black and that those of the NHL and major industries are racists?

There are some other disparities that might bother the diversity people. Asians routinely get the highest scores on the math portion of the SAT, whereas blacks get the lowest. Men are about 50 percent of the population, and so are women, but there's the gross injustice that men are struck by lightning six times as often as women. The population statistics for South Dakota, Iowa, Maine, Montana and Vermont show that not even 1 percent of their population is black. On the other hand, in states such as Georgia, Alabama and Mississippi, blacks are overrepresented.

Finally, there's a disparity that might figure heavily in the upcoming presidential election. Twenty-four out of the 43 U.S. presidents have been 5 feet 11 inches or taller, above our population's average height. That is not an outcome that would be expected if there were not voter discrimination based upon height. Mitt Romney is 6 feet 2 inches tall, and Barack Obama is 6 feet 1 inch.

A Moment for Olympic Moral Courage

By David Malcolm
Tuesday, July 24, 2012

In 1972, Palestinian terrorists from the Black September organization murdered 11 members of the Israeli Olympic team, among them the American-born weightlifter David Berger, in Munich, Germany. German security forces were unable to rescue the hostages, and the surviving terrorists were later released and celebrated as heroes of a terrorist movement that would plague the world for the next forty years.

Western governments, through their pathetic, morally indefensible non-response to the massacre, implicitly and cravenly legitimized terror as a valid means of Palestinian (and, on a larger scale, Muslim) political expression. Indeed, Palestinian terrorist leader Yasser Arafat, who personally directed the Munich massacre, received the Nobel Peace Prize in 1994 for his efforts to “create peace in the Middle East.” Similarly, the man many recently believed was the key to the “peace process,” Palestinian Authority president and so-called “moderate” Mahmoud Abbas, was also implicated in the Munich massacre.

Forty years later, as thirty-seven Israeli athletes arrive in London to participate in the Olympic Games, President Obama has a golden opportunity to show moral courage where the rest of the world has faltered. President Obama has reportedly backed an effort to hold a minute of silence during the upcoming opening ceremonies at the London Olympics to recognize the Israeli athletes murdered at the Munich Games. For a president who has thus far refused to visit Israel and has publicly snubbed Prime Minister Benjamin Netanyahu, this is a good start.

International Olympic Committee president and moral coward Jacques Rogge quashed the idea, however, saying, “We feel that the Opening Ceremony is an atmosphere that is not fit to remember such a tragic incident.” With respect, the Opening Ceremony is precisely the occasion for such remembrance.

A simple, yet powerful symbol of remembrance of the Munich massacre is necessary and appropriate, especially in light of last week’s terrorist bombing of an Israeli tour bus in Bulgaria that many suspect was perpetrated by Iran’s hit squad, the Quds Force. To send a global message that America rejects terror and stands with Israel as it does the same, we propose the following.

American and British Olympians should wear an Israeli flag lapel pin on the blazers of their Olympic uniforms. Further, both countries’ teams should carry the Israeli flag alongside their respective flags, the Stars and Stripes and the Union Jack. Finally, once both teams have been introduced and have taken their places in the opening ceremony, they should, of their own accord, observe a minute of silence in remembrance of the eleven members of the Israeli Olympic team, including American-born David Berger, who were murdered in Munich.

Beyond this, President Obama must speak. As always, the world looks to America for moral leadership, and President Obama has an ideal opportunity to provide it. He doesn’t need to give another long-winded Cairo speech lamenting America’s sins abroad or drawing a perverse moral equivalence between America and our enemies. Something simple and direct will suffice. It could go something like this:

“Forty years ago, Palestinian terrorists killed eleven members of the Israeli Olympic team, among them an American-born athlete, in a senseless act of indefensible evil. In response, the world did nothing, and in the subsequent years, we have failed to pay adequate tribute to those who were killed and failed to condemn the barbarity of what was done. Today we correct those failures.

“Today we say that the murder of those athletes was evil, and we mourn with their families. Many years have passed since that day, and we know that time cannot erase the heartbreak and loss that you feel. But we know that God comforts the grieving, and our prayer is that you know His comfort today. Today we also say that terrorism has no place at the Olympics, or in Israel, or in London, or New York, or anywhere else. America stands with Israel today and vows to do whatever is necessary, whenever necessary, to ensure her safety and security.

“Finally, to those who traffic in terror – who attack innocents on buses in Bulgaria, and in subways in London, and on trains in Spain, and in nightclubs in Jakarta, and in hotels in Mumbai or Jordan – know this: your day of reckoning will come. America will not stand idly by while you terrorize innocents. To those who would test our resolve, hear this: if you do, the full resources of the United States government and our allies will be unleashed against you, and against those who support you. You will find no sanctuary from our justice.

“The United States looks forward to a safe and historic Olympic games. God bless the United States, Britain, Israel and all of the athletes assembled from across the world who inspire us with their excellence. Thank you.”

Tuesday, July 24, 2012

Selective Transparency

By Victor Davis Hanson
Tuesday, July 24, 2012

We are in the midst of a transparency mania, but a rather selective one. Bill Clinton, who chose not to tell the truth while under oath and as president, says he is “perplexed” that Mitt Romney did not offer more candor by providing more than a single year’s tax returns. Yet neither Jimmy Carter nor Ronald Reagan released more than one year’s returns. The reformist John McCain released just two.

True, the 2004 Democratic candidate, John Kerry, offered some 20 years of returns; but that gesture meant almost nothing, because his billionaire wife, Teresa, supplied the vast majority of the funds that fueled Kerry’s opulent recreational lifestyle — and she kept largely quiet about where her money was banked and invested. Few in the press praised George W. Bush for releasing nine years of tax returns. Even then one could argue “So what?” — given that likely candidates can massage their returns in advance, making a bit less money, taking fewer deductions, and giving a little more to charity as they envision a political race a few years off, while incumbent officials usually have open-and-shut government salaries and simple deductions.

If we are truly in the age of transparency, then disclosure of medical records seems just as important. After all, the republic has had a checkered record of presidents failing to disclose their illnesses both before and during their tenure. Woodrow Wilson suffered from hypertension, but concealed that ailment from the public through two elections — until a debilitating stroke left him incapacitated during his second term. Franklin Roosevelt never disclosed the full extent of his paralysis, his weak cardiovascular condition, or a number of other major health problems — all of which predated his presidency and would affect his performance while in office. The tanned, youthful John Kennedy was far sicker than we knew; full disclosure about his health might have made his pasty-faced rival, Dick Nixon, seem robust in comparison. In 1992 Paul Tsongas probably knew of his cancer’s recurrence but did not disclose it during the Democratic primaries.

Given all that history, and the media demands in 2008 that the septuagenarian cancer survivor John McCain release thousands of pages of medical records for journalists’ perusal, why did Barack Obama not simply release his own medical records? The Left had always trumpeted the desire for “full disclosure” and was probably right in wanting McCain to assure us that he was hale; but, again, why was Obama given a complete pass?

Most of us have had to release our undergraduate transcripts either when being considered for a job or when applying for post-baccalaureate education. Yet Barack Obama apparently does not wish the information about his college career known either. Is he afraid that we will learn that his Occidental and Columbia transcripts were as dismal as was John McCain’s Naval Academy ranking, near the bottom of his class? But whereas the media frowned upon McCain’s carousing undergraduate days, suggesting that they might prove a harbinger of an unpredictable presidency, they were content with blissful ignorance about Obama’s serial drug use as an undergraduate.

There is some reason to worry about Obama’s own transparency, given that he is the least vetted sitting president since John Kennedy, whose vita continues to expand in unwelcome ways nearly half a century after his death. A sympathetic biographer has revealed that the main incidents in President Obama’s life, as told in his own memoir, were largely exaggerated, if not fabricated altogether. We are still perplexed why Barack Obama for over a decade permitted Kenya to be listed as his birthplace in his literary agent’s biographical sketch of him. Obama has not been forthcoming about his complex two-decade relationship with the odious Reverend Jeremiah Wright. We know now that the president was far more intimate with ex-terrorist Bill Ayers and felon Tony Rezko than he ever let on.

When questions come up about the president’s reluctance to release medical records or college transcripts, or the evidence that he was a fabulist in matters of his own autobiography, the Obama campaign’s defense is essentially that his three and a half years as president have established that he is competent; such past questions, his defenders say, are rendered irrelevant by his present performance. But neither the media nor Obama’s supporters extend that allowance to Romney, who, as head of the 2002 Winter Olympics and as a successful governor of Massachusetts, long ago proved that his lucrative business career had led not to malfeasance but to fiscal acumen put to good service for the state.

So how much do we wish to detour from the issues to know about the background of either candidate Romney or incumbent Obama? Some sort of compromise seems in order. If transparency is really what the public demands, and if these issues distract attention from a necessary debate over the economy, then in bipartisan fashion let us now demand full disclosure from both candidates: ten years of income tax returns from each, full and complete access for journalists to all known medical records of each, and complete release of all undergraduate and graduate grades, test scores, and other records.

Romney may not wish to release a decade’s worth of careful tax planning and investment that might reveal him to be more concerned about making money and keeping most of it than about outsourcing or foreign bank accounts. Obama may likewise be embarrassed over a prior undisclosed ailment, or a relatively unimpressive Occidental or Columbia record that would belie his media reputation as the “smartest” man ever to serve as president in the nation’s history. Perhaps for much of August we might hear that Romney had a gargantuan Swiss bank account, or more bankers in the Caribbean than we had surmised. Maybe Obama smoked more marijuana than he has admitted to or received lots of Cs and even some Ds in International Relations — grades that would make it almost impossible for most students to get into Harvard Law School.

But such embarrassments would pass by the end of the summer, and we, the wiser, could move on to the campaign debate over the economy. In short, it is time either to demand that both candidates put up everything — or to shut up and return to the debate over two radically different visions of how to fix an ailing America.