"Privacy" zealots want America to forgo intelligence capabilities during wartime.
By David B. Rivkin JR. and Lee A. Casey
Sunday, September 30, 2007 12:01 a.m.
Would any sane country purposefully limit its ability to spy on enemy communications in time of war? That is the question Congress must answer as it takes up reform of the Foreign Intelligence Surveillance Act. Privacy activists, civil libertarians and congressional Democrats argue that both foreign and domestic eavesdropping must be subject to judicial scrutiny and oversight, even if this means drastically reducing the amount of foreign intelligence information available to the government--yet they never acknowledge the costs involved. It is time the American people had an open and honest debate on the relative importance of privacy and security.
FISA, of course, is the law regulating the government's interception of "electronic communications" for foreign intelligence purposes. Earlier this year the special FISA court narrowed dramatically the National Security Agency's ability to collect overseas intelligence under the law, so Congress passed a six-month amendment before its August recess to allow current surveillance programs to continue. That amendment should be made permanent.
When FISA was enacted in 1978, most of this foreign intelligence collection was accomplished by NSA satellites and "listening posts" located outside of the United States. That enabled the agency to acquire, without any judicial involvement, vast quantities of global communications. The fact that foreign targets contacted Americans was of no legal consequence. Even the strongest congressional proponents of FISA's regulation of surveillance activities recognized that intelligence gathering was a key executive function, and the U.S. needed as much foreign intelligence as possible. This bipartisan consensus--that FISA compliance should not impede foreign intelligence collection--was all the more notable coming amidst the congressional reaction to Watergate and at a time when the Cold War threats to national security, while formidable, did not require real-time surveillance of numerous nonstate actors.
Today, primarily because of the communications technology revolution, much of the same foreign intelligence information, focused on non-U.S. persons overseas, passes along U.S.-based fiber-optic systems. Unfortunately, much of the Democratic congressional leadership says this new world requires more stringent regulation than in the past because of the risk to the privacy of innocent Americans. But this problem is inherent in all surveillance schemes, whether they are overseen by courts or not.
All suspects, whether garden-variety criminals or terrorists, whether surveilled with or without a warrant, invariably contact numerous innocents. Requiring the government to obtain a judicial order for all overseas surveillance whenever an American's communications might be intercepted will not solve this problem.
The government does utilize a series of "minimization" procedures governing how foreign intelligence information is handled to prevent its inappropriate use or disclosure. As explained by CIA Director Michael Hayden in 2006, referring to the post-Sept. 11 terrorist surveillance program before it was subjected to FISA: "if the U.S. person information isn't relevant [without foreign intelligence value], the data is suppressed." The fact that senior U.S. government officials (unlike their counterparts in other countries) do not routinely have access to the unredacted surveillance-generated information about American citizens, and that the system is operated largely by career civil servants, provides an additional layer of privacy protection.
Warrantless surveillance is also constitutional. The Fourth Amendment prohibits only "unreasonable" searches and seizures. Although today's privacy advocates routinely claim that warrantless searches are inherently unreasonable, that position is insupportable. The Supreme Court has repeatedly approved numerous warrantless searches, balancing the government's interests against the relevant privacy expectations. Thus drivers can be subjected to sobriety checkpoints and international travelers are liable to search at the border.
The key in such cases has generally been the presence or absence of a "reasonable expectation of privacy." If there is no reasonable expectation of privacy associated with a particular location or activity, then a warrantless search is not unreasonable. Whether Americans have a reasonable expectation that their international communications--which may be routed through any number of foreign countries and are routinely subject to capture by foreign intelligence services--will not be incidentally intercepted by the U.S. government is debatable. But foreign nationals communicating abroad have no reasonable expectation of privacy vis-à-vis the NSA simply because their conversations are electronically transmitted through American switching stations.
On the other side of the scale, of course, is the government's obligation to protect the American people. Because the U.S. faces a dispersed, shadowy, and ideologically committed enemy--in circumstances where defectors are rare and the CIA's ability to penetrate the hostile networks is extremely limited--the most proactive electronic surveillance operations are essential. Requiring judicial orders for the collection of foreign intelligence anytime an innocent American's communications may also be intercepted would cripple U.S. intelligence gathering. Obtaining orders against many foreign targets about which comparatively little may be known, including their true identities or the precise modalities of their involvement with jihadist entities, would be impossible.
The privacy advocates claim that it is enough to surveil "purely" foreign-to-foreign communications without traditional warrants, albeit still with substantial judicial involvement. But many of the NSA's most valuable overseas targets routinely contact Americans. Moreover, if the FISA reform authored by the Democratic leadership--which requires judicial involvement once a foreign surveillance target reaches a certain number of communications with the U.S.--were to pass, every foreign terrorist and spymaster would communicate with the U.S. enough to be enrolled in the warrant-driven surveillance program. As a result, the only people overseas who could still be surveilled warrantlessly would be the ones with the least intelligence value.
The privacy advocates also criticize the NSA's efforts to collect vast quantities of information, claiming that more targeted, individual-specific surveillance is both more privacy-friendly and better protects America's safety. However, unlike in the Cold War era--when the NSA was focused largely on a few state entities and had a pretty good idea of who the targets were--today targeted surveillance alone is not enough. Thousands of individuals participate in various ways in jihadist activities, and even more individuals possess valuable information about them. All of them seek to blend into society, benefiting from the anonymity of modern life and the ease of travel and communications. Because their behavior differs in subtle ways from the conduct of the law-abiding citizens around them, NSA-led broad surveillance, backed up by various pattern-recognition programs, can identify the right targets.
Indeed, privacy advocates seek to ban the NSA's overseas-focused broad surveillance programs--and require warrants whenever overseas targets have a number of contacts with the U.S.--precisely to decrease dramatically the total number of foreigners tracked by the NSA. Their logic is unimpeachable--the fewer foreign targets the NSA reaches, the fewer innocent Americans are caught up in the surveillance net. But this fervent commitment to protecting the privacy of Americans from all intrusions comes at a very high cost: for the first time in history, the U.S. is being asked to collect less intelligence about the enemy while prosecuting a war.
Those who want to subject all government surveillance activities to a warrant requirement should honestly acknowledge that this approach would dramatically shrink the stream of foreign intelligence available.
Let's be clear here: Privacy is an important value. American society cannot afford, however, to elevate privacy concerns beyond all other considerations. Being suspicious about governmental power is consistent with our constitutional values--the Framers certainly were so inclined--but being paranoid about one's own government is not.
Sunday, September 30, 2007
Why Liberals Make Atrocious Parents
By Kevin McCullough
Sunday, September 30, 2007
The leading liberals in America gave frighteningly clear evidence this week that not only do they lack the wisdom to run the nation, but that by their own words they do not even understand the priorities of good parenting. The position of "parent" is God-granted, yet they shirk with great disdain the duty to give their children the basic wisdom of life. In doing so they demonstrate clearly that they are contributing to one of our nation's greatest deficits - the discernment and critical thinking skills of the next generation.
In the New Hampshire Democratic debate this week, the veil was pulled back on more than just the Democratic party's great lie about its desire to bring the troops home in the global war on terror (none of the candidates would even commit to doing so before the end of a first term). Perhaps more importantly, the candidates' twisted views on family, sex, and parental responsibility were also highlighted.
Tim Russert asked the three front-runners for the Democratic nomination about their comfort level with teaching second-grade children a homosexual story of two boys consummating their lust for each other. They all agreed they would support the teaching of such material, though they attempted to hem, haw, and confuse the issue by mumbling about parents' involvement.
John Edwards: "Yes absolutely..." (He would support the teaching of the story to second graders.) "I want my children to understand everything about the difficulties that gay and lesbian couples are faced with every day, the discrimination that they’re faced with every single day of their lives."
Hillary Clinton: "Obviously, it is better to try to … help your children understand the many differences that are in the world. … And that goes far beyond sexual orientation. So I think that this issue of gays and lesbians and their rights will remain an important one in our country."
Barack Obama: "The fact is, my 9-year-old and my 6-year-old I think are already aware that there are same-sex couples... One of the things I want to communicate to my children is not to be afraid of people who are different. …. One of the things I think the next president has to do is stop fanning people’s fears. If we spend all our time feeding the American people fear and conflict and division, then they become fearful and conflicted and divided."
Obama in fact confirmed that his wife had already taken the opportunity to sit his six- and nine-year-old daughters down to discuss same-sex behavior and why some radical same-sex couples believe society should redefine the God-sanctioned institution of marriage because of it.
But it was John Edwards who summarized for liberals everywhere what they actually believe: "I don’t want to make that decision on behalf of my children. I want my children to be able to make that decision on behalf of themselves, and I want them to be exposed to all the information... even in second grade to be exposed to all those possibilities, because I don’t want to impose my view. Nobody made me God."
Edwards gave voice to words that liberals have thought and practiced for years.
Liberals, by the strictest understanding of the definition of the word, believe in lack of restraint, defiance of limits, and excess - whether it's taxes, education, or sexual practice. Truth can never be known, and all focus must be given to the unknowable.
In and of itself the term "liberal" isn't necessarily a bad one. For instance in the scriptures we are instructed to be liberal with generosity for those in need, forgiveness for those who repent, and mercy for those who suffer. But Clinton, Obama, and Edwards have taken it far beyond that.
Liberals today mean it as an excuse to wipe away other important elements of behavior like self-control, purity, moderation, and even delayed gratification. It's my opinion that the lost virtue of restraint has in fact become one of our nation's most important deficits - so much so that I devoted an entire chapter to the idea in MuscleHead Revolution.
But with these ideas liberals have even excused themselves from performing the task God gave them uniquely. It IS a parent's job to teach a child how to think, the framework of what to believe, and to equip them to critique their own actions and the actions of others for even some very basic reasons - like self-preservation.
Conservative parents teach disciplined behavior so that in their children's private world they do not bring harm to themselves, and in public they do not bring harassment, discomfort or harm to others. The benefit of learning to be quiet at some times and places helps a child to enjoy the exuberance of playtime later. Teaching a child not to touch everything they see gives them self-control and prevents them from breaking things they should not have grabbed in the first place. Saying "no" when they reach for a hot pan on the stove may seem rather harsh, scolding, and even angry - but in the end it has saved them from immense pain!
If you want to see this at work in the real world - take ten minutes to randomly interview any girl who works in women's retail today. The hellions that liberal moms bring into their store - and immediately lose track of the moment they begin trying things on - are significantly different from the children who have been taught to stand quietly and wait until they are home to run, wrestle, hide, seek, laugh, and play.
John Edwards, though he represents the liberal mentality across the board, could not be more mistaken.
True, nobody made him 'God' (and we all breathe easier for that). But God did very much make him, and more importantly He made him a representative to his children - to teach, to instruct, to guide, and to help grow his children into fully functional and thoughtful adults who will then be able to do the same for their children in the days to come.
By failing to exercise the responsibility of teaching their children - or, even more dangerously, as in Barack Obama's case, by willfully teaching them such behavior and its immoral justification - liberals are at best proving that they do not have the critical discernment needed to recognize the difference between right and wrong. At worst they are demonstrating negligent or intentional contempt for their children and society. And if they are that confused about something as basic as instructing their children, how will THEY be equipped when weighing the balance of good and evil in the world and nation they hope to lead?
In exposing their thinking on something so simple, they confirmed that I would never trust them to baby-sit my own child; how on earth, then, can they be given oversight of the free world?
Saturday, September 29, 2007
Iran: The Unacceptable Risk to Stability
By Charles Krauthammer
Friday, September 28, 2007
WASHINGTON -- Ahmadinejad at Columbia provided the entertainment, but Sarkozy at the U.N. provided the substance. On the largest possible stage -- the U.N. General Assembly -- President Nicolas Sarkozy put Iran on notice. His predecessor, Jacques Chirac, had said that France could live with an Iranian nuclear bomb. Sarkozy said that France cannot. He declared Iran's nuclear ambitions "an unacceptable risk to stability in the region and in the world."
His foreign minister, Bernard Kouchner, had earlier said that the world faces two choices -- successful diplomacy to stop Iran's nuclear program or war. And Sarkozy himself has no great hopes for the Security Council, where China and Russia are blocking any effective action against Iran. He does hope to get the European Union to join the U.S. in imposing serious sanctions.
"Weakness and renunciation do not lead to peace," he warned. "They lead to war." This warning about appeasement was intended particularly for Germany, which for commercial reasons has been resisting U.S. pressure to support effective sanctions.
Sarkozy is no American lapdog. Like every Fifth Republic president, he begins with the notion of French exceptionalism. But whereas traditional Gaullism tended to define French grandeur as establishing a counterweight to American power, Sarkozy is not averse to seeing French assertiveness exercised in conjunction with the United States. As Kouchner put it, "permanent anti-Americanism" is "a tradition we are working to overcome."
This French about-face creates a crucial shift in the balance of forces within Europe. The East Europeans are naturally pro-American for reasons of history (fresh memories of America's role in defeating their Soviet occupiers) and geography (physical proximity to a newly revived and aggressive Russia). Western Europe is intrinsically wary of American power and culturally anti-American by reflex. France's change from Chirac to Sarkozy, from Foreign Minister Dominique de Villepin (who actively lobbied Third World countries to oppose America on Iraq) to Kouchner (who supported the U.S. invasion on humanitarian grounds) represents an enormous shift in Old Europe's relationship to the U.S.
Britain is a natural ally. Germany, given its history, is more follower than leader. France can define European policy, and Sarkozy intends to.
The French flip is only one part of the changing landscape that has given new life to Bush's Iran and Iraq policies in the waning months of his administration. The mood in Congress also has significantly shifted.
Just this week, the House overwhelmingly passed a resolution calling for very strong sanctions on Iran and urging the administration to designate Iran's Revolutionary Guards a terrorist entity. A similar measure passed the Senate Wednesday by 76-22, declaring that it is "a critical national interest of the United States" to prevent Iran from using Shiite militias inside Iraq to subvert the U.S.-backed government in Baghdad.
A few months ago, the question was: Will the Democratic Congress force a withdrawal from Iraq? Today the question in Congress is: What can be done to achieve success in Iraq -- most specifically, by countering Iran, which is intent on seeing us fail?
This change in mood and subject is entirely the result of changes on the ground. It takes time for reality to seep into a Washington debate. But after the Petraeus-Crocker testimony, the reality of the relative success of our new counterinsurgency strategy -- and the renewed possibility of ultimate success in Iraq -- became no longer deniable.
And that reality is reflected even in the rhetoric of Hillary Clinton, the most politically sophisticated of the Democratic presidential candidates. She does vote against war funding in order to alter the president's policy (and to appease the left), but that is as a senator. When asked what she would do as president, she carefully hedges. She says that it would depend on the situation on the ground at the time. For example, whether our alliance with the Sunni tribes will have succeeded in defeating al-Qaeda in Iraq. But when asked by ABC News if she would bring U.S. troops home by January 2013, she refused to "get into hypotheticals and make pledges."
Bush's presidency -- and foreign policy -- were pronounced dead on the morning after the 2006 election. Not so. France is going to join us in a last-ditch effort to find a nonmilitary solution to the Iranian issue. And on Iraq, the relative success of the surge has won President Bush the leeway to continue the Petraeus counterinsurgency strategy to the end of his term. Congress, and realistic Democrats, are finally beginning to think seriously about making that strategy succeed and planning for what comes after.
The Long Arm of Iran
The top mullahs have been complicit in terror attacks.
By Dan Senor
Saturday, September 29, 2007 12:01 a.m.
"I think it would be almost inconceivable that Iran would commit suicide by launching one or two missiles of any kind against the nation of Israel."
--Jimmy Carter, speaking at Emory University, Sept. 19, 2007
On March 17, 1992, a suicide bomber crashed an explosives-filled truck into the Israeli Embassy in Buenos Aires. The bombing was so powerful that the destruction covered several city blocks--29 innocents were killed and hundreds more were injured. This occurred more than 8,000 miles from Tehran. Two years later, on July 18, 1994, Buenos Aires was again hit with a terror attack. This time the target was the Jewish community center in the center of the city--85 were killed.
Argentina was, understandably, rattled. The government launched a full-scale investigation. One of the key officials assigned to it was Miguel Angel Toma (later appointed by then-President Eduardo Duhalde as secretary of intelligence from 2002-03). Mr. Toma is not a warmonger. And he did not approach his job with any ideological ax to grind. He concluded not only that Hezbollah carried out the attacks in Argentina, but that at least one of them was planned in Iran at the highest levels of the Iranian government, aided by a sophisticated sleeper-cell network in Latin America. He also concluded that the attacks were strategically aimed at punishing the Argentinean government.
Iran and Argentina had had commercial ties throughout the 1970s and '80s valued at hundreds of millions of dollars, and had entered into agreements to jointly pursue nuclear energy and missile programs. But by 1989, a new civilian government headed by Carlos Menem had come to power and canceled its prior agreements with Iran. As far as Iran was concerned, it was time to punish Argentina for the reversal and send a warning shot to the rest of Latin America. And by focusing on soft targets in Jewish communities, the operations would serve an additional objective: demonstrating to Israel that Hezbollah could hit anywhere at any time.
Mr. Toma says--based on Argentina's cooperation with intelligence agencies around the world--he's certain of the date, location and participants in the decision by the Iranian government to execute the second Buenos Aires attack. He pinpoints it to a meeting that occurred in the holy Iranian city of Mashhad on Aug. 14, 1993. It was presided over by Ayatollah Ali Khamenei, then and now the Supreme Leader of Iran, and the Iranian president at the time, Ali Akbar Rafsanjani. Following this meeting, Mr. Toma believes, Iran began working with Hezbollah in the planning, funding and staffing of the 1994 attack in Argentina. Indeed, Argentina has issued warrants for nine Hezbollah operatives and Iranian leaders, including Mr. Rafsanjani. Nobody has been arrested.
The Argentinean case reminds us of four important points.
First, we must reconsider the applicability of Cold War-style deterrence. Its central argument is this: While it would be preferable that Iran not go nuclear, the history of the Cold War demonstrates that the possession of nukes creates a balance of power, and thus makes the possibility of nuclear war extremely unlikely. Representing the pro-deterrence school, Stephen Biddle of the Council on Foreign Relations says, "We've lived with Iran as a terror threat for a generation. Iran has a return address, and states with a return address can be retaliated against."
This misses the point. Even if Iran never fires a nuke or transfers one to a terrorist group, its possession of nukes would enable it to escalate support for terrorist proxies, allowing it to dominate the region and threaten moderate regimes. Who would be prepared to retaliate against a future Buenos Aires terror attack if we knew that the "return address" was home to a nuclear weapon?
Second, U.S. officials are deeply concerned that Tehran would not even have to build a complete bomb to transform the balance of power. It would just have to make the case that it could complete development on short notice. "For their political needs, that would be enough," says Gary Samore, a nonproliferation official in the Clinton administration.
Third, Mr. Rafsanjani continues to be described in the Western media as a leading Iranian "moderate." If Mr. Toma is correct, this "moderate" was intimately involved in the planning of the Argentina bombings. And he has ambitions to succeed President Ahmadinejad.
Fourth, according to Mr. Toma, Supreme Leader Ayatollah Ali Khamenei authorized the Buenos Aires attacks. This is important because many analysts today argue that, as scary as President Ahmadinejad sounds, he is not really in charge in Tehran--the true "decider" is the Supreme Leader. Well, if he is, then we should in fact be doubly concerned.
Iran is not the Soviet Union and the post-9/11 struggle is not the Cold War. The deterrence camp is willing to stand by as Iran develops nuclear weapons, presumably on the model that Iran will eventually collapse as the Soviet Union did. But the Argentinean case demonstrates what Tehran was willing and able to do when it had no nuclear umbrella. If, as the 9/11 Commission Report argues, the U.S. suffered from a "failure of imagination" regarding how far terrorists would go, a nuclear Iran risks encouraging the terrorist imagination to take another quantum leap.
The UAW's Awakening
A union shows a new awareness of global competition.
Wall Street Journal
Saturday, September 29, 2007 12:01 a.m.
This week's deal between General Motors and the United Auto Workers is being hailed as a new era for Detroit, and for once that advertising may be justified. The UAW in particular made historic concessions that show a new awareness of global competition. What's less encouraging is how much this reality-based compromise still contrasts with the policies that unions and their political friends are promoting in the unreal world of Washington, D.C.
Our friends in the AFL-CIO often think we're too critical, but we're not responsible for taking union membership down to 7.4% of the non-government American labor force last year. The reality of a dynamic world economy did that, assisted by the failure of union leaders and corporate managers to adapt. These columns support collective bargaining, and our belief has long been that if a company's workers vote to join a union, they and the company deserve what they get.
The problem with unions is not all that dissimilar to that posed by entrenched management: Once they win comfortable contracts, they often become impediments to the kind of innovation and flexibility essential to success in today's economy. So in the name of "job security," they undermine a company's--or a nation's--competitiveness. The result, over time, is less job security for everyone, especially the union workforce. There's no better example of this than GM, where the UAW now represents about 74,000 hourly workers, compared to 246,000 in 1994. Some security.
The new GM-UAW contract is a belated recognition that the choice has now become change, or Chapter 11. Under the deal, wages are frozen, save for bonuses and some lump-sum payments. GM in turn promises to invest in American plants with UAW workers, though of course it will also keep investing abroad.
In what seems to be the most creative stroke, GM will pay some $35 billion toward a new health-care trust fund to be administered by the union. That's a big initial cash flow, but it means the company can divest itself of some $50 billion in long-term liabilities, which would only have grown as health-care costs rose and retirees lived longer. Investors loved it, driving up GM stock by around 7% for the week.
The UAW now gains ownership of its members' health-care resources, in effect becoming a financial manager of a giant Health Savings Account for auto workers. If the union is creative, it will rethink its coverage plans, using the new generation of consumer-driven health-care options (such as personal health savings accounts) to encourage and reward more careful spending by beneficiaries. UAW President Ron Gettelfinger has told his members the trust fund will last 80 years, and the union's job now is to make sure it does. A similar arrangement at Caterpillar Inc. didn't work because the money ran out in six years.
This new Treaty of Detroit in the marketplace is all the more notable when you consider how little the union political agenda has changed. The AFL-CIO famously split in 2005 over the priority of organizing over politics. But organized labor's share of the private workforce has kept falling.
We had a friendly visit not too long ago with Andy Stern, the Service Employees International Union President and perhaps the most successful modern labor leader. He is a shrewd man, but his main message seemed to be that union salvation lies in America adopting the work rules and income redistribution of Europe. He says companies need to pass their health-care costs onto government, meaning taxpayers. And while he recognizes that unions can't secede from the global economy, the trade rules need to be changed--that is, restricted or managed--so that the pace of change is less disruptive and wealth more equally shared.
With Democrats now running Congress, and ahead in the Presidential polls, Mr. Stern and his union mates are closer than they've been in decades to seeing that agenda implemented. But they also reveal their own lack of faith in the appeal of unions when they support a ban on secret-ballot elections at work sites. And of course they still benefit--unlike anyone else in American politics--from being able to coerce the payment of dues.
The larger irony is that Europe is now learning the hard way that Mr. Stern's "social contract" is itself deeply flawed. French President Nicolas Sarkozy was elected this year in part because he acknowledged that even France can't sustain the French model any longer. Health-care expenses represent a huge chunk of the tax burden in France, where restrictive work rules and such union demands as the 35-hour week have led to far higher joblessness and far less prosperity than in the U.S.
Mr. Sarkozy is now pushing American-style reforms precisely when Mr. Stern and Democrats are promoting French policies. Our guess is that economic reality will in the end limit Mr. Stern's political ambitions in the same way that global competition has finally awakened the UAW.
Friday, September 28, 2007
When Liberals Attack
Violent movies don't sell when the heroes are racked with guilt.
By Kyle Smith
Friday, September 28, 2007 12:01 a.m.
What has come over liberals? Suddenly they've turned bloodthirsty. And they're not just lobbing "Daily Show" coffee mugs or brandishing the rusty business end of their DEAN 2004 campaign pins. Liberals are locked, loaded and licensed to kill--at the movies.
The new Jodie Foster film, "The Brave One," is the latest in a string of left-wing Bush-era movies about violence. These films--which range from popcorn flicks (the "X-Men" series, "The Hills Have Eyes 2") to more ambitious works and Oscar nominees ("A History of Violence," "V for Vendetta," "Munich," "Blood Diamond")--so deeply entangle killing with liberal idealism, though, that at times their scripts are as muddled as EEOC directives or U.N. rules of engagement. For all of the critical acclaim that attended most of these films, few are as effective as "Dirty Harry" or "Death Wish."
In "The Brave One," for instance, possibly the first vigilante movie to feature a Sarah McLachlan soundtrack, a New York radio personality (Jodie Foster) specializes in monologues about the sounds of the city. She speaks with a maximum of NPR narcoleptic condescension, chewing each syllable of her airy drivel ("Are we going to have to construct an imaginary city to house our memories?") as if reading to a toddler out of "My First Book of Cultural Anthropology." Strangely, however, she is not the bad guy.
After her fiancé (apparently a Briton of Indian descent) gets killed when both of them are jumped by vicious white youths, Ms. Foster's character spends weeks in a coma. One of her first remarks when she wakes up is directed at some white cops: "You're the good guys. How come it doesn't feel like that?" Shattered, she regains her poise with the aid of a black cop (Terrence Howard) and a saintly black woman friend. Meanwhile, in addition to the murderous gang of white kids, another villain emerges: a white businessman who owns parking garages.
This is more a checklist than a plausible plot, particularly when Ms. Foster's character goes on a "Death Wish"-style rampage that requires the New York of 2007 to be portrayed as a place where you're liable to witness a shooting every time you walk into a deli for a pack of gum. Nevertheless, she takes action, sometimes in self-defense but also by launching a pre-emptive, non-U.N.-sanctioned war against big-city thuggery. Behind her she leaves a trail of surprised-looking corpses, the audience cheering each one.
How can this be, since liberals renounce violence, even when directed against antlered pests or convicted serial killers, and greeted Mayor Rudy Giuliani's crackdown on crime with, at best, sullen silence? The movie lets its heroine off the hook by implying that victim status has left her without control of herself, a notion she articulates with more NPR-speak ("inside you there is a stranger, one that has your arms, your legs, your eyes--a sleepless, restless stranger").
This paper's movie critic, Joe Morgenstern, derided that element as "modern-day Jekyll and Hydeism," but it dovetails with two favorite liberal habits: to follow the psychological chain of causation behind a crime so far back that responsibility disappears in a blurry landscape of greater evil, and to maintain a fig-leaf of deniability for lawless actions.
Two of the most highly acclaimed films of recent years are "The Bourne Ultimatum" and "A History of Violence," which received respective approval ratings of 97% and 94% from prominent, or "cream of the crop," critics polled on the review aggregator Rotten Tomatoes ("Brokeback Mountain" managed only 90%). Both "Bourne" and "Violence" are built on Jekyll-and-Hydeism, with Hyde out for revenge against those who made him. National Review's Ross Douthat put it neatly, calling Jason Bourne, an amnesiac CIA assassin bent on destroying his trainers in the agency, "a John McClane that even Noam Chomsky can love . . . because Bourne himself resolves the great contradiction bedeviling the liberal action movie--namely, how do you make your hero a killing machine . . . without making him politically incorrect and illiberal along the way? The ingenious solution is to make Bourne an unwilling killing machine, a man whose body is a weapon that his amnesia-ridden mind doesn't understand, or even quite control." Scenes of a captive Bourne being hooded and waterboarded in intentionally dehumanizing training sessions also tickle one of the left's most accessible pleasure centers, the one marked Gitmo guilt.
Similarly, in "A History of Violence," the hero, a diner owner who calls himself Tom Stall, spends most of the movie seemingly unaware that he is actually a mob soldier named Joey Cusack. But when push comes to shoot, he is able to unearth his buried brutality and kill his gangster bosses. Critical acclaim for this formula mob drama centered on the guilt and malaise that suffuse its bloodbaths. "There's something undeniably exciting about Tom's heroic actions . . . but there is something irredeemable and soul-killing here, too," said New York Times critic Manohla Dargis, and the film hints that America itself is a kind of two-faced assassin. Kenneth Turan of the L.A. Times called it "a tightly controlled film about an out-of-control situation: the predilection for violence in America." The film's Canadian director, David Cronenberg, noted the adoring New York Times, "is taking aim at this country, to be sure." High praise indeed.
You are only a little more likely to see a clearly pro-American film at theaters today than you would have been to see a pro-Stalin one in the 1950s. One exception is the new thriller "The Kingdom," about a terrorist bombing in Saudi Arabia, but even in that one the American hero is obliged to say he knows his country isn't perfect.
Left-wing talking points pop up in genre films, too. The X-Men saga, for example, is a thinly veiled plea for greater acceptance of gays. And the otherwise unremarkable horror flick "The Hills Have Eyes 2"--in which a woefully underprepared National Guard unit wanders through the Southwestern desert while being jumped and mauled by bloodthirsty mutants (who are in turn a legacy of American nuclear testing in the region)--is an allegory for the Iraq War, and maybe the war in Afghanistan as well.
The makers of these films must be disappointed, though, that audiences remain more interested in crisp revenge than messy guilt. Steven Spielberg's "Munich," for instance, which chastises Israel for retaliating against the Palestinians involved in the 1972 Munich Olympics massacre, was completely misread by a character in this summer's hit comedy "Knocked Up," who was cheered by audiences when he said: "That movie was [star] Eric Bana kicking f--ing ass! In every movie with Jews, we're the ones getting killed. 'Munich' flips it on its ear. We're capping motherf--ers!" Americans made it clear which film they thought missed the point: In the U.S., "Knocked Up" earned more than three times as much at the box office as "Munich."
Labels: Anti-Americanism, Hypocrisy, Ignorance, Liberals, Media Bias
Democrats and Iran
Hillary outsmarts her dovish competition.
Wall Street Journal
Friday, September 28, 2007 12:01 a.m.
Kudos to Hillary Clinton--yes, you read that right--for her Senate vote this week urging the U.S. to designate Iran's Islamic Revolutionary Guards Corps as a foreign terrorist organization. That's more than can be said for her primary rivals Barack Obama, Chris Dodd, Bill Richardson and John Edwards, who assailed her on this score at Wednesday's Democratic Presidential candidates debate at Dartmouth. These are men who seem to fear the Netroots more than the mullahs.
Mrs. Clinton's vote was on a symbolic amendment offered by Connecticut maverick Joe Lieberman and Republicans Jon Kyl and Norm Coleman. After marshaling the evidence of Iran's terrorist activities in Iraq, the amendment stated that "it is a critical national interest of the United States to prevent [Iran] from turning Shi'a militia into a Hezbollah-like force that could serve its interests inside Iraq." Twenty-one Democrats, including Joe Biden and John Kerry, apparently found this too shocking to support and voted nay, as did Republicans Chuck Hagel and Dick Lugar.
We probably shouldn't complain when 76 Senators, including a majority of Democrats, show some foreign-policy sense. Still, it's telling that the Democrats only agreed to the amendment after demanding that its language be edited to remove a statement that "it should be the policy of the United States to stop inside Iraq the violent activities and destabilizing influence" of Iran. Also left on the cutting-room floor, under Democratic duress, was a call "to support the prudent and calibrated use of all instruments of United States national power in Iraq" with respect to Iran and its proxies.
The mullahs are supplying the shaped-explosive charges to Shiite militias that are killing or maiming Americans in Iraq. But these Senators are afraid even to suggest that the U.S. might use some kind of military force to save the lives of American soldiers. And they want to be Commander in Chief?
At Dartmouth, Mrs. Clinton defended her vote by noting that it "gives us the options to be able to impose sanctions on the primary leaders to try to begin to put some teeth into all this talk about dealing with Iran." That's right. With Americans having just had a Close Encounter of the Third Kind with Mahmoud Ahmadinejad, it's no surprise that her relative hawkishness is only widening her primary lead.
Why We're Winning Now in Iraq
Anbar's citizens needed protection before they would give their "hearts and minds."
By Frederick W. Kagan
Friday, September 28, 2007 12:01 a.m.
Many politicians and pundits in Washington have ignored perhaps the most important point made by Gen. David Petraeus in his recent congressional testimony: The defeat of al Qaeda in Iraq requires a combination of conventional forces, special forces and local forces. This realization has profound implications not only for American strategy in Iraq, but also for the future of the war on terror.
As Gen. Petraeus made clear, the adoption of a true counterinsurgency strategy in Iraq in January 2007 has led to unprecedented progress in the struggle against al Qaeda in Iraq, by protecting Sunni Arabs who reject the terrorists among them from the vicious retribution of those terrorists. In his address to the United Nations General Assembly Wednesday, Iraqi Prime Minister Nouri al-Maliki also touted the effectiveness of this strategy while at the same time warning of al Qaeda in Iraq's continued threat to his government and indeed the entire region.
Yet despite the undeniable successes the new strategy has achieved against al Qaeda in Iraq, many in Congress are still pushing to change the mission of U.S. forces back to a counterterrorism role relying on special forces and precision munitions to conduct targeted attacks on terrorist leaders. This change would bring us back to the traditional, consensus strategy for dealing with cellular terrorist groups like al Qaeda--a strategy that has consistently failed in Iraq.
Since the 9/11 terrorist attacks, the consensus of American strategists has been that the best way to fight a cellular terrorist organization like al Qaeda is through a combination of targeted strikes against key leaders and efforts to discredit al Qaeda's takfiri ideology in the Muslim community. Precision-guided munitions and special forces have been touted as the ideal weapons against this sort of group, because they require a minimal presence on the ground and therefore do not create the image of American invasion or occupation of a Muslim country.
A correlative assumption has often been that the visible presence of Western troops in Muslim lands creates more terrorists than it eliminates. The American attack on the Taliban in 2001 is often held up now--as it was at the time--as an exemplar of the right way to do things in this war: Small numbers of special forces worked with indigenous Afghan resistance fighters to defeat the Taliban and drive out al Qaeda without the infusion of large numbers of American ground forces. For many, Afghanistan is the virtuous war (contrasting with Iraq) not only because it was fought against the group that planned the 9/11 attacks, but also because it was fought in accord with accepted theories of fighting cellular terrorist organizations.
This strategy failed in Iraq for four years--skilled U.S. special-forces teams killed a succession of al Qaeda in Iraq leaders, but the organization was able to replace them faster than we could kill them. A counterterrorism strategy that did not secure the population from terrorist attacks led to consistent increases in terrorist violence and exposed Sunni leaders disenchanted with the terrorists to brutal death whenever they tried to resist. It emerged that "winning the hearts and minds" of the local population is not enough when the terrorists are able to torture and kill anyone who tries to stand up against them.
Despite an extremely aggressive counterterrorism campaign, by the end of 2006, al Qaeda in Iraq had heavily fortified strongholds equipped with media centers, torture chambers, weapons depots and training areas throughout Anbar province; in Baghdad; in Baqubah and other parts of Diyala province; in Arab Jabour and other villages south of Baghdad; and in various parts of Salah-ad-Din province north of the capital. Al Qaeda in Iraq was blending with the Sunni Arab insurgency in a relationship of mutual support. It was able to conduct scores of devastating, spectacular attacks against Shiite and other targets. Killing al Qaeda leaders in targeted raids had failed utterly either to prevent al Qaeda in Iraq from establishing safe havens throughout Iraq or to control the terrorist violence.
The Sunni Arabs in Iraq lost their enthusiasm for al Qaeda very quickly after their initial embrace of the movement. By 2005, currents of resistance had begun to flow in Anbar, expanding in 2006. Al Qaeda responded to this rising resistance with unspeakable brutality--beheading young children, executing Sunni leaders and preventing their bodies from being buried within the time required by Muslim law, torturing resisters by gouging out their eyes, electrocuting them, crushing their heads in vises, and so on. This brutality naturally inflamed the desire to resist in the Sunni Arab community--but actual resistance in 2006 remained fitful and ineffective. There was no power in Anbar or anywhere else that could protect the resisters against al Qaeda retribution, and so al Qaeda continued to maintain its position by force among a population that had initially welcomed it.
The proof? In all of 2006, there were only 1,000 volunteers to join the Iraqi Security Forces in Anbar, despite rising resentment against al Qaeda. Volunteering was kept down by al Qaeda attacks against ISF recruiting stations and targeted attacks on the families of volunteers. Although tribal leaders had begun to turn against the terrorists, American forces remained under siege in the provincial capital of Ramadi--they ultimately had to level all of the buildings around their headquarters to secure it from constant attack. An initial clearing operation conducted by Col. Sean MacFarland established forward positions in Ramadi with tremendous difficulty and at great cost, but the city was not cleared; attacks on American forces remained extremely high; and the terrorist safe havens in the province were largely intact.
This year has been a different story in Anbar, and elsewhere in Iraq. The influx of American forces in support of a counterinsurgency strategy--more than 4,000 went into Anbar--allowed U.S. commanders to take hold of the local resentment against al Qaeda by promising to protect those who resisted the terrorists. When American forces entered al Qaeda strongholds like Arab Jabour, the first question the locals asked was: Are you going to stay this time? They wanted to know if the U.S. would commit to protecting them against al Qaeda retribution. U.S. soldiers have done so, in Anbar, Baghdad, Baqubah, Arab Jabour and elsewhere. They have established joint security stations with Iraqi soldiers and police throughout urban areas and in villages. They have worked with former insurgents and local people to form "concerned citizens" groups to protect their own neighborhoods. Their presence among the people has generated confidence that al Qaeda will be defeated, resulting in increased information about the movements of al Qaeda operatives and local support for capturing or killing them.
The result was a dramatic turnabout in Anbar itself--in contrast to the 1,000 recruits of last year, there have already been more than 12,000 this year. Insurgent groups like the 1920s Revolution Brigades that had been fighting alongside al Qaeda in 2006 have fractured, with many coming over to fight with the coalition against the terrorists--more than 30,000 Iraq-wide, by some estimates. The tribal movement in Anbar both solidified and spread--there are now counter-al Qaeda movements throughout Central Iraq, including Diyala, Baghdad, Salah-ad-Din, Babil and Ninewah. Only recently an "awakening council" was formed in Mosul, Ninewah's capital, modeled on the Anbar pattern.
A targeted raid killed Abu Musab al-Zarqawi, founder of al Qaeda in Iraq, near Baqubah in June 2006. After that raid, al Qaeda's grip on Baqubah and throughout Diyala only grew stronger. But skillful clearing operations conducted by American forces, augmented by the surge, have driven al Qaeda out of Baqubah almost entirely. The "Baqubah Guardians" now protect that provincial capital against al Qaeda fighters who previously used it as a major base of operations. The old strategy of targeted raids failed in Diyala, as in Anbar and elsewhere throughout Iraq. The new strategy of protecting the population, in combination with targeted raids, has succeeded so well that al Qaeda in Iraq now holds no major urban sanctuary.
This turnabout coincided with an increase in American forces in Iraq and a change in their mission to securing the population. Not only were more American troops moving about the country, but they were much more visible as they established positions spread out among urban populations. According to all the principles of the consensus counterterrorism strategy, the effect of this surge should have been to generate more terrorists and more terrorism. Instead, it enabled the Iraqi people to throw off the terrorists whose ideas they had already rejected, confident that they would be protected from horrible reprisals. It proved that, at least in this case, conventional forces in significant numbers conducting a traditional counterinsurgency mission were absolutely essential to defeating this cellular terrorist group.
What lessons does this example hold for future fights in the war on terror? First, defeating al Qaeda in Iraq requires continuing an effective counterinsurgency strategy that involves American conventional forces helping Iraqi Security Forces to protect the population in conjunction with targeted strikes. Reverting to a strategy relying only on targeted raids will allow al Qaeda to re-establish itself in Iraq and begin once again to gain strength. In the longer term, we must fundamentally re-evaluate the consensus strategy for fighting the war on terror. Success against al Qaeda in Iraq obviously does not show that the solution to problems in Waziristan, Baluchistan or elsewhere lies in an American-led invasion. Each situation is unique, each al Qaeda franchise is unique, and responses must be tailored appropriately.
But one thing is clear from the Iraqi experience. It is not enough to persuade a Muslim population to reject al Qaeda's ideology and practice. Someone must also be willing and able to protect that population against the terrorists they had been harboring, something that special forces and long-range missiles alone can't do.
Wednesday, September 26, 2007
Six inconvenient truths about the U.S. and slavery
By Michael Medved
Wednesday, September 26, 2007
Those who want to discredit the United States and to deny our role as history’s most powerful and pre-eminent force for freedom, goodness and human dignity invariably focus on America’s bloody past as a slave-holding nation. Along with the displacement and mistreatment of Native Americans, the enslavement of literally millions of Africans counts as one of our two founding crimes—and an obvious rebuttal to any claims that this Republic truly represents “the land of the free and the home of the brave.” According to America-bashers at home and abroad, open-minded students of our history ought to feel more guilt than pride, and strive for “reparations” or other restitution to overcome the nation’s uniquely cruel, racist and rapacious legacy.
Unfortunately, the current mania for exaggerating America’s culpability for the horrors of slavery bears no more connection to reality than the old, discredited tendency to deny that the U.S. bore any blame at all. No, it’s not true that the “peculiar institution” featured kind-hearted, paternalistic masters and happy, dancing field-hands, any more than it’s true that America displayed unparalleled barbarity or enjoyed disproportionate benefit from kidnapping and exploiting innocent Africans.
An honest and balanced understanding of the position of slavery in the American experience requires a serious attempt to place the institution in historical context and to clear away some of the common myths and distortions.
1. SLAVERY WAS AN ANCIENT AND UNIVERSAL INSTITUTION, NOT A DISTINCTIVELY AMERICAN INNOVATION. At the time of the founding of the Republic in 1776, slavery existed literally everywhere on earth and had been an accepted aspect of human history from the very beginning of organized societies. Current thinking suggests that human beings took a crucial leap toward civilization about 10,000 years ago with the submission, training and domestication of important animal species (cows, sheep, swine, goats, chickens, horses and so forth) and, at the same time, began the “domestication,” bestialization and ownership of fellow human beings captured as prisoners in primitive wars. In ancient Greece, the great philosopher Aristotle described the ox as “the poor man’s slave” while Xenophon likened the teaching of slaves “to the training of wild animals.” Aristotle further opined that “it is clear that there are certain people who are free and certain who are slaves by nature, and it is both to their advantage, and just, for them to be slaves.” The Romans seized so many captives from Eastern Europe that the terms “Slav” and “slave” bore the same origins. All the great cultures of the ancient world, from Egypt to Babylonia, Athens to Rome, Persia to India to China, depended upon the brutal enslavement of the masses--often representing heavy majorities of the population. Contrary to the glamorization of aboriginal New World cultures, the Mayas, Aztecs and Incas counted among the most brutal slave-masters of them all--not only turning the members of other tribes into harshly abused beasts of burden but also using these conquered enemies to feed a limitless lust for human sacrifice. The Tupinamba, a powerful tribe on the coast of Brazil south of the Amazon, took huge numbers of captives, then humiliated them for months or years, before engaging in mass slaughter of their victims in ritualized cannibalistic feasts. In Africa, slavery also represented a timeless norm long before any intrusion by Europeans. Moreover, the Portuguese, Spanish, Dutch or British slave traders rarely penetrated far beyond the coasts: the actual capture and kidnapping of the millions of victims always occurred at the hands of neighboring tribes. As the great African-American historian Nathan Huggins pointed out, “virtually all of the enslavement of Africans was carried out by other Africans,” but the concept of an African “race” was the invention of Western colonists, and most African traders “saw themselves as selling people other than their own.” In the final analysis, Yale historian David Brion Davis notes in his definitive 2006 history “Inhuman Bondage: The Rise and Fall of Slavery in the New World” that “colonial North America…surprisingly received only 5 to 6 percent of the African slaves shipped across the Atlantic.” Meanwhile, the Arab slave trade (primarily from East Africa) lasted longer and enslaved more human beings than the European slavers working the other side of the continent. According to the best estimates, Islamic societies shipped between 12 and 17 million African slaves out of their homes in the course of a thousand years; the best estimate for the number of Africans enslaved by Europeans amounts to 11 million. In other words, when taking the prodigious and unspeakably cruel Islamic enslavements into the equation, at least 97% of all African men, women and children who were kidnapped, sold, and taken from their homes were sent somewhere other than the British colonies of North America.
In this context there is no historical basis to claim that the United States bears primary, or even prominent guilt for the depredations of centuries of African slavery.
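A quick check of the arithmetic behind that figure, using only the estimates cited above: the European trade carried roughly 11 million people, of whom 5 to 6 percent--roughly 550,000 to 660,000--landed in British North America. Adding even the low-end Islamic figure of 12 million brings the combined total to about 23 million, of which 660,000 is less than 3 percent; hence “at least 97%” sent elsewhere.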
2. SLAVERY EXISTED ONLY BRIEFLY, AND IN LIMITED LOCALES, IN THE HISTORY OF THE REPUBLIC--INVOLVING ONLY A TINY PERCENTAGE OF THE ANCESTORS OF TODAY’S AMERICANS. The Thirteenth Amendment to the Constitution put a formal end to the institution of slavery 89 years after the birth of the Republic; 142 years have passed since this welcome emancipation. Moreover, the importation of slaves came to an end in 1808 (as provided by the Constitution), a mere 32 years after independence, and slavery had been outlawed in most states decades before the Civil War. Even in the South, more than 80% of the white population never owned slaves. Given the fact that the majority of today’s non-black Americans descend from immigrants who arrived in this country after the War Between the States, only a tiny percentage of today’s white citizens--perhaps as few as 5%--bear any authentic sort of generational guilt for the exploitation of slave labor. Of course, a hundred years of Jim Crow laws, economic oppression and indefensible discrimination followed the theoretical emancipation of the slaves, but those harsh realities raise different issues from those connected to the long-ago history of bondage.
3. THOUGH BRUTAL, SLAVERY WASN’T GENOCIDAL: LIVE SLAVES WERE VALUABLE BUT DEAD CAPTIVES BROUGHT NO PROFIT. Historians agree that hundreds of thousands, and probably millions, of slaves perished over the course of 300 years during the rigors of the “Middle Passage” across the Atlantic Ocean. Estimates remain inevitably imprecise, but some run as high as one third of the slave “cargo” perishing from disease or overcrowding during transport from Africa. Perhaps the most horrifying aspect of these voyages involves the fact that no slave traders wanted to see this level of deadly suffering: they benefited only from delivering (and selling) live slaves, not from tossing corpses into the ocean. By definition, the crime of genocide requires the deliberate slaughter of a specific group of people; slavers invariably preferred oppressing and exploiting live Africans rather than murdering them en masse. Here, the popular, facile comparisons between slavery and the Holocaust quickly break down: the Nazis occasionally benefited from the slave labor of their victims, but the ultimate purpose of facilities like Auschwitz involved mass death, not profit or productivity. For slave owners and slave dealers in the New World, however, the death of your human property cost you money, just as the death of your domestic animals would cause financial damage. And as with their horses and cows, slave owners took pride and care in breeding as many new slaves as possible. Rather than eliminating the slave population, profit-oriented masters wanted to produce as many new, young slaves as they could. This hardly represents a compassionate or decent way to treat your fellow human beings, but it does amount to the very opposite of genocide. As David Brion Davis reports, slave holders in North America developed formidable expertise in keeping their “bondsmen” alive and healthy enough to produce abundant offspring. The British colonists took pride in slaves who “developed an almost unique and rapid rate of population growth, freeing the later United States from a need for further African imports.”
4. IT’S NOT TRUE THAT THE U.S. BECAME A WEALTHY NATION THROUGH THE ABUSE OF SLAVE LABOR: THE MOST PROSPEROUS STATES IN THE COUNTRY WERE THOSE THAT FIRST FREED THEIR SLAVES. Pennsylvania passed an emancipation law in 1780; Connecticut and Rhode Island followed four years later (all before the Constitution). New York approved emancipation in 1799. These states (with dynamic banking centers in Philadelphia and Manhattan) quickly emerged as robust centers of commerce and manufacturing, greatly enriching themselves while the slave-based economies in the South languished by comparison. At the time of the Constitution, Virginia constituted the most populous and wealthiest state in the Union, but by the time of the War Between the States the Old Dominion had fallen far behind a half-dozen northern states that had outlawed slavery two generations earlier. All analyses of Northern victory in the great sectional struggle highlight the vast advantages in terms of wealth and productivity in New England, the Mid-Atlantic States and the Midwest, compared to the relatively backward and impoverished states of the Confederacy. While a few elite families in the Old South undoubtedly based their formidable fortunes on the labor of slaves, the prevailing reality of the planter class involved chronic indebtedness and shaky finances long before the ultimate collapse of the evil system of bondage. The notion that America based its wealth and development on slave labor hardly comports with the obvious reality that in the two hundred years since the founding of the Republic, by far the poorest and least developed section of the nation was precisely that region where slavery once prevailed.
5. WHILE AMERICA DESERVES NO UNIQUE BLAME FOR THE EXISTENCE OF SLAVERY, THE UNITED STATES MERITS SPECIAL CREDIT FOR ITS RAPID ABOLITION. In the course of scarcely more than a century following the emergence of the American Republic, men of conscience, principle and unflagging energy succeeded in abolishing slavery not just in the New World but in all nations of the West. During three eventful generations, one of the most ancient, ubiquitous and unquestioned of all human institutions (considered utterly indispensable by the “enlightened” philosophers of Greece and Rome) became universally discredited and finally illegal--with Brazil at last liberating all its slaves in 1888. This worldwide mass movement (spearheaded in Britain and elsewhere by fervent Evangelical Christians) brought about the most rapid and fundamental transformation in all human history. While the United States (and the British colonies that preceded our independence) played no prominent role in creating the institution of slavery, or even in establishing the long-standing African slave trade pioneered by Arab, Portuguese, Spanish, Dutch and other merchants long before the settlement of English North America, Americans did contribute mightily to the spectacularly successful anti-slavery agitation. As early as 1646, the Puritan founders of New England expressed their revulsion at the enslavement of their fellow children of God. When magistrates in Massachusetts discovered that some of their citizens had raided an African village and violently seized two natives to bring them across the Atlantic for sale in the New World, the General Court condemned “this haynos and crying sinn of man-stealing.” The officials promptly ordered the two blacks returned to their native land. Two years later, Rhode Island passed legislation denouncing the practice of enslaving Africans for life and ordered that any slaves “brought within the liberties of this Collonie” be set free after ten years “as the manner is with the English servants.” A hundred and thirty years later, John Adams and Benjamin Franklin both spent most of their lives as committed activists in the abolitionist cause, and Thomas Jefferson included a bitter condemnation of slavery in his original draft of the Declaration of Independence. This remarkable passage saw African bondage as “cruel war against human nature itself, violating its most sacred rights of life & liberty” and described “a market where MEN should be bought and sold” as constituting “piratical warfare” and “execrable commerce.” Unfortunately, the Continental Congress removed this prescient, powerful denunciation in order to win approval from Jefferson’s fellow slave-owners, but the impact of the Declaration and the American Revolution remained a powerful factor in energizing and inspiring the international anti-slavery cause. Nowhere did idealists pay a higher price for liberation than they did in the United States of America. Confederate forces (very few of whom ever owned slaves) may not have fought consciously to defend the Peculiar Institution, but Union soldiers and sailors (particularly at the end of the war) proudly risked their lives for the emancipation cause. Julia Ward Howe’s powerful and popular “Battle Hymn of the Republic” called on Federal troops to follow Christ’s example: “as he died to make men holy/let us die to make men free.” And many of them did die, some 364,000 in four years of combat--or the stunning equivalent of five million deaths as a percentage of today’s United States population.
Moreover, the economic cost of liberation remained almost unimaginable. In nearly all other nations, the government paid some form of compensation to slave-owners at the time of emancipation, but Southern slave-owners received no reimbursement of any kind when they lost an estimated $3.5 billion in 1860 dollars (about $70 billion in today’s dollars) of what Davis describes as a “hitherto legally accepted form of property.” The most notable aspect of America’s history with slavery doesn’t involve its tortured and bloody existence, but the unprecedented speed and determination with which abolitionists roused the national conscience and put this age-old evil to an end.
6. THERE IS NO REASON TO BELIEVE THAT TODAY’S AFRICAN-AMERICANS WOULD BE BETTER OFF IF THEIR ANCESTORS HAD REMAINED BEHIND IN AFRICA. The idea of reparations rests on the notion of making up to the descendants of slaves for the incalculable damage done to their family status and welfare by the enslavement of generations of their ancestors. In theory, reparationists want society to repair the wrongs of the past by putting today’s African-Americans into the sort of situation they would have enjoyed if their forebears hadn’t been kidnapped, sold and transported across the ocean. Unfortunately, to bring American blacks in line with their cousins whom the slave traders left behind in Africa would require a drastic reduction in their wealth, living standards, and economic and political opportunities. No honest observer can deny or dismiss this nation’s long record of racism and injustice, but it’s also obvious that Americans of African descent enjoy vastly greater wealth and human rights of every variety than the citizens of any nation of the Mother Continent. If we sought to erase the impact of slavery on specific black families, we would need to obliterate the spectacular economic progress made by those families (and by US citizens in general) over the last 100 years. In view of the last century of history in Nigeria or Ivory Coast or Sierra Leone or Zimbabwe, could any African-American say with confidence that he or she would have fared better had some distant ancestor not been enslaved? Of course, those who seek reparations would also cite the devastating impact of Western colonialism in stunting African progress, but the United States played virtually no role in the colonization of the continent. The British, French, Italians, Portuguese, Germans and others all established brutal colonial rule in Africa; tiny Belgium became a particularly oppressive and bloodthirsty colonial power in the Congo. The United States, on the other hand, sponsored only one long-term venture on the African continent: the colony of Liberia, an independent nation set up as a haven for liberated American slaves who wanted to go “home.” The fact that so few availed themselves of the opportunity, or heeded the back-to-Africa exhortations of turn-of-the-century Black Nationalist Marcus Garvey, reflects the reality that descendants of slaves understood they were better off remaining in the United States, for all its faults.
In short, politically correct assumptions about America’s entanglement with slavery lack any sense of depth, perspective or context. As with so many other persistent lies about this fortunate land, the unthinking indictment of the United States as uniquely blameworthy for an evil institution ignores the fact that the record of previous generations provides some basis for pride as well as guilt.
Wednesday, September 26, 2007
Those who want to discredit the United States and to deny our role as history’s most powerful and pre-eminent force for freedom, goodness and human dignity invariably focus on America’s bloody past as a slave-holding nation. Along with the displacement and mistreatment of Native Americans, the enslavement of literally millions of Africans counts as one of our two founding crimes—and an obvious rebuttal to any claims that this Republic truly represents “the land of the free and the home of the brave.” According to America-bashers at home and abroad, open-minded students of our history ought to feel more guilt than pride, and strive for “reparations” or other restitution to overcome the nation’s uniquely cruel, racist and rapacious legacy.
Unfortunately, the current mania for exaggerating America’s culpability for the horrors of slavery bears no more connection to reality than the old, discredited tendency to deny that the U.S. bore any blame at all. No, it’s not true that the “peculiar institution” featured kind-hearted, paternalistic masters and happy, dancing field-hands, any more than it’s true that America displayed unparalleled barbarity or enjoyed disproportionate benefit from kidnapping and exploiting innocent Africans.
An honest and balanced understanding of the position of slavery in the American experience requires a serious attempt to place the institution in historical context and to clear-away some of the common myths and distortions.
1. SLAVERY WAS AN ANCIENT AND UNIVERSAL INSTITUTION, NOT A DISTINCTIVELY AMERICAN INNOVATION. At the time of the founding of the Republic in 1776, slavery existed literally everywhere on earth and had been an accepted aspect of human history from the very beginning of organized societies. Current thinking suggests that human beings took a crucial leap toward civilization about 10,000 years ago with the submission, training and domestication of important animal species (cows, sheep, swine, goats, chickens, horses and so forth) and, at the same time, began the “domestication,” bestialization and ownership of fellow human beings captured as prisoners in primitive wars. In ancient Greece, the great philosopher Aristotle described the ox as “the poor man’s slave” while Xenophon likened the teaching of slaves “to the training of wild animals.” Aristotle further opined that “it is clear that there are certain people who are free and certain who are slaves by nature, and it is both to their advantage, and just, for them to be slaves.” The Romans seized so many captives from Eastern Europe that the terms “Slav” and “slave” bore the same origins. All the great cultures of the ancient world, from Egypt to Babylonia, Athens to Rome, Persia to India to China, depended upon the brutal enslavement of the masses – often representing heavy majorities of the population. Contrary to the glamorization of aboriginal New World cultures, the Mayas, Aztecs and Incas counted among the most brutal slave-masters of them all --- not only turning the members of other tribes into harshly abused beasts of burden but also using these conquered enemies to feed a limitless lust for human sacrifice. The Tupinamba, a powerful tribe on the coast of Brazil south of the Amazon, took huge numbers of captives, then humiliated them for months or years, before engaging in mass slaughter of their victims in ritualized cannibalistic feasts. In Africa, slavery also represented a timeless norm long before any intrusion by Europeans. Moreover, the Portuguese, Spanish, Dutch or British slave traders rarely penetrated far beyond the coasts: the actual capture and kidnapping of the millions of victims always occurred at the hands of neighboring tribes. As the great African-American historian Nathan Huggins pointed out, “virtually all of the enslavement of Africans was carried out by other Africans” but the concept of an African “race” was the invention of Western colonists, and most African traders “saw themselves as selling people other than their own.” In the final analysis, Yale historian David Brion Davis in his definitive 2006 history “Inhuman Bondage: The Rise and Fall of Slavery in the New World” notes that “colonial North America…surprisingly received only 5 to 6 percent of the African slaves shipped across the Atlantic.” Meanwhile, the Arab slave trade (primarily from East Africa) lasted longer and enslaved more human beings than the European slavers working the other side of the continent. According to the best estimates, Islamic societies shipped between 12 and 17 million African slaves out of their homes in the course of a thousand years; the best estimate for the number of Africans enslaved by Europeans amounts to 11 million. In other words, when taking the prodigious and unspeakably cruel Islamic enslavements into the equation, at least 97% of all African men, women and children who were kidnapped, sold, and taken from their homes, were sent somewhere other than the British colonies of North America. 
In this context there is no historical basis to claim that the United States bears primary, or even prominent, guilt for the depredations of centuries of African slavery.
2. SLAVERY EXISTED ONLY BRIEFLY, AND IN LIMITED LOCALES, IN THE HISTORY OF THE REPUBLIC -- INVOLVING ONLY A TINY PERCENTAGE OF THE ANCESTORS OF TODAY'S AMERICANS. The Thirteenth Amendment to the Constitution put a formal end to the institution of slavery 89 years after the birth of the Republic; 142 years have passed since this welcome emancipation. Moreover, the importation of slaves came to an end in 1808 (as provided by the Constitution), a mere 32 years after independence, and slavery had been outlawed in most states decades before the Civil War. Even in the South, more than 80% of the white population never owned slaves. Given that the majority of today's non-black Americans descend from immigrants who arrived in this country after the War Between the States, only a tiny percentage of today's white citizens -- perhaps as few as 5% -- bear any authentic sort of generational guilt for the exploitation of slave labor. Of course, a hundred years of Jim Crow laws, economic oppression and indefensible discrimination followed the theoretical emancipation of the slaves, but those harsh realities raise different issues from those connected to the long-ago history of bondage.
3. THOUGH BRUTAL, SLAVERY WASN'T GENOCIDAL: LIVE SLAVES WERE VALUABLE, BUT DEAD CAPTIVES BROUGHT NO PROFIT. Historians agree that hundreds of thousands, and probably millions, of slaves perished over the course of 300 years during the rigors of the "Middle Passage" across the Atlantic Ocean. Estimates remain inevitably imprecise, but they range as high as one third of the slave "cargo" perishing from disease or overcrowding during transport from Africa. Perhaps the most horrifying aspect of these voyages is that no slave trader wanted to see this level of deadly suffering: traders benefited only from delivering (and selling) live slaves, not from tossing corpses into the ocean. By definition, the crime of genocide requires the deliberate slaughter of a specific group of people; slavers invariably preferred oppressing and exploiting live Africans to murdering them en masse. Here, the popular, facile comparisons between slavery and the Holocaust quickly break down: the Nazis occasionally benefited from the slave labor of their victims, but the ultimate purpose of facilities like Auschwitz involved mass death, not profit or productivity. For slave owners and slave dealers in the New World, however, the death of human property cost money, just as the death of domestic animals would cause financial damage. And as with their horses and cows, slave owners took pride and care in breeding as many new slaves as possible. Rather than eliminating the slave population, profit-oriented masters wanted to produce as many new, young slaves as they could. This hardly represents a compassionate or decent way to treat your fellow human beings, but it amounts to the very opposite of genocide. As David Brion Davis reports, slaveholders in North America developed formidable expertise in keeping their "bondsmen" alive and healthy enough to produce abundant offspring. The British colonists took pride in slaves who "developed an almost unique and rapid rate of population growth, freeing the later United States from a need for further African imports."
4. IT'S NOT TRUE THAT THE U.S. BECAME A WEALTHY NATION THROUGH THE ABUSE OF SLAVE LABOR: THE MOST PROSPEROUS STATES IN THE COUNTRY WERE THOSE THAT FIRST FREED THEIR SLAVES. Pennsylvania passed an emancipation law in 1780; Connecticut and Rhode Island followed four years later (all before the Constitution). New York approved emancipation in 1799. These states (with dynamic banking centers in Philadelphia and Manhattan) quickly emerged as robust centers of commerce and manufacturing, greatly enriching themselves while the slave-based economies in the South languished by comparison. At the time of the Constitution, Virginia was the most populous and wealthiest state in the Union, but by the time of the War Between the States the Old Dominion had fallen far behind a half-dozen Northern states that had outlawed slavery two generations earlier. Every analysis of Northern victory in the great sectional struggle highlights the vast advantages in wealth and productivity of New England, the Mid-Atlantic states and the Midwest over the relatively backward and impoverished states of the Confederacy. While a few elite families in the Old South undoubtedly based their formidable fortunes on the labor of slaves, the prevailing reality of the planter class involved chronic indebtedness and shaky finances long before the ultimate collapse of the evil system of bondage. The notion that America based its wealth and development on slave labor hardly comports with the obvious reality that, for the two hundred years since the founding of the Republic, by far the poorest and least developed section of the nation has been precisely that region where slavery once prevailed.
5. WHILE AMERICA DESERVES NO UNIQUE BLAME FOR THE EXISTENCE OF SLAVERY, THE UNITED STATES MERITS SPECIAL CREDIT FOR ITS RAPID ABOLITION. In the course of scarcely more than a century following the emergence of the American Republic, men of conscience, principle and unflagging energy succeeded in abolishing slavery not just in the New World but in all nations of the West. During three eventful generations, one of the most ancient, ubiquitous and unquestioned of all human institutions (considered utterly indispensable by the "enlightened" philosophers of Greece and Rome) became universally discredited and finally illegal, with Brazil at last liberating all its slaves in 1888. This worldwide mass movement (spearheaded in Britain and elsewhere by fervent Evangelical Christians) brought about the most rapid and fundamental transformation in all human history. While the United States (and the British colonies that preceded our independence) played no prominent role in creating the institution of slavery, or even in establishing the long-standing African slave trade pioneered by Arab, Portuguese, Spanish, Dutch and other merchants long before the settlement of English North America, Americans did contribute mightily to the spectacularly successful anti-slavery agitation.
As early as 1646, the Puritan founders of New England expressed their revulsion at the enslavement of their fellow children of God. When magistrates in Massachusetts discovered that some of their citizens had raided an African village and violently seized two natives to bring them across the Atlantic for sale in the New World, the General Court condemned "this haynos and crying sinn of man-stealing." The officials promptly ordered the two blacks returned to their native land. Two years later, Rhode Island passed legislation denouncing the practice of enslaving Africans for life and ordered that any slaves "brought within the liberties of this Collonie" be set free after ten years, "as the manner is with the English servants." A hundred and thirty years later, John Adams and Benjamin Franklin both spent most of their lives as committed activists in the abolitionist cause, and Thomas Jefferson included a bitter condemnation of slavery in his original draft of the Declaration of Independence. This remarkable passage saw African bondage as "cruel war against human nature itself, violating its most sacred rights of life & liberty" and described "a market where MEN should be bought and sold" as constituting "piratical warfare" and "execrable commerce." Unfortunately, the Continental Congress removed this prescient, powerful denunciation in order to win approval from Jefferson's fellow slave-owners, but the impact of the Declaration and the American Revolution remained a powerful factor in energizing and inspiring the international anti-slavery cause.
Nowhere did idealists pay a higher price for liberation than in the United States of America. Confederate forces (very few of whom ever owned slaves) may not have fought consciously to defend the Peculiar Institution, but Union soldiers and sailors (particularly at the end of the war) proudly risked their lives for the emancipation cause. Julia Ward Howe's powerful and popular "Battle Hymn of the Republic" called on Federal troops to follow Christ's example: "as he died to make men holy/let us die to make men free." And many of them did die, some 364,000 in four years of combat -- the stunning equivalent, as a share of today's United States population, of five million deaths.
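That closing equivalence deserves a word of arithmetic. It evidently scales Union deaths against the population of the wartime North rather than of the whole country; taking the North at roughly 22 million people and the 2007 United States at about 300 million (round figures supplied here for illustration, not drawn from the column):
\[
364{,}000 \times \frac{300\ \text{million}}{22\ \text{million}} \approx 5\ \text{million}
\]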
Moreover, the economic cost of liberation was almost unimaginable. In nearly all other nations, the government paid some form of compensation to slave-owners at the time of emancipation, but Southern slave-owners received no reimbursement of any kind when they lost an estimated $3.5 billion in 1860 dollars (about $70 billion in today's dollars) of what Davis describes as a "hitherto legally accepted form of property." The most notable aspect of America's history with slavery isn't the institution's tortured and bloody existence but the unprecedented speed and determination with which abolitionists roused the national conscience and put this age-old evil to an end.
6. THERE IS NO REASON TO BELIEVE THAT TODAY'S AFRICAN-AMERICANS WOULD BE BETTER OFF IF THEIR ANCESTORS HAD REMAINED BEHIND IN AFRICA. The idea of reparations rests on the notion of making up to the descendants of slaves for the incalculable damage done to their family status and welfare by the enslavement of generations of their ancestors. In theory, reparationists want society to repair the wrongs of the past by putting today's African-Americans into the sort of situation they would have enjoyed if their forebears hadn't been kidnapped, sold and transported across the ocean. Unfortunately, bringing American blacks in line with their cousins whom the slave-traders left behind in Africa would require a drastic reduction in their wealth, living standards, and economic and political opportunities. No honest observer can deny or dismiss this nation's long record of racism and injustice, but it's also obvious that Americans of African descent enjoy vastly greater wealth and human rights of every variety than the citizens of any nation of the Mother Continent. If we sought to erase the impact of slavery on specific black families, we would need to obliterate the spectacular economic progress made by those families (and by US citizens in general) over the last 100 years. In view of the last century of history in Nigeria or Ivory Coast or Sierra Leone or Zimbabwe, could any African-American say with confidence that he or she would have fared better had some distant ancestor not been enslaved? Of course, those who seek reparations would also cite the devastating impact of Western colonialism in stunting African progress, but the United States played virtually no role in the colonization of the continent. The British, French, Italians, Portuguese, Germans and others all established brutal colonial rule in Africa; tiny Belgium became a particularly oppressive and bloodthirsty colonial power in the Congo. The United States, on the other hand, sponsored only one long-term venture on the African continent: the colony of Liberia, an independent nation set up as a haven for liberated American slaves who wanted to go "home." The fact that so few availed themselves of the opportunity, or heeded the back-to-Africa exhortations of turn-of-the-century Black Nationalist Marcus Garvey, reflects the reality that descendants of slaves understood they were better off remaining in the United States, for all its faults.
In short, politically correct assumptions about America’s entanglement with slavery lack any sense of depth, perspective or context. As with so many other persistent lies about this fortunate land, the unthinking indictment of the United States as uniquely blameworthy for an evil institution ignores the fact that the record of previous generations provides some basis for pride as well as guilt.
Mahmoudapalooza: The Good, the Bad and the Craven
By Michelle Malkin
Wednesday, September 26, 2007
When my children are grown, I can tell them where I was when bloodthirsty Iranian thug-in-chief Mahmoud Ahmadinejad dared to disgrace Columbia University with his presence. I was standing with Jewish leaders, Iranian-American dissidents, World War II veterans and other concerned citizens, young and old, taking a stand against evil outside the campus gates.
Banafsheh Zand-Bonazzi, an Iranian-born activist whose dissident journalist father is jailed in her homeland, was appalled at the ignorance and moral equivalence of the leftists who paraded in front of the TV cameras with their Bush-is-a-terrorist paraphernalia. A few goons held a large banner that read: "Ahmadinejad is bad. Bush is worse."
"It's not always about Bush," Zand-Bonazzi exclaimed after schooling the Ahmadinejad apologists and pointing out fellow Iranian protesters holding signs memorializing persecuted and executed countrymen. The ANSWER mobsters, she fumed, "have their history wrong. They don't see the greater threat. They don't get it."
Rabbi Avi Weiss, an Orthodox Jewish leader from the Bronx, gets it. Standing amid a small but sturdy sea of "Hitler lives" and "Never forget" placards, Rabbi Weiss told me: "The First Amendment means you have the right to invite in the arch-terrorists of the world. It doesn't mean that you are obligated to do so -- especially when this whole visit was initiated by the Iranian mission, and Iranian missions around the world are known to have fomented and orchestrated in the communities where they are." Instead of being feted, Rabbi Weiss said, "this man, who is responsible for contributing to the killing of American troops in Iraq, should be served with papers and hauled into court."
Several anti-Ahmadinejad protesters expressed disappointment that a larger crowd had not turned out in New York City. I concur. Ahmadinejad's nuclear ambitions, Mahdi devotion, Jew hatred, Holocaust denial, human rights repression and American troop-murdering machinery threaten us all. Not just Jews. Not just persecuted Persian activists. Not just military families.
Immediately before landing in the Big Apple, the Iranian madman was grandmaster of a military parade in Tehran punctuated with "Death to America" and "Death to Israel" posters. Newsflash: It's not an either/or death wish.
Lost in the debate over the Columbia "debate" are the jumbo-sized jihadi dots connecting Iran to global Islamic terrorism, including 9/11. The 9/11 Commission Report stated in a section on Iran and the 1996 Khobar Towers bombing that "the evidence of Iranian involvement is strong."
On Iran and al Qaeda partnerships, the report concluded, "there is strong evidence that Iran facilitated the transit of al Qaeda members into and out of Afghanistan before 9/11, and that some of these were future 9/11 hijackers. There also is circumstantial evidence that senior Hezbollah operatives were closely tracking the travel of some of these future muscle hijackers into Iran in November 2000."
The report said of Iran training al Qaeda that "In late 1991 or 1992, discussions in Sudan between al Qaeda and Iranian operatives led to an informal agreement to cooperate in providing support -- even if only training -- for actions carried out primarily against Israel and the United States. Not long afterward, senior al Qaeda operatives and trainers traveled to Iran to receive training in explosives . . . The relationship between al Qaeda and Iran demonstrated that Sunni-Shia divisions did not necessarily pose an insurmountable barrier to cooperation in terrorist operations."
You won't be surprised, then, to learn that the weekend before Mahmoud arrived at Columbia, foreign ministers of Iran and Saudi Arabia met to "stress the need for unity among world Muslims, and called for vigilance in the face of plots hatched by enemies to sow discord among the Shiite and Sunnite Muslims." No, it didn't come up in the "debate."
On my train ride home from Mahmoudapalooza, I spoke briefly with a Columbia University grad steeped in the Ivy League haze of non-judgment. She was upset and embarrassed -- not by Columbia president Lee Bollinger's bone-headed decision to legitimize Ahmadinejad at its World Leaders Forum. No, she was mortified that Bollinger had delivered his face-saving introduction challenging Ahmadinejad.
With childlike naivete, this Columbia alum told me: "I'm frightened by the polarity." Which about sums up the majority view of academia and the Ahmadinejad excusers on the left: They are more afraid of standing up and calling out evil than losing the West, their country and their own lives to it.
Global Warming Hysteria
By Walter E. Williams
Wednesday, September 26, 2007
Despite increasing evidence that man-made CO2 is not a significant greenhouse gas or a significant contributor to climate change, politicians and others who wish to control our lives must maintain that it is.
According to the Detroit Free Press, Rep. John Dingell wants a 50-cents-a-gallon tax on gasoline. We've heard such calls before, but there's a new twist. Dingell also wants to eliminate the mortgage tax deduction on what he calls "McMansions," homes that are 3,000 square feet and larger. That's because larger homes use more energy.
One might wonder about Dingell's magnanimity in raising taxes only on homes of 3,000 square feet or larger. The average U.S. home is around 2,300 square feet, compared with Europe's average of 1,000 square feet. So why doesn't Dingell call for disallowing mortgage deductions on houses of more than 1,000 square feet? The reason is that there would be too much political resistance, since more Americans own homes under 3,000 square feet than over 3,000. The full agenda is to start out with 3,000 square feet and later lower the threshold in increments.
Our buying into global warming hysteria will allow politicians to do just about anything upon which they can muster a majority vote in the name of fighting climate change, not least raising taxes.
In addition to excuses to raise taxes, congressmen are using climate change hysteria to funnel money into their districts. Rep. David L. Hobson, R-Ohio, secured $500,000 for a geothermal demonstration project. Rep. Adam B. Schiff, D-Calif., got $500,000 for a fuel-cell project by Superprotonic, a Pasadena company started by Caltech scientists. Money for similar boondoggles is being called for by members of both parties.
There are many ways to reduce CO2 emissions, and being 71 years of age I know many of them. Al Gore might even consider me carbon neutral and possibly having carbon credits because my carbon offsets were made in advance. For example, for the first 15 years of my life, I didn't use energy-consuming refrigerators; we had an icebox. For two decades I listened to radio instead of watching television and walked or used public transportation to most places. And for more than half my life I didn't use energy-consuming things such as computers, clothes dryers, air conditioning and microwave ovens. Of course, my standard of living was much lower.
The bottom line is, serious efforts to reduce CO2 will lead to lower living standards through higher costs of living. And it will be all for naught because there is little or no relationship between man-made CO2 emissions and climate change.
There's an excellent booklet available from the National Center for Policy Analysis (ncpa.org) titled "A Global Warming Primer." Some of its highlights are:
"Over long periods of time, there is no close relationship between CO2 levels and temperature."
"Humans contribute approximately 3.4 percent of annual CO2 levels" compared to 96.6 percent by nature.
"There was an explosion of life forms 550 million years ago (Cambrian Period) when CO2 levels were 18 times higher than today. During the Jurassic Period, when dinosaurs roamed the Earth, CO2 levels were as much as nine times higher than today."
What about public school teachers frightening little children with tales of cute polar bears dying because of global warming? The primer says, "Polar bear numbers increased dramatically from around 5,000 in 1950 to as many as 25,000 today, higher than any time in the 20th century." The primer gives detailed sources for all of its findings, and it supplies us with information we can use to stop politicians and their environmental extremists from doing a rope-a-dope on us.
Columbia's Conceit
Exactly what would it have accomplished to "engage in a debate" with Hitler?
By Bret Stephens
Tuesday, September 25, 2007 12:01 a.m.
On Saturday John Coatsworth, acting dean of Columbia University's School of International and Public Affairs, made the remark that "if Hitler were in the United States and . . . if he were willing to engage in a debate and a discussion to be challenged by Columbia students and faculty, we would certainly invite him." This was by way of defending the university's decision to host a speech yesterday by Iranian President Mahmoud Ahmadinejad.
An old rule of thumb in debate tournaments is that the first one to say "Hitler" loses. But say what you will about Mr. Coatsworth's comment, it is, at bottom, a philosophical claim: about the purposes of education; about the uses of dialogue; about the obligations of academia; about the boundaries (or absence of boundaries) of modern liberalism and about its conceits. So rather than dismiss the claim out of hand, let's address it in the same philosophical spirit in which it was offered.
A few preliminaries: When Mr. Coatsworth postulated Hitler's visit, he specified the year 1939, just prior to Germany's invasion of Poland and the beginning of World War II. This, then, is not yet the Hitler of Auschwitz, though it is the Hitler of Dachau, the Nuremberg Laws, Guernica and Kristallnacht. Mr. Coatsworth takes the optimistic view that "an appearance by Hitler at Columbia could have led him to appreciate what a great power the U.S. had already become," and thus, presumably, kept America from war.
Less clear is whether Mr. Coatsworth issued his invitation in the name of Columbia's current faculty or on behalf of the faculty of the 1930s or '40s. We'll assume the answer is the current faculty, since it's unlikely that a committee led by Jacques Barzun, Mark van Doren, Lionel Trilling or other Columbia luminaries of the day would have had much use for "discussion" with the Führer (though it seems Columbia hosted a speech by Hans Luther, Hitler's U.S. ambassador, in 1933).
What, then, would be the purpose of such an invitation? Columbia's president, Lee Bollinger, offered a clue in a statement issued last week: "Columbia, as a community dedicated to learning and scholarship, is committed to confronting ideas--to understand the world as it is and as it might be," he said. "Necessarily, on occasion this will bring us into contact with beliefs many, most or even all of us will find offensive and even odious. We trust our community, including our students, to be fully capable of dealing with these occasions, through dialogue and reason."
That's an interesting thought, coming from a man who won't countenance an ROTC program on campus. But leave that aside. What's more important is the question of how Columbia defines the set of ideas it believes are worth "confronting," whether its confidence in "dialogue and reason" is well placed and, finally, whether confronting ideas is a sufficient condition for understanding the world.
In a March 1952 essay in Commentary magazine on "George Orwell and the Politics of Truth," Trilling observed that "the gist of Orwell's criticism of the liberal intelligentsia was that they refused to understand the conditioned way of life." Orwell, he wrote, really knew what it was like to live under a totalitarian regime--unlike, say, George Bernard Shaw, who had "insisted upon remaining sublimely unaware of the Russian actuality," or H.G. Wells, who had "pooh-poohed the threat of Hitler." By contrast, Orwell "had the simple courage to point out that the pacifists preached their doctrine under condition of the protection of the British navy, and that, against Germany and Russia, Gandhi's passive resistance would have been to no avail."
Trilling took the point a step further, assailing the intelligentsia's habit of treating politics as a "nightmare abstraction" and "pointing to the fearfulness of the nightmare as evidence of their sense of reality." To put this in the context of Mr. Coatsworth's hypothetical, Trilling might have said that in hosting and perhaps debating Hitler, Columbia's faculty and students would not have been "confronting" him, much as they might have gulled themselves into believing they were. Hitler at Columbia would merely have been a man at a podium, offering his "ideas" on this or that, and not the master of a huge terror apparatus bearing down on you. To suggest that such an event amounts to a confrontation, or offers a perspective on reality, is a bit like suggesting that one "confronts" a wild animal by staring at it through its cage at a zoo.
There is also the question of just what ideas would be presented by Hitler at Mr. Coatsworth's hypothetical conference, and whether they would be an accurate reflection of his beliefs and intentions. In his 1933 speech, Ambassador Luther made the case for Hitler's "peaceful intentions" in Europe, according to historian Rafael Medoff. Millions of Europeans believed this right up to September 1939, just as millions of Americans did right up to December 1941.
Let's assume, however, that Hitler had used the occasion of his speech not just to dissimulate but to really air his mind, to give vent not just to Germany's historical grievances but to his own apocalyptic ambitions. In "Terror and Liberalism" (2003), Columbia alumnus Paul Berman observes the way in which prewar French socialists--keenly aware and totally opposed to Hitler's platform--nonetheless took the view that Germany had to be accommodated and that the real threat to peace came from their own "warmongers and arms manufacturers." This notion, Mr. Berman writes, rested in turn on a philosophical belief that "even the enemies of reason cannot be the enemies of reason. Even the unreasonable must be, in some fashion, reasonable."
So there is Adolf Hitler on our imagined stage, ranting about the soon-to-be-fulfilled destiny of the Aryan race. And his audience of outstanding Columbia men are mostly appalled, as they should be. But they are also engrossed, and curious, and if it occurs to some of them that the man should be arrested on the spot they don't say it. Nor do they ask, "How will we come to terms with his world?" Instead, they wonder how to make him see "reason," as reasonable people do.
In just a few years, some of these men will be rushing a beach at Normandy or caught in a firefight in the Ardennes. And the fact that their ideas were finer and better than Hitler's will have done nothing to keep them and millions of their countrymen from harm, and nothing to get them out of its way.
Intolerance in the name of tolerance
By Cal Thomas
Tuesday, September 25, 2007
I would not be as bothered by Columbia University's decision to host Iran's President Mahmoud Ahmadinejad if Columbia and other universities had a consistent policy toward those they invite to speak and the rules applied equally to conservatives and liberals; to totalitarian dictators and to advocates for freedom and tolerance.
Any conservative who has ever tried to speak, or actually succeeded in speaking, on the campus of a predominantly liberal academic institution knows it can resemble, to some extent, the struggle experienced by African Americans when they attempted to desegregate lunch counters in the South during the Civil Rights Movement.
In the 1980s, I spoke at universities from Smith College in the East to the University of California at Davis in the West. At Smith, lesbians sat in the front row kissing each other while the rest of the crowd shouted so loud no one could hear me (NPR's Nina Totenberg witnessed the riotous behavior, prompting me to remark, "I hope you're getting this on tape, Nina, because this is what liberals mean by tolerance.").
Former U.S. News and World Report columnist John Leo has been among the chroniclers of the demise of free speech on many college campuses. Writing in last winter's issue of the publication City Journal, Leo noted that Columbia University officials prevented a large crowd from hearing Walid Shoebat, a former PLO terrorist who is now an anti-jihadist. The reason given was security, which as Leo pointed out is a frequent excuse for restricting speech. Had Shoebat remained a PLO terrorist, Columbia might have allowed the students in, because anti-Jewish rhetoric of the kind Ahmadinejad delivers always seems welcome on too many campuses. Only Columbia students and 20 guests were allowed to hear Shoebat speak.
Why would Columbia expect Ahmadinejad to answer what it promised in advance would be "tough" questions? Have they not seen him interviewed by America's best reporters? He doesn't answer questions. He uses the interviews to lecture America and make his propaganda points. The exercise is useless, except to him, because he scores points at home for "standing up to 'the Great Satan,'" or whatever the preferred term du jour for the United States is at the moment.
Last October at Columbia, a mob of students stormed a stage, curtailing speeches by two members of the anti-illegal immigration group known as the Minutemen. The students shouted "They have no right to speak," which was revealing, given the "academic freedom" argument that is used to defend liberal professors and their frequent anti-American rants when conservatives attempt to shut them up.
As John Leo wrote, "Campus opponents of (Rep.) Tom Tancredo, an illegal immigration foe, set off fire alarms at Georgetown to disrupt his planned speech, and their counterparts at Michigan State roughed up his student backers. Conservative activist David Horowitz, black conservative Star Parker, and Daniel Pipes, an outspoken critic of Islamism, frequently find themselves shouted down or disrupted on campus." The number of instances involving censorship of conservatives on college campuses and denial of honorary degrees to people who don't toe the liberal line could fill a book.
There is something else about Columbia's decision to admit Ahmadinejad, and that is the notion that exposing a tyrant and religious fanatic -- a man who believes he has been "called" to usher in Armageddon -- to a liberal arts campus might make him less genocidal, and students and the rest of us more understanding. We understand he and his legion of murdering thugs wish to kill us and are contributing to the death of Americans in Iraq. What part of mass murder do they not understand at Columbia, or don't they have time to study history these days?
Ahmadinejad is probably using his visit to case our country, like a bank robber does before a big heist.
Before we allow more of our enemies into America and give them a freedom unknown in their own countries, we should at least demand reciprocity. Their president gets to speak in America? Our president gets to speak in Iran. Their president has access to our media? Our president should have access to their media. And while we're at it, how about for every liberal who gets to speak on campus, the school must also invite a conservative.
Our Crazy Health-Insurance System
By John Stossel
Tuesday, September 25, 2007
Almost daily, we're bombarded with apocalyptic warnings about the 47 million Americans who have no health insurance. Sen. Hillary Clinton wants to require everyone to have it, big companies to pay for it and government to buy it for the poor.
That is a move in the wrong direction.
America's health-care problem is not that some people lack insurance -- it's that 250 million Americans do have it.
You have to understand something right from the start. We Americans got hooked on health insurance because the government did the insurance companies a favor during World War II. Wartime wage controls prohibited cash raises, so employers started giving noncash benefits, like health insurance, to attract workers. The tax code helped this along by treating employer-based health insurance more favorably than coverage you buy yourself. And state governments have made things worse by mandating coverage many people would never buy for themselves.
Competition also pushed companies to offer ever-more attractive policies, such as first-dollar coverage for routine ailments, like ear infections and colds, and coverage for things that are not even illnesses, like pregnancy. We came to expect insurance to cover everything.
That's the root of our problem. No one wants to pay for his own medical care. "Let the insurance company pay for it." But if companies pay, they will demand a say in what treatment is -- and is not -- permitted. Who can blame them?
And who can blame people for feeling frustrated that they aren't in control of their medical care? The system creates perverse incentives for everyone; government mandates are good at doing things like that. Maybe we need to rethink how we pay for less-than-catastrophic illnesses so people can regain control.
Steering people to buy lots of health insurance is bad policy. Insurance is a necessary evil. We need it to protect us from the big risks -- things most of us can't afford to pay for, like a serious illness, a major car accident or a house fire.
But insurance is a lousy way to pay for things. Your premiums go not just to pay for medical care but also for fraud, paperwork and insurance-company employee salaries. This is bad for you and bad for doctors.
The average American doctor now spends 14 percent of his income on insurance paperwork. A North Carolina doctor we interviewed had to hire four people just to fill out forms. He wishes he could spend that money on caring for patients.
The paperwork is part of insurance companies' attempt to protect themselves against fraud. That's understandable. Many people do cheat. They lie about their history or demand money for unnecessary care or care that never even happened.
So there is a lot of waste in insurance -- lost money and time.
Imagine if your car insurance covered oil changes and gasoline. You wouldn't care how much gas you used, and you wouldn't care what it cost. Mechanics would sell you $100 oil changes. Prices would skyrocket.
That's how it works in health care. Patients don't ask how much a test or treatment will cost. They ask if their insurance covers it. They don't compare prices from different doctors and hospitals. (Prices do vary.) Why should they? They're not paying. (Although they do in hidden, indirect ways.)
In the end, we all pay more because no one seems to pay anything. It's why health insurance is not a good idea for anything but serious illnesses and accidents that could bankrupt you. For the rest, we should pay out of our savings.
Monday, September 24, 2007
Columbia and Ahmadinejad: The New Woodward and Bernstein
By Lisa De Pasquale
Monday, September 24, 2007
This week the once-esteemed Columbia University will host another speaker in its ongoing “Conversations with Islamo-Fascists” series. I can hear the speaker’s introduction music now:
You can reach me by caravan,
Cross the desert like an Arab man
I don't care how you get here,
Just get here if you can
This is the second time Columbia University has invited Iranian President Mahmoud Ahmadinejad to speak on its campus. His first speech was canceled because of security concerns. No, not security concerns over inviting a terrorist to an American university, but concerns that the university couldn’t guarantee the safety of the terrorist.
Rest assured that Columbia doesn’t discriminate. In fact, it would also welcome European anti-Semitic mass murderers. Columbia Dean John Coatsworth told Fox News, “If Hitler were in the United States and wanted a platform in which to speak he would have plenty of platforms to speak in the United States. If he were willing to engage in debate and discussion to be challenged by Columbia students and faculty, we would certainly invite him.”
I can just imagine the hard-hitting questions the faculty will ask Ahmadinejad. I can think of at least one challenge that Columbia Professor Nicholas De Genova may have for him. At a 2003 “teach-in,” De Genova said, “I personally would like to see a million Mogadishus.” For the Columbia students who missed that lecture, De Genova was referring to the 1993 incident in which the bodies of American soldiers were dragged through the streets of Mogadishu, Somalia.
De Genova would demand, "President Ahmadinejad, how many Mogadishus would you like to see? Answer the question!" If he says any number less than a million, he’s clearly a tool of Halliburton or Fox News. (I was never that good at following the radical left’s logic.)
It’s actually more likely that De Genova thinks Ahmadinejad is a hero. At the same teach-in, De Genova also said the “only true heroes are those who find ways that help defeat the U.S. military.”
In response to criticism for giving Ahmadinejad a forum, Columbia President Lee Bollinger said it was part of “Columbia’s long-standing tradition of serving as a major forum for robust debate.” The Ahmadinejad defenders tout the importance of free speech. Their defense is completely laughable because college campuses have become institutions with zero respect for free speech. This is like Sarah Brady touting the Second Amendment when she bought her 22-year-old son a rifle for Christmas.
The “free speech” defense doesn’t seem to extend to conservative speakers. In December 2001, three student groups at Columbia organized an independently funded lecture by author Ann Coulter. I worked with the students to schedule the lecture and secure funding. The day of the lecture, administrators changed the location of the room because of threats from liberal campus protestors. Nearly two-thirds of the audience didn’t find the new room until after the lecture was finished. As the question-and-answer session began, protestors booed and shouted to the point where the speaker and the questioners couldn’t hear each other. That’s the Left’s idea of a “dialogue.” It’s not enough that they refuse to listen to opposing views; they must deny others the opportunity, too. Perhaps Columbia should rename its journalism school the Hugo Chavez School of Journalism.
Several weeks after the lecture, the student organizer called me in a panic: Columbia was threatening to withhold his degree over an unpaid bill for two security officers supposedly assigned to the lecture. However, several witnesses, including the speaker, noted that there was no security presence at the lecture. Perhaps campus security couldn’t find the new location either.
As any organization that sponsors conservative speakers on college campuses knows, the security scam is a frequent dirty trick used by liberal administrators to intimidate students. As at Columbia, administrators tell conservative groups that because their speakers are so “controversial,” the students must pay for additional security. Jason Mattera of Young America’s Foundation wrote about one of the most infamous cases of the security scam: “In 2000, when Charlton Heston was requested, student organizers were told they needed to pay for a bomb-sniffing dog, ten police officers, two full-body metal detectors, two metal detector wands, a paramedic team, and four pints of Mr. Heston’s blood type.”
Let’s not forget that the reason some conservative speakers need security is to protect them from violent liberals. In response to both Ann Coulter and Dinesh D’Souza being invited to speak on campus by conservative groups, a writer for The Columbia Spectator wrote, “Crackpots like D'Souza and Coulter should be afraid to open their mouths on a campus with such a proud left-wing history.”
Obviously, Mahmoud Ahmadinejad has nothing to be afraid of when he speaks at Columbia. Later that day he’ll participate in a videoconference at the National Press Club. In case there was any doubt, he’ll be safe there, too.
It’s obvious that the Columbia University administrators didn’t bother to listen to Coulter’s December 2001 speech, “Terrorism and Its Left-wing Sympathizers.” They should also brush up on the classics and recall the line often attributed to Dante’s Inferno: “The darkest places in hell are reserved for those who maintain their neutrality in times of moral crisis.”