Saturday, November 07, 2009

Armed And Female

There's nothing intrinsically masculine about guns: just look at Fort Hood's Sgt. Kimberly Munley.

November 7, 2009 - by Clayton E. Cramer

A recent article in the Telegraph discusses the rise of “ladies-only gun camps.” Why ladies-only? The article doesn’t say, but I know that similar training efforts have been sex-segregated because some women feel a bit intimidated by the inevitable “let me show you how it’s done, little lady” behavior that some guys exhibit — as if there’s something intrinsically masculine about shooting a gun.

This July 4, 2009, photo obtained Nov. 6, 2009 from the Twitter page of Sgt. Kimberly Munley shows Sgt. Munley at Freedom Fest in Frisco, Texas. Officials say 34-year-old Munley ended the shooting spree at Fort Hood on Thursday, Nov. 5 when she shot and wounded alleged shooter Maj. Nidal Malik Hasan. Munley was wounded in the shooting, and was recovering Friday in stable condition. (AP Photo/via Sgt. Munley's Twitter page)

Of course, there isn’t. Nor should this be a surprise. The tragedy at Fort Hood was ended by a female police officer. What might have been an even bigger massacre in Colorado Springs two years ago was averted when an armed woman named Jeanne Assam stopped a mass murderer carrying a rifle and 1,000 rounds of ammunition in the lobby of a church.

While gun ownership in America has traditionally been associated with militia duty — which men were required to do and from which women were excluded — throughout our history, at least some women have been armed and quite proficient. Unsurprisingly, the closer to the frontier you were, the more common this was. Henry Rowe Schoolcraft’s account of his travels in 1819 Arkansas describes his surprise at one small settlement where he attempted to engage the lady of the house and her daughters in polite conversation:

In the course of the evening I tried to engage our hostess and her daughters in small-talk, such as passes current in every social corner; but, for the first time, found I should not recommend myself in that way. They could only talk of bears, hunting, and the like. The rude pursuits, and the coarse enjoyments of the hunter state, were all they knew.

William C. Smith’s account of frontier Indiana during the War of 1812 describes how wives hunted:

Some of them could handle the rifle with great skill, and bring down the game in the absence of their husbands, especially when, as was often the case, the deer made their appearance near the cabin. They would have shot an Indian, if need be, without a moment’s hesitation.

Still, it does seem as though women have bought into the “guns are yucky” idea more than men in recent years — and that’s really quite surprising. When it comes to traumatic personal crimes of violence, rapes outnumber murders about six to one — and the physical strength of men so exceeds that of women that if there is a gender identity to guns, we ought to think of guns as more of a feminine accessory.

Some years ago, my wife and I were living in California. We managed to persuade our police chief to issue concealed weapon permits after a robbery attempt. (If you live in California and have ever applied for such a permit, this should give you some idea of how persuasive we can be.) We went through the class taught at the police academy. While most of our class consisted of civilians, at least one of the students was a police officer in training who ended up in our class because of a motor vehicle accident. By the time we had completed our training, my wife and I had both dramatically improved our marksmanship (along with learning the laws concerning use of deadly force).

A few weeks later, we were visiting some friends in central Nevada. John was a military police officer at the Naval Air Station in Fallon. He was a nice guy, but he had a rather traditional view of women. We went out into the desert, and after doing some 500-meter rifle shooting, we set up targets for handguns. John, not surprisingly, thought rather highly of his marksmanship skills with the Colt Government Model .45 (still the U.S. military standard at the time). He shot a pretty decent group at 10 meters — all seven rounds in a circle about two inches across. My wife picked up the same gun, and when she was done, there was one large ragged hole. You could see John’s entire view of women catching fire and going up in smoke.

I’m not sure that there’s still a need for “ladies-only gun camps.”
Increasingly, a generation of young women is growing up in homes that encourage girls to learn to shoot. When my daughter and son each reached about eight years of age, my wife and I took them out to the range. We did this to show them the fearsome and destructive power of firearms. We also did so to take away the inevitable mystery associated with a tool that features so prominently in our entertainment media. A few years later, I took my daughter to a submachine gun course at Front Sight.

Guns aren’t a major part of the lives of either of our children, but they know how to safely handle guns and to shoot — sometimes leading to some very amusing consequences. When my daughter was away at the University of Idaho, the young man who is now her husband proposed going shooting on a date. Let’s just say that he was more than a bit startled at what a crack shot she was!

- Clayton E. Cramer is a software engineer and historian. His sixth book, Armed America: The Remarkable Story of How and Why Guns Became as American as Apple Pie (Nelson Current, 2006), is available in bookstores. His web site is

Legends of the fall

Friday, November 6, 2009

This was close to 3 a.m. on Thursday, long after Sinatra’s silky voice had drifted off into the night. The clubhouse floor was still soaked in champagne and outside, hundreds of Stadium workers were still cleaning up the Bronx’s best party in a decade. With nothing else to do, New York City cops were posing for pictures on the mound.

New York Yankees' Derek Jeter, left, and Mariano Rivera look at the championship trophy after winning the Major League Baseball World Series against the Philadelphia Phillies Wednesday, Nov. 4, 2009, in New York. (AP)

So begins the precious hangover. It’ll take an entire off-season to fully appreciate the breadth of the Yankees’ accomplishments, but for now, the fans are having too much fun counting the long line of vanquished opponents. There are the Twins and the Angels, of course, and, especially, the Phillies.

Derek Jeter stood on the temporary podium at second base and reminded 50,000 fans how “a lot of people were making predictions about this Series before it started.” The place roared its approval. Everyone knew Jeter was talking about Jimmy Rollins and the way he ran his mouth. Instead of answering in kind, Jeter chose to live by the time-honored credo that nothing fuels revenge more sweetly than living well.

But the Yankees’ hierarchy understands this championship season, seamless as it appeared, comes with a surcharge. Johnny Damon and Hideki Matsui might both be gone in 2010, and who knows where Joba Chamberlain’s trend line is headed?

If there’s any single factor that distinguishes the ’09 Yankees from their predecessors, it’s that Chamberlain’s failure forced them into an almost unheard-of three-man rotation throughout the postseason.

No championship club had relied on just three starters since the 1991 Twins. Back then it was Jack Morris, Scott Erickson and Kevin Tapani. This time around, CC Sabathia, A.J. Burnett and Andy Pettitte were all forced to work on short rest.

You could call it a resource failure and wonder how, after $204 million, the Yankees still couldn’t find a fourth starter they trusted. Or else it was just another way of mocking the industry. Even with three-quarters of a rotation, the Yankees still crushed everyone in their path.

Either way, the ’09 team will be remembered as one of the better champions in club history, although they fall a notch below the ’98 club. This ’09 edition did, in fact, struggle — all the way back to Game 2 of the AL Division Series, when the Twins were on the verge of winning, 3-1. If Joe Nathan had been able to preserve a ninth-inning lead, the Twins would’ve gone home tied, 1-1, with two of the final three games in the Metrodome.

But Nathan couldn’t hold down Alex Rodriguez, who tied the game with a two-run homer and sent it into extra innings. The Twins were never the same, just as the Angels were sunk in the ALCS after closer Brian Fuentes surrendered an 11th-inning homer to A-Rod in Game 2, leading the way to the Yankees’ 4-3 victory.

Brad Lidge was no better in the World Series; he failed in Game 4, the one that Johnny Damon literally stole in the ninth inning. At the moment of truth, when it was Lidge’s fastball against A-Rod’s bat speed, the Yankee slugger had an answer – a line-drive double into the left field corner that helped give the Bombers a 7-4 victory.

Those comebacks were forged by more than just talent. They sprang from the Yankees’ limitless belief in their own superiority. As one major league executive said with envy: “The difference was how much the Yankees believed they could come back and beat you in their last at-bat. That kind of confidence is un-quantifiable.”

The Yankees spent crazy money, of course. The haters will tell you the Bombers’ championship is soiled by the Steinbrenner family’s excess, which makes it hard for the average fan to buy into the “Win it for The Boss” mantra. But if the last decade has taught us anything, it’s that money is a guarantee of nothing: the Steinbrenners had spent more than $1 billion since 2000 without a championship to show for it.

It’s how wisely that money is spent; that’s the difference between winning and ending up like the Mets. And that was the enduring irony of Game 6, that for all the cash heaped upon last year’s free agents – Sabathia, Burnett and Mark Teixeira – it was the home grown Yankees who brought home the title.

Who else but Pettitte could’ve started this game? Who else but Mariano Rivera could’ve finished it? What better scene than Jeter hugging Jorge Posada one last time on the field as the postgame party raged on?

It was a perfect full-circle journey for this core, the same four players who were there in 1996. They’re older and (clearly) wiser now, more appreciative than ever of the precious gift of a championship. Pettitte, especially, was the right person for this ending, confounding the Phillies just as he almost always has in the big moments.

Remember only two days earlier, when Charlie Manuel dismissed the 37-year-old lefty as just another aging talent? Pettitte took no offense at the slap – he never does; he wasn’t born with a score-settling gene – but instead channeled his energy into working out of trouble.

Even now, a full decade later, Pettitte’s gift of making rallies disappear remains intact. Rollins conceded that much late Wednesday night, saying, “[Pettitte] made one great pitch every at-bat, it seemed like. When he needed that one pitch, he was able to make it. He got some double plays. That really shortened some innings. That’s what Andy does — he keeps his team in the game. You walk away shaking your head. But obviously, he’s pretty good.”

Pettitte walked off the mound to a thunderous standing ovation – a public thank-you for 12 years of greatness with the team. That was just one of a million memories that’ll remain with the Yankees and with their fans.

It was a compelling season and an October that left no doubt about its verdict: this time, the best team won.

Friday, November 06, 2009

Fort Hood's 9/11

Islamist terror strikes US again

New York Post
November 6, 2009

On Thursday afternoon, a radicalized Muslim US Army officer shouting "Allahu Akbar!" committed the worst act of terror on American soil since 9/11. And no one wants to call it an act of terror or associate it with Islam.

What cowards we are. Political correctness killed those patriotic Americans at Ft. Hood as surely as the Islamist gunman did. And the media treat it like a case of non-denominational shoplifting.

Army Lt. Gen. Robert Cone gives a news conference after a shooting in Fort Hood, Texas on Thursday, Nov. 5, 2009.(AP Photo/Austin American-Statesman, Rodolfo Gonzalez)

This was a terrorist act. When an extremist plans and executes a murderous plot against our unarmed soldiers to protest our efforts to counter Islamist fanatics, it’s an act of terror. Period.

When the terrorist posts anti-American hate-speech on the Web; apparently praises suicide bombers and uses his own name; loudly criticizes US policies; argues (as a psychiatrist, no less) with his military patients over the worth of their sacrifices; refuses, in the name of Islam, to be photographed with female colleagues; lists his nationality as "Palestinian" in a Muslim spouse-matching program; and parades around central Texas in a fundamentalist playsuit — well, it only seems fair to call this terrorist an "Islamist terrorist."

But the president won’t. Despite his promise to get to all the facts. Because there’s no such thing as "Islamist terrorism" in ObamaWorld.

And the Army won’t. Because its senior leaders are so sick with political correctness that pandering to America-haters is safer than calling terrorism "terrorism."

And the media won’t. Because they have more interest in the shooter than in our troops — despite their crocodile tears.

Maj. Nidal Malik Hasan planned this terrorist attack and executed it in cold blood. The resulting massacre was the first tragedy. The second was that he wasn’t killed on the spot.

Hasan survived. Now the rest of us will have to foot his massive medical bills. Activist lawyers will get involved, claiming "harassment" drove him temporarily insane. There’ll be no end of trial delays. At best, taxpayer dollars will fund his prison lifestyle for decades to come, since our politically correct Army leadership wouldn’t dare pursue or carry out the death penalty.

Maj. Hasan will be a hero to Islamist terrorists abroad and their sympathizers here. While US Muslim organizations decry his acts publicly, Hasan will be praised privately. And he’ll have the last laugh.

But Hasan isn’t the sole guilty party. The US Army’s unforgivable political correctness is also to blame for the casualties at Ft. Hood.

Given the myriad warning signs, it’s appalling that no action was taken against a man apparently known to praise suicide bombers and openly damn US policy. But no officer in his chain of command, either at Walter Reed Army Medical Center or at Ft. Hood, had the guts to take meaningful action against a dysfunctional soldier and an incompetent doctor.

Had Hasan been a Lutheran or a Methodist, he would’ve been gone with the simoon. But officers fear charges of discrimination when faced with misconduct among protected minorities.

Now 12 soldiers and a security guard lie dead, and 31 soldiers were wounded, 28 of them seriously. If heads don’t roll in this maggot’s chain of command, the Army will have shamed itself beyond moral redemption.

There’s another important issue, too. How could the Army allow an obviously incompetent and dysfunctional psychiatrist to treat our troubled soldiers returning from war? An Islamist whacko is counseled for arguing with veterans who’ve been assigned to his care? And he’s not removed from duty? What planet does the Army live on?

For the first time since I joined the Army in 1976, I’m ashamed of its dereliction of duty. The chain of command protected a budding terrorist who was waving one red flag after another. Because it was safer for careers than doing something about him.

Get ready for the apologias. We’ve already heard from the terrorist’s family that "he’s a good American." In their world, maybe he is.

But when do we, the American public, knock off the PC nonsense?

A disgruntled Muslim soldier murdered his officers way back in 2003, in Kuwait, on the eve of Operation Iraqi Freedom. Recently? An American mullah shoots it out with the feds in Detroit. A Muslim fanatic attacks an Arkansas recruiting station. A Muslim media owner, after playing the peace card, beheads his wife. A Muslim father runs over his daughter because she’s becoming too Westernized.

Muslim terrorist wannabes are busted again and again. And we’re assured that "Islam’s a religion of peace."

I guarantee you that the Obama administration’s non-response to the Ft. Hood attack will mock the memory of our dead.

Ralph Peters’ latest novel is "The War After Armageddon."

Shooter exposes hole in U.S. terror strategy

Same ideological pathologies that drive al-Qaida overpowered Hasan's American identity.

By Mark Steyn
Syndicated columnist
Orange County Register
November 6, 2009

Thirteen dead and 28 wounded would be a bad day for the U.S. military in Afghanistan and a great victory for the Taliban. When it happens in Texas, in the heart of the biggest military base in the nation, at a processing center for soldiers either returning from or deploying to combat overseas, it is not merely a "tragedy" (as too many people called it) but a glimpse of a potentially fatal flaw at the heart of what we have called, since 9/11, the "war on terror." Brave soldiers trained to hunt down and kill America's enemy abroad were killed in the safety and security of home by, in essence, the same enemy – a man who believes in and supports everything the enemy does.

And he's a U.S. Army major.

Major Nidal Malik Hasan, the U.S. Army doctor identified by authorities as the suspect in a mass shooting at the U.S. Army post in Fort Hood, Texas, is seen in this undated handout photo from a pdf file of the U.S. Government Uniformed Services University of the Health Sciences downloaded on November 6, 2009. (Reuters)

And his superior officers and other authorities knew about his beliefs but seemed to think it was just a bit of harmless multicultural diversity – as if believing that "the Muslims should stand up and fight against the aggressor" (i.e., his fellow American soldiers) and writing Internet paeans to the "noble" "heroism" of suicide bombers and, indeed, objectively supporting the other side in an active war is to be regarded as just some kind of alternative lifestyle that adds to the general vibrancy of the base.

When it emerged early Thursday afternoon that the shooter was Nidal Malik Hasan, there appeared shortly thereafter on Twitter a flurry of posts with the striking formulation: "Please judge Maj. Malik Nadal [sic] by his actions and not by his name."

Concerned tweeters can relax: There was never really any danger of that – and not just in the sense that the New York Times' first report on Maj. Hasan never mentioned the words "Muslim" or "Islam," or that ABC's Martha Raddatz's only observation on his name was that "as for the suspect, Nadal Hasan, as one officer's wife told me, 'I wish his name was Smith.'"

What a strange reaction. I suppose what she means is that, if his name were Smith, we could all retreat back into the same comforting illusions that allowed the bureaucracy to advance Nidal Malik Hasan to major and into the heart of Fort Hood while ignoring everything that mattered about the essence of this man.

Since 9/11, we have, as the Twitterers recommend, judged people by their actions – flying planes into skyscrapers, blowing themselves up in Bali nightclubs or London Tube trains, planting IEDs by the roadside in Baghdad or Tikrit. And on the whole we're effective at responding with action of our own.

But we're scrupulously nonjudgmental about the ideology that drives a man to fly into a building or self-detonate on the subway, and thus we have a hole at the heart of our strategy. We use rhetorical conveniences like "radical Islam" or, if that seems a wee bit Islamophobic, just plain old "radical extremism." But we never make any effort to delineate the line which separates "radical Islam" from nonradical Islam. Indeed, we go to great lengths to make it even fuzzier. And somewhere in that woozy blur the pathologies of a Nidal Malik Hasan incubate.

An Army psychiatrist, Maj. Hasan is an American, born and raised, who graduated from Virginia Tech and then received his doctorate from the Uniformed Services University of the Health Sciences in Bethesda, Md. But he opposed America's actions in the Middle East and Afghanistan and made approving remarks about jihadists on U.S. soil. "You need to lock it up, Major," said his superior officer, Col. Terry Lee.

But he didn't really need to "lock it up" at all. He could pretty much say anything he liked, and if any "red flags" were raised they were quickly mothballed. Lots of people are "anti-war." Some of them are objectively on the other side – that's to say, they encourage and support attacks on American troops and civilians. But not many of those in that latter category are U.S. Army majors. Or so one would hope.

Yet why be surprised? Azad Ali, a man who approvingly quotes such observations as "If I saw an American or British man wearing a soldier's uniform inside Iraq I would kill him because that is my obligation" is an adviser to Britain's Crown Prosecution Service (the equivalent of U.S. attorneys). In Toronto this week, the brave ex-Muslim Nonie Darwish mentioned that, on flying from the U.S. to Canada, she was questioned at length about the purpose of her visit by an apparently Muslim border official. When she revealed that she was giving a speech about Islamic law, he rebuked her: "We are not to question Shariah."

That's the guy manning the airport security desk.

In the New York Times, Maria Newman touched on Hasan's faith only obliquely: "He was single, according to the records, and he listed no religious preference." Thank goodness for that, eh? A neighbor in Texas says the major had "Allah" and "another word" pinned up in Arabic on his door. "Akbar" maybe? On Thursday morning he is said to have passed out copies of the Quran to his neighbors. He shouted in Arabic as he fired.

But don't worry: As the FBI spokesman assured us in nothing flat, there's no terrorism angle.

That's true, in a very narrow sense: Maj. Hasan is not a card-carrying member of the Texas branch of al-Qaida reporting to a control officer in Yemen or Waziristan. If he were, things would be a lot easier. But the same pathologies that drive al-Qaida beat within Maj. Hasan, too, and in the end his Islamic impulses trumped his expensive Western education, his psychiatric training, his military discipline – his entire American identity.

What happened to those men and women at Fort Hood had a horrible symbolism: Members of the best-trained, best-equipped fighting force on the planet gunned down by a guy who said a few goofy things no one took seriously. And that's the problem: America has the best troops and fiercest firepower, but no strategy for throttling the ideology that drives the enemy – in Afghanistan and in Texas.


Divisive Unanimity

The Nation's Pulse

By Daniel J. Flynn on 11.6.09 @ 6:08AM
The American Spectator

On Tuesday, Maine's voters opted by a 53 to 47 percent majority to reject a gay marriage law passed by their state legislature and signed by their governor in May. The result has been easily overlooked amidst Republican victories in New Jersey and Virginia, and the first Democrat elected to Congress from an upstate New York district in more than a century. It shouldn't be.

Frank Schubert, campaign director for Stand for Marriage Maine, talks to supporters of Yes on 1, Tuesday evening, Nov. 3, 2009, in Portland, Maine. (AP Photo/Robert F. Bukaty)

Maine jumped off the New England gay marriage bandwagon driven by its parent state, Massachusetts, and subsequently ridden on by Vermont, New Hampshire, and Connecticut. Of greater significance, Maine refused to become the first state to give the ballot's imprimatur to homosexual marriage. From Alaska to Florida, North Dakota to Louisiana, Hawaii to Maine, the states have spoken with a united voice on gay marriage: no.

That's not the impression one gets from following the issue in the media. The very definition of consensus, traditional marriage is nevertheless couched in the language of controversy by loud voices seeking to undermine it. In the lead-up to the referendum on gay marriage, ABC News called Maine's vote the "latest battle in the divisive fight over gay marriage." Earlier this week, as Portland, Maine, readied to vote on gay marriage, KATU in Portland, Oregon, reported: "One of the most divisive issues in Oregon's history may be coming back to voters as the state's largest gay-rights group kicked off their campaign Monday to legalize same-sex marriage in the state." Even Barack Obama, who struck an ambiguous position on homosexual marriage as a candidate, nevertheless called California's Proposition 8 "divisive and discriminatory."

If there is anything approaching a unifying issue in American politics, it is marriage. Specifically, people want to preserve it as an institution involving one man and one woman. Gay marriage has been on the ballot in thirty-one states. Thirty-one states have rejected it.

The usual suspects -- South Carolina, Mississippi, Utah -- have rejected gay marriage through ballot questions. But so, too, have reliably Democratic states, such as Michigan, Hawaii, and California. In Colorado, 56 percent of voters rejected gay marriage; in Ohio, 62 percent; and Missouri, 71 percent. These states are bellwethers, not outliers. As tempting as it is for supporters of homosexual marriage to paint the opposition as extremists, opposition to gay marriage is mainstream. To chalk up the defeats to "hate" is to place that label on most Americans, which is itself a kind of hatred.

A similar disconnect from reality is at work in interpreting a string of defeats worthy of the Washington Generals as proof of the inevitability of gay marriage. Shenna Bellows, executive director of the Maine Civil Liberties Union, told the Boston Globe after Maine's vote, "We are on the right side of history." But history is not a prediction of the imagined future. It's a chronicling of the experienced past. The past has always and everywhere rejected gay marriage when put to a popular vote in America. Just as the pitiful performance of gay marriage at the ballot box has supporters constructing an imaginary history, it has sympathizers eager to proclaim any sign of support for gay marriage in the present as a national reorientation on the issue. "That's a big cultural change," CNN legal analyst Jeffrey Toobin explained on election night in reaction to early tallies that suggested a victory for homosexual marriage in Maine. "Every time voters have spoken -- every time -- they have rejected gay marriage. But this shows the country is changing." To the contrary, the Maine vote demonstrated that unity, rather than division, continues as the status quo.

Certainly voters on both sides of the question are passionate about gay marriage. The intensity in Maine was so great that, despite no candidates for statewide or federal office on the ballot, sixty percent of the state's registered electorate voted -- a higher rate of participation than a dozen states exhibited in last year's presidential race. And as the passions get stoked, the passionate can get ugly, as post-Proposition 8 events proved in California: an assault on a 69-year-old woman holding a cross, racial taunts issued against African Americans as a result of black opposition to gay marriage, and white powder sent to Mormon churches.

But can a measure that has passed in every state in which it has been put before the voters be called divisive? Not with a straight face. Thirty-one for thirty-one isn't division. It's unanimity.

- Daniel J. Flynn, the author of A Conservative History of the American Left, blogs at

Jihad at Fort Hood

By Robert Spencer
November 6, 2009

Major Nidal Malik Hasan, a U.S. Army psychiatrist, murdered twelve people and wounded twenty-one inside Fort Hood in Texas yesterday, while, according to eyewitnesses, “shouting something in Arabic while he was shooting.” Investigators are scratching their heads and expressing puzzlement about why he did it. According to NPR, “the motive behind the shootings was not immediately clear, officials said.” The Washington Post agreed: “The motive remains unclear, although some sources reported the suspect is opposed to U.S. involvement in Afghanistan and Iraq and upset about an imminent deployment.” The Huffington Post spun faster, asserting that “there is no concrete reporting as to whether Nidal Malik Hasan was in fact a Muslim or an Arab.”

Yet there was, and what’s more, Major Hasan’s motive was perfectly clear — but it was one that the forces of political correctness and the Islamic advocacy groups in the United States have been working for years to obscure. So it is that now that another major jihad terror attack has taken place on American soil, authorities and the mainstream media are at a loss to explain why it happened – and the abundant evidence that it was a jihad attack is ignored.

Nidal Malik Hasan was born in Virginia but didn’t think of himself as an American: on a form he filled out at the Muslim Community Center in Silver Spring, Maryland, he gave his nationality not as “American” but as “Palestinian.” A mosque official found that curious, saying: “I don’t know why he listed Palestinian. He was not born in Palestine.”

He is a graduate of Virginia Tech and has a doctorate in psychiatry from the Uniformed Services University of the Health Sciences. While there, NPR reports, Hasan was “put on probation early in his postgraduate work” and was “disciplined for proselytizing about his Muslim faith with patients and colleagues.”

He was a staff psychiatrist at Walter Reed Army Medical Center for six years before transferring to Fort Hood earlier this year. While at Walter Reed, he was a “very devout” member of and daily visitor to the Muslim Community Center in Silver Spring. Faizul Khan, a former imam at the Center, expressed puzzlement over Hasan’s murders: “To know something like this happened, I don’t know what got into his mind. There was nothing extremist in his questions. He never showed any frustration….He never showed any remorse or wish for vengeance on anybody.”

So he identified himself as Palestinian and was a devout Muslim – so what? These things, of course, have no significance if one assumes that Islam is a Religion of Peace and that when a devout Muslim reads the Koran’s many injunctions to wage war against unbelievers, he knows that they have no force or applicability for today’s world. Unfortunately, all too many Muslims around the world demonstrate in both their words and their deeds that they take such injunctions quite seriously. And Nidal Hasan gave some indications that he may have been among them.

On May 20, 2009, a man giving his name as “NidalHasan” posted this defense of suicide bombing (all spelling and grammar as it is in the original):

There was a grenade thrown amongs a group of American soldiers. One of the soldiers, feeling that it was to late for everyone to flee jumped on the grave with the intention of saving his comrades. Indeed he saved them. He inentionally took his life (suicide) for a noble cause i.e. saving the lives of his soldier. To say that this soldier committed suicide is inappropriate. Its more appropriate to say he is a brave hero that sacrificed his life for a more noble cause. Scholars have paralled this to suicide bombers whose intention, by sacrificing their lives, is to help save Muslims by killing enemy soldiers. If one suicide bomber can kill 100 enemy soldiers because they were caught off guard that would be considered a strategic victory. Their intention is not to die because of some despair. The same can be said for the Kamikazees in Japan. They died (via crashing their planes into ships) to kill the enemies for the homeland. You can call them crazy i you want but their act was not one of suicide that is despised by Islam. So the scholars main point is that “IT SEEMS AS THOUGH YOUR INTENTION IS THE MAIN ISSUE” and Allah (SWT) knows best.

Of course, it may not be the same Nidal Hasan. But there is more. One of his former colleagues, Col. Terry Lee, recalled Hasan making statements to the effect of “Muslims have the right to rise up against the U.S. military”; “Muslims have a right to stand up against the aggressors”; and even speaking favorably about people who “strap bombs on themselves and go into Times Square.”

Maybe he just snapped, perhaps under the pressure of his imminent deployment to Iraq. But it’s noteworthy that if he did, he snapped in exactly the same way that several other Muslims in the U.S. military have snapped in the past. In April 2005, a Muslim serving in the U.S. Army, Hasan Akbar, was convicted of murder for killing two American soldiers and wounding fourteen in a grenade attack in Kuwait. AP reported: “Prosecutors say Akbar told investigators he launched the attack because he was concerned U.S. troops would kill fellow Muslims in Iraq. They said he coolly carried out the attack to achieve ‘maximum carnage’ on his comrades in the 101st Airborne Division.”

And Hasan’s murderous rampage resembles the one that five Muslim men plotted to carry out at Fort Dix, New Jersey, in 2007, when they planned to enter the U.S. Army base and murder as many soldiers as they could.

That was a jihad plot. One of the plotters, Serdar Tatar, told an FBI informant late in 2006: “I’m gonna do it….It doesn’t matter to me, whether I get locked up, arrested, or get taken away, it doesn’t matter. Or I die, doesn’t matter, I’m doing it in the name of Allah.” Another plotter, Mohamad Shnewer, was caught on tape saying, “They are the ones, we are going to put bullets in their heads, Allah willing.”

Nidal Hasan’s statements about Muslims rising up against the U.S. military aren’t too far from that, albeit less graphic. Ignoring or downplaying the role that Islamic beliefs and assumptions may have played in his murders only ensures that – once again – nothing will be done to prevent the eventual advent of the next Nidal Hasan.

Thursday, November 05, 2009

Cosmic Justice

If evolution cannot explain how humans became moral primates, what can?

By Dinesh D'Souza
November 05, 2009, 4:00 a.m.

All evolutionary attempts to explain morality ultimately miss the point. They seek to explain morality, but even at their best what they explain is not morality at all. Imagine a shopkeeper who routinely increases his profits by cheating his customers. So smoothly does he do this that he is never exposed and his reputation remains unimpeached. Even though the man is successful in the game of survival, if he has a conscience it will be nagging at him from the inside. It may not be strong enough to make him change his ways, but it will at least make him feel bad and perhaps ultimately despise himself. Now where have our evolutionary explanations accounted for morality in this sense?

In fact, they haven’t accounted for it at all. These explanations all seek to reduce morality to self-interest, but if you think about it, genuine morality cannot be brought down to this level. Morality is not the voice that says, “Be truthful when it benefits you,” or “Be kind to those who are in a position to help you later.” Rather, it operates without regard to such calculations. Far from being an extension of self-interest, the voice of the impartial spectator is typically a restriction of self-interest. Think about it: If morality were simply an extension of selfishness, we wouldn’t need it. We don’t need moral prescriptions to tell people to act for their own benefit; they do that anyway. The whole point of moral prescriptions and injunctions is to get people to subordinate and curb their selfish interests.

There is a second, deeper sense in which evolutionary theories cannot account for human morality. We can see this by considering the various attempts to explain altruism in the animal kingdom. I recently came across an article in the London Telegraph titled “Animals Can Tell Right from Wrong.” I read with interest, wondering if animals had finally taken up the question of whether it is right to eat smaller animals. After all, the greatest problem with animal rights is getting animals to respect them. Alas, the article was unilluminating on this point. Even so, it provided examples of how wolves, coyotes, elephants, whales, and even rodents occasionally engage in cooperative and altruistic behavior. Perhaps the most dramatic examples come from the work of the anthropologist Frans de Waal, who has studied gorillas, bonobos, and chimpanzees. According to de Waal, our “closest relatives,” the chimpanzees, display many of the recognized characteristics of morality, including kin selection and reciprocal altruism.

Yet de Waal recognizes that while chimps may cooperate or help, they have no sense that they ought to help. In other words, chimps have no understanding of the normative basis of morality. And this of course is the essence of morality for humans. Morality isn’t merely about what you do; mostly it is about what you should do and what it is right to do. Evolutionary theories like kin selection and reciprocal altruism utterly fail to capture this uniquely human sense of morality as duty or obligation. Such theories can help to explain why we act cooperatively or help others, but they cannot explain why we believe it is good or right or obligatory for us to do these things. They commit what the philosopher G. E. Moore called the “naturalistic fallacy” of confusing the “is” and the “ought.” In particular, they give an explanation for the way things are and think that they have accounted for the way things ought to be.

But if evolution cannot explain how humans became moral primates, what can? Now it is time to test our presuppositional argument. The premise of the argument is that virtually all conceptions of life after death, especially the religious conceptions, are rooted in the idea of cosmic justice. Consider Hinduism: “You are a greedy and grasping person in this life; very well, we’ll be seeing you as a cockroach in the next one.” Buddhism has a very similar understanding of reincarnation. Judaism, Islam, and Christianity, by contrast, uphold the notion of a Last Judgment in which the virtuous will be rewarded and the wicked will get their just deserts. The Letter to the Galatians contains the famous quotation, “Whatever a man sows, that he will also reap” (6:7). And here is a similar passage from the third sura of the Koran: “You shall surely be paid in full your wages on the Day of Resurrection.” In all these doctrines, life after death is not a mere continuation of earthly existence but rather a different kind of existence based on a settling of earthly accounts. These doctrines hold that even though we don’t always find terrestrial justice, there is ultimate justice. In this future accounting, what goes around does come around.

Now let’s make the supposition that there is cosmic justice after death and ask, Does this help to explain the great mystery of human morality? It seems clear that it does. Humans recognize that there is no ultimate goodness and justice in this world, but they continue to uphold those ideals. In their interior conscience, humans judge themselves not by the standard of the shrewd self-aggrandizer but by that of the impartial spectator. We admire the good man, even when he comes to a bad end, and revile the successful scoundrel who got away with it. Evolutionary theories predict the reverse: If morality were merely a product of crafty and successful calculation, we should cherish and aspire to be crafty calculators. But we don’t. Rather, we act as if there is a moral law to which we are accountable. We are judged by our consciences as if there is an ultimate tribunal in which our actions will be pronounced “guilty” or “not guilty.” There seems to be no reason for us to hold these standards and measure our life against them if the standards aren’t legislative in some sense. But if they are legislative, then their jurisdiction must be in another world since it is clearly not in this world. So the presupposition of cosmic justice, in an existence beyond this one, makes sense of human moral standards and moral obligation in a way that evolutionary theories cannot.

Ironically it is the claims of atheists that best illustrate the point I am trying to make. In the last pages of The Selfish Gene, a book devoted to showing how we are the mechanical products of our selfish genes, Richard Dawkins writes that “we have the power to turn against our creators. . . . Let us understand what our own selfish genes are up to because we may then at least have the chance to upset their designs.” A century ago Thomas Huxley made the same point in regard to the cosmic process of evolutionary survival. “Let us understand, once for all, that the ethical progress of society depends, not on imitating the cosmic process, still less in running away from it, but in combating it.” Now these are very strange demands. If we are, as Dawkins began by telling us, robot vehicles of our selfish genes, then how is it possible for us to rebel against them or upset their designs? Can the mechanical car turn against the man with the remote control? Can software revolt against its programmer? Clearly this is absurd.

Why, then, would Dawkins and Huxley propose a course of action that undermines their own argument and seeks to run athwart the whole course of evolution? If we stay within the evolutionary framework, there is no answer to this question. There cannot be, because we are trying to understand why dedicated champions of evolution seek to transcend evolution and, in a sense, subvert their own nature. We don’t see anything like this in the animal kingdom: Lions don’t resolve to stop harassing the deer; foxes don’t call upon one another to stop being so sneaky; parasites show no signs of distress about taking advantage of their hosts. Even apes and chimpanzees, despite their genetic proximity to humans, don’t try to rebel against their genes or become something other than what nature programmed them to be.

What then is up with us humans? What makes even the atheist uphold morality in preference to his cherished evolutionary paradigm? Introduce the presupposition of cosmic justice, and the answer becomes obvious. We humans — atheists no less than religious believers — inhabit two worlds. The first is the evolutionary world; let’s call this Realm A. Then there is the next world; let’s call this Realm B. The remarkable fact is that we, who live in Realm A, nevertheless have the standards of Realm B built into our natures. This is the voice of morality, which makes us dissatisfied with our selfish natures and continually hopeful that we can rise above them. Our hypothesis also accounts for the peculiar nature of morality. It cannot coerce us because it is the legislative standard of another world; at the same time, it is inescapable and authoritative for us because our actions in this world will be finally and unavoidably adjudicated in the other world. Finally, the hypothesis also helps us understand why people so often violate morality. The reason is that our interests in this world are right in front of us, while the consequences of our actions in the next world seem so remote, so distant, and thus so forgettable.

When Einstein discovered that his theory of relativity could explain something that Newton couldn’t — the orbital precession of the planet Mercury — he was thrilled. He knew about the “gap,” and he was able to close it not within the old framework but by supplying a revolutionary new one. Now, within the new paradigm, there was no gap at all. In this essay we have identified not a mere gap but a huge chasm in the evolutionary paradigm. This is the conundrum of human morality, the universal voice within us that urges us to act in ways contrary to our nature as evolutionary primates. There have been supreme efforts, within the evolutionary framework, to plug the gap, but, as we have seen, these have proven to be dismal failures. Our rival hypothesis of cosmic justice in a world beyond the world fares vastly better. It provides a way to test our hypothesis of life after death by applying it to human nature and asking whether it helps to illuminate why we are the way we are. In fact, it does. Taken in conjunction with other arguments, this argument provides stunning confirmation that the moral primate is destined for another life whose shape will depend on the character of the life that is now being lived.

— Dinesh D'Souza is the Rishwain fellow at the Hoover Institution. This is the third of a three-part adaptation from his just-published Life after Death: The Evidence.


Fitting end to perfect season for Bombers

November 5, 2009

NEW YORK – The night ended with a jungle of arms and legs at the mound, raised fists and the kind of shouting that men use as substitutes for tears. This was exactly how the Yankees envisioned the end of their magical season – in bedlam, with looks that needed no translation:

The universe is ours.

NEW YORK - NOVEMBER 04: (L-R) A.J. Burnett #34, Jorge Posada #20, Derek Jeter#2, Mariano Rivera #42 (holding trophy) Robinson Cano #24 and Nick Swisher #33 of the New York Yankees against the Philadelphia Phillies in Game Six of the 2009 MLB World Series at Yankee Stadium on November 4, 2009 in the Bronx borough of New York City. (Photo by Chris McGrath/Getty Images)

The World Series became a billboard of the Yankees’ excellence, as their 7-3 victory in Game Six will be cherished for so many reasons: Andy Pettitte out-pitching Pedro Martinez; Hideki Matsui, the Series’ Most Valuable Player, driving in six runs; and Mariano Rivera finishing it all off with a five-out save.

Just how badly could the Yankees taste their first championship since 2000? Half a dozen of them poured out of the dugout even before Mark Teixeira had caught Robinson Cano’s throw retiring Shane Victorino.

Instantly, Queen blasted “We are the Champions” on the Stadium PA system while the riot on the field kept growing. Soon, it was Sinatra crooning “New York, New York” for the last time in 2009, everyone singing along, woozy from the baseball-high.

Joe Girardi would say, “It’s fitting that we won this at home in front of these fans,” but the entire night was filled with precious irony: there was Derek Jeter taking a shot at Jimmy Rollins, saying, “a lot of people made predictions about this Series.” There was Pettitte getting a chance to pitch a championship clincher, and Matsui, probably playing his final game in Pinstripes, saying goodbye with a performance that bordered on Reggie-like greatness.

He crushed a two-run HR off Pedro in the second inning, and that was all the Yankees had to know. The Phillies had spoken bravely about a miracle comeback, but in their hearts they knew the Series was over the night Johnny Damon stole Game 4 from them with two stolen bases in the ninth inning.

The Phillies put up a better fight than the Twins or Angels, but in the end they, too, had no answer for the Yankees’ nuclear lineup. Only Cliff Lee had the guts to stare the Bombers in the eye; everyone else was dealing from a weakened hand – even Pedro. He never once reached 90 mph on the radar gun, a fact that struck Charlie Manuel almost instantly. The manager said, “(At) 84, 85 mph, (Pedro) is better than that. He did not have a good fastball.”

No one had to explain that to the usually fearless Martinez. He was in crisis-mode pitching around Alex Rodriguez in the second inning, walking him on four pitches before taking his chances with Matsui. Pedro wasn’t just cautious, he was flat-out intimidated, knowing this was a night when any mistake would be costly.

Martinez gingerly worked the corners for seven pitches until Matsui finally wore him down. Pedro left a four-seam fastball out over the plate. Matsui destroyed it, sending it screeching over the wall in right-center, on the way to a Series that will forever be remembered. The Japanese slugger finished with a .615 average and eight RBI.

He would later call it “the greatest moment of my life” after “a long and difficult road” to a championship.

Matsui will leave exactly as he arrived in 2003 – humble and polite, an All-American kid who just happened to hail from the Far East.

“He’s as quiet a superstar as I’ve ever seen anywhere,” Bernie Williams once said. “He’s representing the whole Japanese culture here, it’s a lot of weight on his shoulders, and yet he doesn’t get rattled.”

NEW YORK - NOVEMBER 4: World Series MVP Hideki Matsui #55 of the New York Yankees celebrates with the MVP trophy after their 7-3 win against the Philadelphia Phillies in Game Six of the 2009 MLB World Series at Yankee Stadium on November 4, 2009 in the Bronx borough of New York City. (Photo by David J. Phillip-Pool/Getty Images)

Not even when accepting the MVP trophy from Bud Selig Wednesday night. A podium had been erected at second base after the game, where the entire organization gathered to celebrate. No one had gone home – 50,000 fans shouted in unison as Matsui shook hands with the commissioner and told the crowd it was “awesome” to go out on such a powerful note.

Alex Rodriguez, enjoying his first world championship ever, told the horde, “we’re going to party” and Pettitte just smiled that big, toothy grin of his and said, “this is what I came back here for.”

Pettitte had just won his 18th career post-season victory, cementing his legacy as the Yankees’ greatest left-hander since Whitey Ford.

Good old Andy, always a second-tier star among Yankee pitchers, always forced to live in the shadow of Roger Clemens or David Cone or, most recently, CC Sabathia and A.J. Burnett.

It was impossible to watch Pettitte on Wednesday night and not think of the way the Yankees treated him over the winter – negotiating their way down instead of up as he pondered a $10 million offer.

Pettitte was a free agent, free to test the market - or, if his family had pushed hard enough, retire. But he wanted one more year in Pinstripes, one more chance at a championship. One more season that would allow him to christen the new Stadium.

The Yankees knew Pettitte’s loyalty was unbreakable, especially this late in his career. There were no other teams circling, no other offers to take his eye off the target. It was the Bombers or no one else.

So they lowered their offer to $5.5 million and told him, sorry, you’re out of time and we’re out of money. GM Brian Cashman told his lefthander that the organization appreciated his years of excellence, but with $180 million being shelled out to Mark Teixeira and another $161 million to Sabathia, there were only a handful of dollars left for Pettitte.

Another pitcher would’ve spent the summer in a deep, angry funk. But not Pettitte, the most tolerant and forgiving man in the Yankees’ clubhouse.

Never once did he express regret at the way the Yankees conducted business. Never once did he consider going home. Once the contract was signed, skimpy as it was, Pettitte narrowed his focus to pitching, same as it ever was.

That’s what allowed him to come full circle – standing in front of an army of loyalists in the Bronx, shouting so loud he could be heard from here to Philly.

“This is a great, great night,” Pettitte said hoarsely, grinning broadly enough to turn his eyes into slits. That expression was priceless: the universe is finally ours.

Wednesday, November 04, 2009


By Ann Coulter
November 4, 2009

-- MSNBC, Aug. 31, 2009, Keith Olbermann on Robert F. McDonnell, Republican candidate for governor of Virginia:

"In [McDonnell's master's thesis], he described women having jobs as detrimental to the family, called legalized use of contraception illogical, pushed to make divorce more difficult, and insisted government should favor married couples over, quote, 'cohabitators, homosexuals or fornicators.' Wow. When did he write this? 1875? No, 1989. Wow, 1989.

"Goodbye, Mr. McDonnell."

-- MSNBC, Sept. 22, 2009, Rachel Maddow also on McDonnell:

"And here's where the conservative movement and the Republican establishment smash into each other like bumper cars without bumpers. Here's where Republican electoral chances stop being separate from the wild-eyed excesses of the conservative movement.

"Part of watching Republicans try to return to power is watching ... the conservative movement eat the Republican Party, eat their electoral chances over and over and over again."

On election night, conservatives-eating-Republicans resulted in an 18-point landslide for McDonnell, who beat his Democratic opponent 59 percent to 41 percent -- winning two-thirds of all independent voters and ending the Democrats' eight-year reign in the Virginia governor's office.

Republicans swept all statewide offices for the first time in 12 years, winning the races for lieutenant governor and attorney general, as well as assembly seats, garbage inspector, dog catcher and anything else Virginians could vote for.

To paraphrase a pompous blowhard: Goodbye, Mr. Democrat.

And that's not the most exciting news from election night! Astoundingly, Jon Corzine, the incumbent governor of heavily Democratic New Jersey -- a state that Barack Obama won by 16 points just a year ago -- lost by 5 points.

At 49 percent for Republican Chris Christie versus 44 percent for Corzine, the election wasn't even close enough to be stolen by ACORN. (Although Corzine did extremely well among underaged Salvadoran prostitutes living in government housing.)

The biggest winner election night was pollster Scott Rasmussen, who -- once again -- produced the most accurate poll results. New York Times poll: Corzine 40, Christie 37; Quinnipiac poll: Corzine 43, Christie 38; Rasmussen poll: Christie 46, Corzine 43.

The biggest loser was President Obama, who campaigned tirelessly for Corzine, even giving up golf on several occasions and skipping a quarter-million-dollar "date night" with Michelle to stump for the Democrat.

Just two days before the election, Obama was at a rally in New Jersey assuring voters that Corzine was "one of the best partners I have in the White House. We work together. ... Jon Corzine helped get this done."

Except the problem is that voting for Obama a year ago was a fashion statement, much like it was once a fad to buy Beanie Babies, pet rocks and Cabbage Patch Kids. But instead of ending up with a ridiculous dust-collector at the bottom of your closet, the Obama fad leaves you with higher taxes, a reduced retirement fund, no job and a one-year wait for an MRI.

That is why Corzine's defeat sounded the death knell for national health care.

The good news: Next time Corzine is in a major car accident after speeding on the New Jersey Turnpike, he'll be able to see a doctor right away.

The media will try to rescue health care by talking about nothing but the 23rd district of New York, where the Democrat won Tuesday night. Congratulations, Democrats -- you won a congressional seat in New York! Next up: A Catholic elected pope!

Far from an upset, the Democrats' winning the 23rd district was a long-term plan of the Obama White House. That's why Obama made John McHugh, the moderate Republican congressman representing the 23rd district, his Secretary of the Army earlier this year. The Democrats thought McHugh's seat would be easy pickings.

Only in the last week has everyone acted as if a Democratic victory in the 23rd district would be a shocking surprise -- an upset victory caused by puritanical Republicans staging inquisitions against "mainstream" Republican candidates like Dede Scozzafava, the designated "Republican" candidate in the special election.

This is preposterous -- there was absolutely nothing Republican about Scozzafava. As a supporter of partial-birth abortion, card-check union schemes and massive government spending programs, she was less Republican than John McCain.

Even Markos Moulitsas of Daily Kos called Scozzafava the most liberal candidate in the race -- which may explain why she was the choice of George Soros' Working Families Party and why she promptly endorsed the Democrat after withdrawing from the race last weekend.

Conservative opposition to Scozzafava hardly suggests that they plan to impose litmus tests on every Republican candidate in the 2010 elections.

Speaking of litmus tests, on MSNBC recently, liberal blogger Jane Hamsher said of the possibility that a blue dog Democrat would oppose national health care: "I dare Blanche Lincoln -- I dare Blanche Lincoln to join a filibuster. She'll draw primary opponents so fast it would make your head spin."

While I'm sure an out-of-touch liberal blogger from Hollywood knows more about Arkansas than an elected senator from that state, Hamsher's threat sounds more like an intra-party civil war than conservatives opposing a George Soros-supported Republican candidate in a New York congressional race.

Not only do conservatives not pick insane fights -- such as staging a 2006 primary fight against a recent vice presidential candidate because he supported the war in Iraq -- but conservatives are more popular than Republicans.

By contrast, liberals are less popular than Democrats. When conservatives take control of the Republican Party, Republicans win. When liberals take control of the Democratic Party, Democrats end up out of power for eight to 12 years.


True Conservatives Just Want a Turn

Conservatives have had to put up with a lot of moderation and ideological flexibility.

By Jonah Goldberg
November 04, 2009, 0:00 a.m.

If there’s one thing liberal pundits are experts on these days, it’s the sorry state of conservatism. The airwaves and op-ed pages brim with more-in-sorrow-than-in-anger lamentations on the GOP’s failure to get with President Obama’s program, the party’s inevitable demographic demise, and its thralldom to the demonic deities of the Right — Limbaugh, Beck, Palin.

Such sages as the New York Times’s Sam Tanenhaus and Frank Rich insist that the Right is out of ideas. After all, the religious dogmatism and “market fundamentalism” of the Bush administration were entirely discredited, leaving the GOP with its intellectual cupboard bare.

“During the two terms of George W. Bush,” Tanenhaus declares in his latest book, “conservative ideas were not merely tested but also pursued with dogmatic fixity.”

Even worse than being brain-dead, the right is black-hearted, hating good-and-fair Obama for his skin color and obvious do-goodery.

Predictably, Republican Dede Scozzafava’s withdrawal from the congressional race in New York’s 23rd District is not only proof the experts are right, but also conveniently a more important story than the Democrats’ parlous standing with voters. Don’t look at the imploding Democrats. No, let’s all titter at the cannibalistic “civil war” on the Right.

Frank Rich, gifted psephologist, finds the perfect parallel to the GOP’s squabbles in Stalin’s murderous purges.

“Though they constantly liken the president to various totalitarian dictators,” Rich writes, “it is they who are re-enacting Stalinism in full purge mode.” Stalin’s “full purge mode” involved the systematized exile and slaughter of hundreds of thousands (not counting his genocide of millions). The GOP’s purge has so far caused one very liberal Republican to halt her bid for Congress.

Let me offer a counter-theory, admittedly lacking in such color but making up for it with evidence and consideration of what conservatives actually believe.

After 15 or 20 years of steady moderation, many conservatives think it might be time to give their ideas a try.

Bush’s “compassionate conservatism” was promoted as an alternative to traditional conservatism. Bush promised to be a “different kind of Republican,” and he kept that promise. He advocated government activism, and he put our money where his mouth was. He federalized education with No Child Left Behind — co-sponsored by Teddy Kennedy — and oversaw the biggest increase in education spending in history (58 percent faster than inflation), according to the Heritage Foundation, while doing next to nothing to advance the conservative idea known as school choice.

With the prescription-drug benefit, he created the biggest new entitlement since the Great Society (Obama is poised to topple that record). Bush increased spending on the National Institutes of Health by 36 percent and international aid by 74 percent, according to Heritage. He oversaw the largest, most porktacular farm bills ever. He signed the Sarbanes-Oxley Act, a massive new regulation of Wall Street. His administration defended affirmative action before the Supreme Court.

He pushed amnesty for immigrants, imposed steel tariffs, supported Title IX, and signed the McCain-Feingold campaign-finance-reform legislation.

Oh, and he, not Obama, initiated the first bailouts and TARP.

Not all of these positions were wrong or indefensible. But the notion that Bush pursued conservative ideas with “dogmatic fixity” is dogmatic nonsense.

Most Democrats were blinded to all of this because of their anger over the Iraq War and an often irrational hatred of Bush. Republicans, meanwhile, defended Bush far more than they would have had it not been for 9/11 and the hysteria of his enemies.

In 2008, the primaries lacked a Bush proxy who could have siphoned off much of the discontent on the Right. Moreover, the party made the political calculation that John McCain — another unorthodox and inconsistent conservative — was the best candidate to beat Obama.

In short, conservatives have had to not only put up with a lot of moderation and ideological flexibility, we’ve had to endure nearly a decade of taunting from gargoyles insisting that the GOP is run by crazed radicals.

The rank and file might be wrong to want to get back to basics, but I don’t think so. With Obama racing to transform America into a European welfare state fueled by terrifying deficit spending, this seems like a good moment to argue for limited government.

Oh, and a little forgiveness, please, for not trusting the judgment of the experts who insist they know what’s happening on the racist, paranoid, market-fundamentalist, Stalinist Right.

— Jonah Goldberg is editor-at-large of National Review Online and the author of Liberal Fascism: The Secret History of the American Left from Mussolini to the Politics of Meaning. © 2009 Tribune Media Services, Inc.

The Surprising Fact of Morality

Evolutionists have some ingenious explanations for morality. But do they work?

By Dinesh D'Souza
November 04, 2009, 4:00 a.m.

Morality is both a universal and a surprising fact about human nature. When I say that morality is universal I am not referring to this or that moral code. In fact, I am not referring to an external moral code at all. Rather, I am referring to morality as the voice within, the interior source that Adam Smith called the “impartial spectator.” Morality in this sense is an uncoercive but authoritative judge. It has no power to compel us, but it speaks with unquestioned authority. Of course we can and frequently do reject what morality commands, but when we do so we cannot avoid guilt or regret. It is because of our capacity for self-impeachment and remorse that Aristotle famously called man “the beast with the red cheeks.” Aristotle’s description holds up very well more than 2,000 years later. Even people who most flagrantly repudiate morality — say, a chronic liar or a rapacious thief — nearly always respond to detection with excuses and rationalizations. They say, “Yes, I lied, but I had no alternative under the circumstances,” or “Yes, I stole, but I did so to support my family.” Hardly anyone says, “Of course I am a liar and a thief, and I don’t see anything wrong with that.” What this means is that morality supplies a universal criterion or standard even though this standard is almost universally violated.

Morality is a surprising feature of humanity because it seems to defy the laws of evolution. Evolution is descriptive: It says how we do behave. Morality is prescriptive: It says how we should behave. And beyond this, evolutionary behavior appears to run in the opposite direction from moral behavior. Evolution implies that we are selfish creatures who seek to survive and reproduce in the world. Indeed we are, but we are also unselfish creatures who seek the welfare of others, sometimes in preference to our own. We are participants in the game of life, understandably partial to our own welfare, while morality stands aloof, taking the impartial, or “God’s eye,” view, directing us to act in a manner conducive to the good of others. In sum, while evolution provides a descriptive account of human self-interest, morality provides a standard of human behavior that frequently operates against self-interest.

So if we are mere evolutionary primates, how to account for morality as a central and universal feature of our nature? Why would morality develop among creatures obsessively bent on survival and reproduction? Darwin himself recognized the problem. In The Descent of Man, Darwin argued that “although a high standard of morality gives but a slight or no advantage to each individual man and his children over the other men of the same tribe, yet . . . an advancement in the standard of morality will certainly give an immense advantage to one tribe over another.” Darwin’s point is that a tribe of virtuous patriots, with each of its members willing to make sacrifices for the group, would prove more successful and thus be favored by natural selection over a tribe of self-serving individuals. This is the group-selection argument, and for many decades it was considered an acceptable way to reconcile evolution with morality.

But as biologists now recognize, the argument has a fatal flaw. The question we have to ask is how a tribe of individuals would become self-sacrificing in the first place. Imagine a tribe where, for instance, many people shared their food with others or volunteered to defend the tribe from external attack. Now what would be the fate of individual cheaters who benefited from this arrangement but hoarded their own food and themselves refused to volunteer to fight? Clearly these scoundrels would have the best deal of all. In other words, cheaters could easily become free riders, benefiting from the sacrifices of others but making no sacrifices themselves, and they would be more likely to survive than their more altruistic fellow tribesmen.

In The Origins of Virtue Matt Ridley gives a more contemporary example. If everyone in a community could be relied on not to steal cars, cars would not have to be locked, and a great deal of expense would be saved on insurance, locking devices, and alarms. The whole community would be better off. But, Ridley notes, “In such a trusting world, an individual can make himself even better off by defecting from the social contract and stealing a car.” By this logic, even tribes that somehow started out patriotic and altruistic would soon become filled with self-serving cheaters. The free-rider problem doesn’t apply to all situations — there are very limited circumstances in which group selection still works — but its discovery has pretty much sunk Darwin’s group-selection argument as a general explanation for morality within an evolutionary framework.
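The free-rider logic Ridley describes can be sketched as a toy public-goods game. This is my own illustration, not an example from the article, and the benefit and cost figures are arbitrary; the point it demonstrates, though, is general: because the public benefit is shared by everyone while only contributors pay the cost, a cheater's payoff always exceeds a contributor's by exactly the cost of contributing.

```python
# Toy public-goods game: each contributor pays `cost` to generate
# `benefit`, which is split equally among the whole tribe. Cheaters
# enjoy the split without paying.

def payoffs(n_contributors: int, n_cheaters: int,
            benefit: float = 3.0, cost: float = 1.0):
    """Return (contributor_payoff, cheater_payoff) for one round."""
    group = n_contributors + n_cheaters
    shared = benefit * n_contributors / group   # everyone's equal share
    return shared - cost, shared                # contributors also pay

contributor_pay, cheater_pay = payoffs(9, 1)
print(contributor_pay, cheater_pay)
# The cheater comes out ahead by exactly `cost`, whatever the mix --
# which is why selection acting on individuals favors cheating.
```

However the tribe starts out, the strategy "don't contribute" earns strictly more in every round, which is the precise sense in which an initially altruistic tribe fills up with free riders.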

In the 1960s and early 1970s, biologists William Hamilton and Robert Trivers offered an entirely new and more promising way to solve the problem. Their work is summarized in Richard Dawkins’s best book, The Selfish Gene. Drawing on the research of Hamilton and Trivers, Dawkins argues that the basic unit of survival is not the individual but rather the gene. In one of his most memorable formulations, he writes that we individuals are “survival machines — robot vehicles blindly programmed to preserve the selfish molecules known as genes.” At first glance this seems like a crazy way to think about evolution, but Dawkins, employing a presuppositional argument of his own, notes that if we think of things in this way, we can explain morality to a degree that previously seemed impossible.

The ingenuity of the selfish-gene theory is that it explains morality as a result not of individual selfishness but rather of genetic selfishness. “Altruism,” writes biologist E. O. Wilson, “is conceived as the mechanism by which DNA multiplies itself.” This may seem like a cold way to think about altruism, but there is some logic behind it. Think of a mother who runs into a burning building to save her two children trapped inside. An act of pure maternal unselfishness? Well, it looks that way. But William Hamilton reminds us that 50 percent of a child’s genes come from the mother. If two or more children are involved, then it makes rational sense for a mother to jeopardize her own survival if she can enhance the prospects of her genes surviving through her offspring. What looks like altruism from the individual point of view can be understood as selfishness from the genetic point of view.
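The burning-building arithmetic is an informal statement of what biologists call Hamilton's rule: an altruistic act is favored when r × B > C, where r is genetic relatedness, B the benefit to relatives, and C the cost to the actor. The sketch below is mine, and the cost figure is an invented assumption for illustration; it just makes the one-child versus two-children comparison explicit.

```python
# Hamilton's rule: kin-directed altruism is favored by selection
# when r * B > C (relatedness times benefit exceeds cost).

def favored(r: float, benefit: float, cost: float) -> bool:
    """True when the relatedness-weighted benefit exceeds the cost."""
    return r * benefit > cost

# Mother and child share r = 0.5. Measure benefit and cost in units of
# "expected copies of her genes saved": each rescued child counts as 1,
# and suppose (hypothetically) the rescue risks 0.8 of her own
# reproductive value.
print(favored(0.5, benefit=1, cost=0.8))   # one child:  0.5 < 0.8 -> False
print(favored(0.5, benefit=2, cost=0.8))   # two children: 1.0 > 0.8 -> True
```

On these assumed numbers, risking her life for one child is not "genetically rational," but for two children it is, which is exactly the sense in which individual-level altruism can be gene-level selfishness.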

Morality, in Hamilton’s framework, is a form of nepotistic “kin selection.” This idea helps us understand why certain insects, birds, and animals endanger their own welfare to promote that of their fellow creatures. Vervet monkeys and prairie dogs, for instance, give warning calls that signal approaching predators, sometimes at the cost of becoming the target of those predators. Why would they risk their lives in this way? Kin selection holds that it is because they are genetically related to those they are helping. So there is an evolutionary payoff: The risk-takers are maximizing not their individual chance for survival but the chance for their genes to make it into future generations. From the gene’s point of view, helping one’s kin is simply a form of helping oneself.

But of course kin selection is a very limited explanation, in that it only accounts for why animals and people behave altruistically toward relatives. In life, however, humans and even some animals behave that way toward innumerable others who don’t share their genes. Robert Trivers argued that this is because of “reciprocal altruism.” A better term would be reciprocal bargaining: What Trivers means is that creatures behave generously toward others in the expectation that they will get something in return. Vampire bats, for instance, share food not only with relatives but also with other bats who have recently shared with them. Other animals also practice this kind of give-and-take. Trivers does not suggest that animals engage in conscious planning or deliberation; rather, he argues that natural selection has rewarded with survival the instincts for engaging in mutually beneficial exchanges. And of course in human society we routinely exchange favors with neighbors and acquaintances; we even do business with total strangers, all motivated by the principle of “you do something for me and I’ll do something for you.” So here too altruism is understood as a form of extended or long-term selfishness.
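The standard formal model of Trivers's idea is the iterated prisoner's dilemma, where the reciprocating strategy "tit for tat" (cooperate first, then mirror your partner's last move) fares well without any conscious deliberation. The sketch below is my own toy version, using conventional prisoner's-dilemma payoff values rather than anything from the article.

```python
# Iterated prisoner's dilemma: 'C' = cooperate, 'D' = defect.
# Conventional payoffs: mutual cooperation 3 each, mutual defection
# 1 each, a lone defector gets 5 while the exploited cooperator gets 0.
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(partner_history):      # reciprocator: mirror the partner
    return partner_history[-1] if partner_history else 'C'

def always_defect(partner_history):    # pure cheater
    return 'D'

def play(strat_a, strat_b, rounds=10):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a, b = strat_a(hist_b), strat_b(hist_a)
        pa, pb = PAYOFF[(a, b)]
        score_a, score_b = score_a + pa, score_b + pb
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # mutual reciprocity: (30, 30)
print(play(tit_for_tat, always_defect))  # exploited once, then retaliates
```

Two reciprocators earn far more between them than a reciprocator paired with a chronic defector, which is the evolutionary payoff behind the vampire bats' give-and-take: no planning is required, only an instinct to return favors and withhold them from cheaters.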

Even reciprocal altruism, however, cannot explain the good things that we do that offer no actual return. A fellow gets up to give his seat on the bus to an 80-year-old woman. No, she isn’t grandma, nor is it reasonable to say that he’s doing it so that next week she will give him her seat. So neither kin selection nor reciprocal altruism provides any solution in this case. Moreover, altruism of this sort occurs on a regular basis throughout human society. Many people give blood without any expectation of return. Others volunteer to help the severely disabled. Others donate money to purchase malaria nets or to assist AIDS victims in Africa. Still others agitate against animal abuse in their own community or sex trafficking in Thailand or religious persecution in Tibet. Throughout the centuries there have been people who have devoted themselves to improving the lives of impoverished strangers, or risked their lives to benefit folks who are unrelated to them and cannot possibly reciprocate these sacrifices.

Some biologists concede that evolution is at a loss here. “Altruism toward strangers,” writes biologist Ernst Mayr, “is a behavior not supported by natural selection.” Still, some diehard champions of evolution do try to accommodate such behavior within their evolutionary framework. Their best attempt is to argue that seemingly disinterested altruism toward strangers has a well-hidden personal motive. Essentially it is performed in order to enhance one’s social reputation. Reputation is valuable because it raises one’s position in society and perhaps even improves one’s mating prospects. Michael Shermer recognizes that it is possible to gain a good reputation by faking a dedication to the public welfare. He argues, however, that such schemes may well be exposed over time. According to Shermer, “The best way to convince others that you are a moral person is not to fake being a moral person but actually to be a moral person.” Psychologist David Barash makes the same point: “Be moral, and your reputation will benefit.” The motive here remains one of personal enhancement; we are helping others not for their sake but for our sake. Once again, morality is explained as the outward disguise of the selfish gene.

But Shermer and Barash never really contend with the Machiavellian objection to their argument. Machiavelli argues that “the man who wants to act virtuously in every way necessarily comes to grief among so many who are not virtuous.” A rich man who is habitually generous, Machiavelli remarks, will soon become a poor man. Much better, Machiavelli craftily counsels, to acquire the image of magnanimity while giving away as little as possible. In other words, it is preferable to seem virtuous than to actually be virtuous. “Everyone sees what you appear to be, few experience what you really are.” Machiavelli insists that the people who prosper most in the world are the ruthless people who employ virtue only occasionally and instrumentally for strategic gain. If Machiavelli is right, then under the rules of natural selection it is the moral pretenders, not the truly moral, who will prosper and multiply. And for empirical evidence Machiavelli could surely point to the successful connivers in our society and every other one.

Of course if there is cosmic justice in the afterlife, then the bad guys ultimately lose. We see this in a beautiful example from Dante’s Inferno, where in the circle of the fraudulent we encounter Guido da Montefeltro. Guido’s martial prowess as a Ghibelline general was largely due to his mastery of what he calls the “arts of the fox.” He is highly successful in his scams, and is never called to account. In short, he is a true Machiavellian. And late in life he dons the robes of a Franciscan friar, not because he repents of his earlier misdeeds, but in an attempt to fool God and make it to paradise. “And oh, to think it might have worked!” he says in one of the great lines of the Commedia. Unlike gullible humans, however, God can’t be duped, and so Guido gets his comeuppance.

As we see from this example, cosmic justice always evens the scales, but it simply defies reality to contend that in this world the scales are always even. Terrestrial justice is flawed and imperfect, and thus Barash and Shermer’s claim that morality always pays right here on earth isn’t very convincing.

— Dinesh D’Souza is the Rishwain fellow at the Hoover Institution. This is the second of a three-part adaptation from his just-published Life after Death: The Evidence.

— Dinesh D'Souza, the Rishwain fellow at the Hoover Institution, is author most recently of The Enemy at Home.

Tuesday, November 03, 2009

The Impartial Spectator

A moral argument for life after death.

By Dinesh D'Souza
November 03, 2009, 4:00 a.m.

To feel much for others and little for ourselves, to restrain our selfish and indulge our benevolent affections, constitutes the perfection of human nature.

— Adam Smith, The Theory of Moral Sentiments.

In this essay, I offer an original argument for life after death. This is called the presuppositional argument, and it requires a little clarification to show what kind of an argument it is and how it works. Imagine a detective who cannot figure out how his suspect could have committed the crime by himself. For instance, the suspect was indisputably in one location at the time when the body was dumped in another location. Our Lieutenant Columbo puzzles over this and then it hits him: The man must have had an accomplice. Assume an accomplice, and the otherwise inexplicable facts of the case now make sense. So there must have been an accomplice. And even though we don’t know anything about the accomplice, the detective’s hypothesis is persuasive to the degree that it explains the known facts of the case.

Here’s a second example. A woman is baffled by the fact that a man whom she has been dating for years keeps delaying his proposal of marriage. The man keeps telling her that he wants to wait for the right time. She agonizes over the question, “Why won’t he commit?” After a while the woman’s friends start telling her, “He will never marry you. He has no intention of marrying you.” The girlfriends have no direct knowledge of the man or his real intentions. Their assessment is, in this sense, purely conjectural. But it has the merit of being able to explain things that the alternative hypothesis cannot explain. How believable is it that the man who has procrastinated for so long will propose to this woman at some unspecified “right time”? It is much more reasonable to suppose that he is simply making excuses because he doesn’t want to get married, at least not to her. In both these examples there is a presupposition of a fact that is not directly known, but the presupposition is convincing because it makes sense of the facts that are known. The facts become, as it were, an empirical test of the validity of the presupposition.

Here is my presuppositional argument for life after death. Unlike material objects and all other living creatures, we humans inhabit two domains: the way things are, and the way things ought to be. In other words, we are moral animals who recognize that just as there are natural laws that govern every object in the universe, there are also moral laws that govern the behavior of one special set of objects in the universe, namely us. While the universe is externally moved by “facts,” we are internally moved also by “values.” Yet these values defy natural and scientific explanation, because the laws of nature, as discovered by science, concern only the way things are and not the way they ought to be. Moreover, the essence of morality is to curtail and contradict the powerful engine of human self-interest, giving morality an undeniable anti-evolutionary thrust. So how do we explain the existence of moral values that stand athwart our animal nature? The presupposition of cosmic justice, achieved not in this life but in another life beyond the grave, is by far the best and in some respects the only explanation. This presupposition fully explains why humans continue to espouse goodness and justice even when the world is evil and unjust.

Notice what the presuppositional argument does not say. It does not say that because there is injustice in the world there must be justice somewhere else. Nor does it say that the human wish for a better world is enough by itself to produce another world that is better. Rather, it begins with the recognition that while science explains much of nature very well, there is a big part of human nature that science does not seem to explain at all. In particular, evolution does a good job of accounting for why we are selfish animals, but it faces immense challenges in accounting for why we simultaneously hold that we ought not to be selfish. Far from facing the facts of life, like every other animal, we continue to cherish ideals that have never been and will never be fully achieved. We are flawed creatures who act as if we ought not to be. We know that we live in an unjust society where the bad guy often comes out on top and the good guy often comes to grief, yet we continue to hold that this is not how it should be. We continue to say things like “what goes around comes around” even though we know that in this world it is not always so. Despite the harsh facts of life, we tirelessly affirm that it should be so. Our ideals, in other words, contradict the reality of our lives. It seems that we, uniquely among all living and nonliving things, seek to repudiate the laws of evolution and escape the control of the laws of nature.

Now why is this? Why do we continue to operate as if there is a better world with a better set of ideals that stands in judgment of this world? I will argue that the best explanation is that there is such a world. In other words, the presupposition of an afterlife and the realization of the ideal of cosmic justice makes sense of our moral nature better than any competing hypothesis.

Before we launch into our discussion, I need to anticipate and answer an objection that will already be surfacing for a certain type of reader. Skeptics will at this point be reacting scornfully to my claim that there are certain features of human nature that seem to defy scientific explanation. The phrase that will be dancing on their lips is “the God of the gaps.” What they mean is that I am appealing to God and the supernatural to account for things that science has not yet explained. As Carl Sagan wrote in The Varieties of Scientific Experience, “As science advances, there seems to be less and less for God to do.” For the skeptic, the appeal to gaps is a completely illegitimate mode of argument; just because science doesn’t have the answer now, that doesn’t mean it will not have the answer tomorrow, or at any rate someday. In this view, the God of the gaps is the last desperate move of the theist, who searches for the little apertures in the scientific understanding of the world and then hands over those areas to his preferred deity.

Some creationists do employ this kind of “gaps” reasoning in order to posit a supernatural creator. For instance, they contend that science cannot account for the Cambrian explosion, so God must have directly done that. But there is no reason to think that the Cambrian explosion defies natural explanation, even if we don’t have that explanation. So the skeptic’s “gaps” critique works against this type of opponent. But it doesn’t work with me, because my argument does not rely on God at all. In addition, while the skeptic typically fancies himself a champion of science, his whole line of argument is no less unscientific than that of the creationist. For the skeptic a gap is a kind of nuisance, a small lacuna in scientific knowledge that is conceded to exist as a kind of misfortune, and is expected soon to be cleared up. True scientists, by contrast, love and cherish gaps. They seek out gaps and work laboriously within these crevices because they hope that, far from being a small missing piece of the puzzle, the gap is actually an indication that the whole underlying framework is wrong, that there is a deeper framework waiting to be uncovered, and that the gap is the opening that might lead to this revolutionary new understanding.

Gaps are the mother lode of scientific discovery. Most of the great scientific advances of the past began with gaps and ended with new presuppositions that put our whole comprehension of the world in a new light. The presuppositional argument, in other words, is not some funny way of postulating unseen entities to account for seen ones, but rather is precisely the way that science operates and that scientists make their greatest discoveries. Copernicus, for example, set out to address the gaps in Ptolemy’s cosmological theory. As historian Thomas Kuhn shows, these gaps were well recognized, but most scientists did not consider their existence to be a crisis. After all, experience seemed heavily on the side of Ptolemy: The earth seems to be stationary, and the sun looks as if it moves. Kuhn remarks that many scientists sought to fill in the gaps by “patching and stretching,” i.e., by adding more Ptolemaic epicycles.

Copernicus, however, saw the gaps as an opportunity to offer a startling new hypothesis. He suggested that instead of taking it for granted that the earth is at the center of the universe and the sun goes around the earth, we suppose that the sun is at the center, and the earth and the other planets all go around the sun. When Copernicus proposed this, he had no direct evidence that it was the case, and he recognized that his theory violated both intuition and experience. Even so, he said, the presupposition of heliocentrism gives a better explanation of the astronomical data and therefore should be accepted as correct. Here is a classic presuppositional argument that closes a gap and in the process gives us a completely new perspective on our place in the universe.

Similarly, Einstein confronted gaps in the attempt of classical physics to reconcile the laws of motion with the laws of electromagnetism. Again, there were many who didn’t consider the gap to be very serious. Surely classical Newtonian science would soon figure things out, and the gap would be closed. It took Einstein’s genius to see that this gap was no small problem; rather, it indicated a constitutional defect with Newtonian physics as a whole. And without conducting a single experiment or empirical test, Einstein offered a presuppositional solution. He said that we have assumed for centuries that space and time are absolute, and this has produced some seemingly insoluble problems. So what if we change the assumption? What if we say that space and time are relative to the observer? Now we can explain observed facts about electromagnetism and the speed of light that could not previously be accounted for.

Einstein was able to test his theory by applying it to the orbital motion of the planet Mercury. Mercury was known to deviate very slightly from the path predicted by Newton’s laws. Another gap! And once again there was a prevailing complacent attitude that some conventional scientific explanation would soon close the gap and settle the anomaly. But in fact the gap was a clue that the entire Newtonian paradigm was inadequate. Einstein recognized his theory as superior to Newton’s when he saw that it explained the orbital motion of Mercury in a way that Newton couldn’t.

In the last few decades, scientists have accepted the existence of dark matter and dark energy, again on the basis of presuppositional arguments. Here too the problem arose from some gaps. When scientists measured the amount of matter in the universe, it was not enough to hold the galaxies together. When they measured the amount of energy, it was not enough to account for the accelerating pace of the expansion of the universe. Of course these could be considered mere gaps, soon to be eliminated with some new observation or equation, but leading scientists knew better. They recognized that we already know about the matter and energy that our instruments can measure, and these simply cannot account for the behavior of the universe and the galaxies. Consequently, there has to be some other kind of matter and energy, undetectable by all current scientific equipment and following no known scientific law. The gap, in other words, required a reformulation of the entire scientific understanding of matter and energy. On this basis, leading scientists posited the existence of dark matter and dark energy, and, despite initial skepticism, most scientists have accepted their existence because they help to explain phenomena that would otherwise remain unexplained.

From these examples, we learn that science regularly posits unseen entities, from space-time relativity to dark matter, whose existence is affirmed solely on the basis that they explain the things that we can see and measure. We also learn that gaps are a good thing, not a bad thing, and the genuinely scientific approach is to ask whether they are clues that lead to a broader and deeper comprehension of things. We also learn how presuppositional arguments work best, both in science and outside of science. The presupposition itself is a kind of hypothesis. It says, “This is the way things have to be in order to make sense of the world.” We then test the presupposition by saying, “How well does it explain the world?” We cannot answer this question without asking, “Are there alternative explanations that work better?” If so, then we can do without the presupposition. If not, then the presupposition, unlikely though it may seem, remains the best explanation of the data that we have before us. We have to accept what it posits until a better explanation comes along. My hypothesis on offer is that “There has to be cosmic justice in a world beyond the world in order to make sense of the observed facts about human morality.” Let us proceed to test this hypothesis.

— The preceding is the first of a three-part adaptation from Dinesh D’Souza’s just-published Life after Death: The Evidence. The next installment will appear tomorrow.

— Dinesh D'Souza, the Rishwain fellow at the Hoover Institution, is author most recently of The Enemy at Home.