Saturday, February 26, 2011

The OIC and the Caliphate

The Islamic agenda is not coexistence, but dominion.

By Andrew C. McCarthy
February 26, 2011 4:00 A.M.

The Organization of the Islamic Conference is the closest thing in the modern world to a caliphate. It is composed of 57 members (56 sovereign states and the Palestinian Authority), joining voices and political heft to pursue the unitary interests of the ummah, the world’s 1.4 billion Muslims. Not surprisingly, the OIC works cooperatively with the Muslim Brotherhood, the world’s most extensive and important Islamist organization, and one that sees itself as the vanguard of a vast, grass-roots movement — what the Brotherhood itself calls a “civilizational” movement.

Muslims are taught to think of themselves as a community, a single Muslim Nation. “I say let this land burn. I say let this land go up in smoke,” Ayatollah Ruhollah Khomeini famously said of his own country in 1980, even as he consolidated his power there, even as he made Iran the point of his revolutionary spear. “We do not worship Iran, we worship Allah.” Muslims, he taught, were not interested in maintaining the Westphalian system of nation-states. According to Khomeini, who was then regarded by East and West as Islam’s most consequential voice, any country, including his own, could be sacrificed in service of the doctrinal imperative that “Islam emerges triumphant in the rest of the world.”

Because of that doctrinal imperative, the caliphate retains its powerful allure for believers. Nevertheless, though Islamists are on the march, it has somehow become fashionable to denigrate the notion that the global Islamic caliphate endures as a mainstream Islamic goal.

It was only a week ago that close to 2 million Muslims jammed Tahrir Square to celebrate the triumphant return to Egypt of Sheikh Yusuf Qaradawi, a Khomeini-esque firebrand who pulls no punches about Islam’s goal to “conquer America” and “conquer Europe.” Yet, to take these threats seriously is now to be dismissed as a fringe lunatic, a Luddite too benighted to grasp that American principles reflect universally held truths — truths to which the ummah, deep down, is (so we are told) every bit as committed as we are.

The caliphate is an institution of imperial Islamic rule under sharia, Muslim law. Not content with empire, Islam anticipates global hegemony. Indeed, mainstream Islamic ideology declares that such hegemony is inevitable, holding to that belief every bit as sincerely as the End of History crowd holds to its conviction that its values are everyone’s values (and the Muslims are only slightly less willing to brook dissent). For Muslims, the failure of Allah’s creation to submit to the system He has prescribed is a blasphemy that cannot stand.

The caliphate is an ideal now, much like the competing ideal of a freedom said to be the yearning of every human heart. Unlike the latter ideal, the caliphate had, for centuries, a concrete existence. It was formally dissolved in 1924, a signal step in Kemal Atatürk’s purge of Islam from public life in Turkey. Atatürk, too, thought he had an early line on the End of History. One wonders what he’d make of Erdogan’s rising Islamist Turkey.

What really dissolved the Ottoman caliphate was not anything so contemporary as a “freedom agenda,” or a “battle for hearts and minds.” It was one of those quaint military wars, waged under the evidently outdated notion that Islamic enemies were not friends waiting to happen — that they had to be defeated, since they were not apt to be persuaded.

It was, I suppose, our misfortune in earlier times not to have had the keen minds up to the task of vanquishing “violent extremism” by winning a “war of ideas.” We had to make do with dullards like Winston Churchill, who actually thought — get this — that there was a difference worth observing between Islamic believers and Islamic doctrine.

“Individual Muslims,” Churchill wrote at the turn of the century, demonstrated many “splendid qualities.” That, however, did not mean Islam was splendid or that its principles were consonant with Western principles. To the contrary, Churchill opined, “No stronger retrograde force exists in the world.” Boxed in by rigid sharia, Islam could only “paralyse the social development of those who follow it.” Reason had evolved the West, but Islam had revoked reason’s license in the tenth century, closing its “gates of ijtihad” — its short-lived tradition of introspection. Yet, sharia’s rigidity did not render Islam “moribund.” Churchill recognized the power of the caliphate, of the hegemonic vision. “Mohammedanism,” he concluded, remained “a militant and proselytising faith.”

As I recounted in The Grand Jihad, Churchill’s views were not eccentric. A modern scholar of Islam, Andrew Bostom, recalls the insights of C. Snouck Hurgronje, among the world’s leading scholars of Islam during World War I. In 1916, even in the dark hours of Ottoman defeat, he marveled at the grip the concept of Islamic hegemony continued to hold on the Muslim masses:

It would be a gross mistake to imagine that the idea of universal conquest may be considered as obliterated. . . . The canonists and the vulgar still live in the illusion of the days of Islam’s greatness. The legists continue to ground their appreciation of every actual political condition on the law of the holy war, which war ought never be allowed to cease entirely until all mankind is reduced to the authority of Islam — the heathen by conversion, the adherents of acknowledged Scripture [i.e., Jews and Christians] by submission.

Muslims, of course, understood the implausibility of achieving such dominance in the near term. Still, Hurgronje elaborated, the faithful were “comforted and encouraged by the recollection of the lengthy period of humiliation that the Prophet himself had to suffer before Allah bestowed victory upon his arms.” So even as the caliphate lay in ruins, the conviction that it would rise again remained a “fascinating influence” and “a central point of union against the unfaithful.”

Today, the OIC is Islam’s central point of union against the unfaithful. Those who insist that the 1,400-year-old dividing line between Muslims and non-Muslims is ephemeral, that all we need is a little more understanding of how alike we all really are, would do well to consider the OIC’s Cairo Declaration of 1990. It is the ummah’s “Declaration of Human Rights in Islam,” proclaimed precisely because Islamic states reject the 1948 Universal Declaration of Human Rights promulgated by the United Nations under the guidance of progressives in the United States and the West. That is, the leaders of the Muslim world are adamant that Western principles are not universal.

They are quite right about that. The Cairo Declaration boasts that Allah has made the Islamic ummah “the best community . . . which gave humanity a universal and well-balanced civilization.” It is the “historical role” of the ummah to “civilize” the rest of the world — not the other way around.

The Declaration makes abundantly clear that this civilization is to be attained by adherence to sharia. “All rights and freedoms” recognized by Islam “are subject to the Islamic Shari’ah,” which “is the only source of reference for [their] explanation or clarification.” Though men and women are said by the Declaration to be equal in “human dignity,” sharia elucidates their very different rights and obligations — their basic inequality. Sharia expressly controls freedom of movement and claims of asylum. The Declaration further states that “there shall be no crime or punishment except as provided for in Shari’ah” — a blatant reaffirmation of penalties deemed cruel and unusual in the West. And the right to free expression is permitted only insofar as it “would not be contrary to the principles of Shari’ah” — meaning that Islam may not be critically examined, nor will the ummah abide any dissemination of “information” that would “violate sanctities and the dignity of Prophets, undermine moral and ethical Values, or disintegrate, corrupt or harm society, or weaken its faith.”

Americans were once proud to declare that their unalienable rights came from their Creator, the God of Judeo-Christian scripture. Today we sometimes seem embarrassed by this fundamental conceit of our founding. We prefer to trace our conceptions of liberty, equality, free will, freedom of conscience, due process, privacy, and proportional punishment to a humanist tradition, haughty enough to believe we can transcend the transcendent and arrive at a common humanity. But regardless of which source the West claims, the ummah rejects it and claims its own very different principles — including, to this day, the principle that it is the destiny of Islam not to coexist but to dominate.

We won’t have an effective strategy for dealing with the ummah, and for securing ourselves from its excesses, until we commit to understanding what it is rather than imagining what it could be.

— Andrew C. McCarthy, a senior fellow at the National Review Institute, is the author, most recently, of The Grand Jihad: How Islam and the Left Sabotage America.



Friday, February 25, 2011

Today's Tune: Bruce Springsteen - The Brokenhearted

Obama’s Wishful Thinking

By Robert Spencer
February 25, 2011

In Barack Obama’s statement on the uprising in Libya Wednesday, he asserted somewhat counterfactually that “throughout this period of unrest and upheaval across the region the United States has maintained a set of core principles which guide our approach.” He added that “these principles apply to the situation in Libya” – and as he delineated them further, it became clear that he was siding strongly with the Libyan people and other Middle Eastern protesters, and that he was assuming that the recent Middle Eastern uprisings were all idealistic, humanistic pro-democracy movements. In reality, they’re anything but.

Obama condemned “the use of violence in Libya,” declaring that “the suffering and bloodshed is outrageous and it is unacceptable. So are threats and orders to shoot peaceful protesters and further punish the people of Libya.” He affirmed that “the United States also strongly supports the universal rights of the Libyan people,” and enumerated several of those rights: “That includes the rights of peaceful assembly, free speech, and the ability of the Libyan people to determine their own destiny.”

That phrasing itself suggested that Obama envisioned the crowds thronging the streets of Tripoli, crying out for Gaddafi’s blood and holding up pictures of him with Stars of David drawn on his forehead, as something akin to the Founding Fathers of the United States of America in Congress assembled. He saw Jefferson and Madison elsewhere, also, as he added that “even as we are focused on the urgent situation in Libya,” his Administration was working to determine “how the international community can most effectively support the peaceful transition to democracy in both Tunisia and in Egypt.”

Obama expressed satisfaction that “the change that is taking place across the region is being driven by the people of the region. This change doesn’t represent the work of the United States or any foreign power. It represents the aspirations of people who are seeking a better life.” And he quoted a Libyan who said: “We just want to be able to live like human beings.” In conclusion, he vowed that “throughout this time of transition, the United States will continue to stand up for freedom, stand up for justice, and stand up for the dignity of all people.”

The one thing the President didn’t explain was on what basis he believed that the Libyan (and Tunisian and Egyptian) people themselves were interested in principles and rights such as the freedom of speech and the dignity of all people, or held an understanding of freedom and justice remotely comparable to that of the American Constitutional system.

Unfortunately for him, there are numerous signs that they don’t. It is not insignificant vandalism that protesters in Libya have marked Gaddafi’s picture with the Star of David; rather, it is an indication of the protesters’ worldview, and of the pervasiveness of Islamic anti-Semitism. When Muslim protesters want to portray someone as a demon, they paint a Star of David on his picture. This also shows the naivete of Obama and others who insist that the demonstrators in Libya, Egypt (where the Star of David was drawn on Mubarak’s picture also) and elsewhere in the Middle East are pro-democracy secularists. They may be pro-democracy insofar as they want the will of the people to be heard, but given their worldview, their frame of reference, and their core assumptions about the world, if that popular will is heard, it will likely result in huge victories for the Muslim Brotherhood and similar pro-Sharia groups. Hence the ubiquitous chant of the Libyan protesters: not “Give me liberty or give me death,” but “No god but Allah!”

It’s also hard to reconcile Obama’s warm approval of these protests as “being driven by the people of the region” with the clear indications that the mood of the “people of the region” is decidedly anti-American. Even before CBS reporter Lara Logan was brutally raped in Cairo’s crowded Tahrir Square by a mob chanting, “Jew! Jew!,” several other mainstream media reporters from the United States were roughed up or otherwise imperiled, including Anderson Cooper and Christiane Amanpour. These two hard-Left journalists have repeatedly insisted that Islam is a Religion of Peace and that anyone who said otherwise was bigoted and racist; in Cairo, they ran up against the buzzsaw of reality.

Meanwhile, also in Egypt last Friday one of the biggest crowds of the entire Egyptian revolution thronged to Tahrir Square to hear Sheikh Yusuf al-Qaradawi, one of the most influential Muslim clerics in the world. Qaradawi, although he has been praised as a “reformist,” is actually a fanatical anti-Semite who is barred from entering the U.S., has given Islamic theological justification to suicide attacks against Israeli civilians, endorsed the death penalty for apostasy, and boasted that Islam would soon conquer Europe. Last Monday he called for the murder of Libya’s Gaddafi. The enthusiastic reception Qaradawi received in Cairo on Friday, along with the barring of secular liberal Wael Ghonim from the same stage, was an ominous sign that genuine democracy is not in the offing in Egypt.

The likelihood that the Muslim Brotherhood would play a significant role in a post-Mubarak Egyptian government led two thousand Christians to mount a protest in Cairo last week, calling for a change in Egypt’s Constitution to guarantee a secular state. Although Egypt does not fully implement Islamic law, Article 2 of its Constitution currently stipulates that “Islam is the religion of the state. Arabic is its official language, and the principal source of legislation is Islamic jurisprudence (Sharia).” Protesters chanted: “Tell the people that the revolution is a cross and crescent!” One declared: “We sacrificed our souls for the sake of Egypt, and our aim was a civil state not a religious one. I came here to ask for equality, the Constitution has to be changed and article 2 removed.”

Nothing seems less likely to happen than that; the momentum is moving in the opposite direction. Aware of this fact, Russian President Dmitry Medvedev was more realistic than Obama when he remarked on the Middle East situation Tuesday. “These states are difficult,” Medvedev said, “and it is quite probable that hard times are ahead, including the arrival at power of fanatics. This will mean fires for decades and the spread of extremism.”

Obama should take heed, since – at least for the next two years – those fires will be his responsibility.

Article printed from FrontPage Magazine:

No matter what happens in NFL labor negotiations, the players pay the price

By Sally Jenkins
Washington Post Staff Writer
Thursday, February 24, 2011; 12:18 AM

From the exasperated fan's perspective, the NFL labor negotiation looks like an argument between billionaires and millionaires. But the truth is, no matter how well the union bargains, most players will end up broke - and broken in body. With all the picketing and anti-union sentiment swirling around, it's tempting to view players as lacking serious grievances compared with, say, public school teachers in Wisconsin. But let's pause a moment and carve up their paychecks in real terms.

The average NFL player lasts just 3.3 seasons, and most of his salary, no matter how high on paper, isn't guaranteed. The league minimum for a rookie is $310,000, and the median league salary is just less than $800,000. That's wildly extravagant - isn't it? Let's see.

Sixty-three percent of all NFL players suffered at least one injury last year. The suicide rate among ex-NFL players is six times the national average, according to a Web site dedicated to helping former players adjust to retirement. A recent clinical survey found they are three times more likely than other men their age to abuse prescription medication.

Say a guy gets drafted and meets the average, plays for three and a half years. Let's be generous and award him the median salary. He should walk away with at least a cool $2.4 million.


Hold on. Three percent off the top goes to his agent. Slice off another 40 percent because he's in the highest tax bracket. So there goes 43 cents on the dollar.
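The column's back-of-envelope math can be checked in a few lines. A minimal sketch, using the article's own figures (roughly three seasons at the ~$800,000 median salary, a 3 percent agent fee, and a flat 40 percent top-bracket tax rate, which is the column's simplification, not a real tax calculation):

```python
# Back-of-envelope check of the column's paycheck arithmetic.
# All figures are the article's: ~3 seasons at the ~$800,000 median,
# 3% to the agent, and a flat 40% top-bracket tax (a simplification).
career_earnings = 3 * 800_000        # "a cool $2.4 million"
agent_fee = 0.03 * career_earnings   # 3 percent off the top
taxes = 0.40 * career_earnings       # highest bracket, flat for simplicity
take_home = career_earnings - agent_fee - taxes

print(f"Gross career earnings: ${career_earnings:,.0f}")
print(f"Kept per dollar: {take_home / career_earnings:.2f}")  # 0.57
print(f"Left before advisers, trainers, and legal fees: ${take_home:,.0f}")
```

That confirms the "43 cents on the dollar" figure: the player keeps 57 cents of each dollar before any of the other expenses the column goes on to list.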

He also has to pay a financial adviser, and he's got legal fees.

He needs a specialized personal trainer, too, because his body is his living, plus training equipment, nutritional supplements, and a good computer to study game tape on, all at peril of being judged overpaid.

Some of this he can write off, if he remembers to keep the receipts, but the IRS tends to be strict and audits about 20 percent of all NFL players - perhaps because they're reportedly so overpaid.

A job in the NFL is not a Hallmark card, and it's not nearly as secure as most union jobs. It's a grinding, dangerous, painful, short-lived pursuit, so abbreviated that it hardly qualifies as a profession in the way the rest of us define the word, and it comes at a heavy, heavy cost.

Whenever you're tempted to yell at a player to try working for living, or to go dig a ditch, remember that by age 50 he may not be able to.

"When we think about a union job, usually they work 25 or 30 years," says John Hogan, an Atlanta-based attorney who represents NFL players in disability cases. "Although in the public sector it's been abused a bit, when you think about a good job with a union you're set for life with pension and disability. And that's where the players' union comes up short, in light of the fact that they play such a brutal game. I just think they haven't shown the leadership of fully providing for you for your lifetime, not just a few years."

The unfortunate truth is that for all of their skirmishing, neither the owners nor the NFLPA is serving the long-term interests of players particularly well. Both the union and the owners have concentrated their public statements on the league's $9.3 billion in revenue, and how to split it.

There has been less attention devoted to the so-called smaller issue of health care. There are rumors that the union may ask for the health care plan to be extended from five years to 10 years.

But that's still inadequate, according to Hogan.

"I almost think that's worse because a lot of the disabilities we see manifest themselves 10 to 15 years after you stop playing, the cognitive, and the degenerative. Five years doesn't cover it, and extending it to 10 doesn't cover it. And 10 years out, they're probably liable to be less insurable. I'd like to see some sort of lifetime medical for their football-related problems."

While NFL owners and players were haggling over pay cuts and revenues with a federal mediator this week, a former Super Bowl champion named Dave Duerson shot himself in the chest to save his concussed brain for science, and by the way, he died bankrupt too.

Duerson was not some overpaid deadbeat or goldbricker. He was a four-time Pro Bowler, a Notre Dame grad with an economics degree, and the Walter Payton Man of the Year. He should have been one of the league's most successful alums. How could such a thing happen?

Duerson's circumstances were not unlike those described above. After he retired from the NFL he started a food business that was a success for a while, but then went south.

A lawsuit he won never paid off. He owed the IRS $47,000 in back taxes, another $70,000 in a divorce settlement, and $9 million on a business loan. Last year, he made just $16,800, and he feared he might have chronic traumatic encephalopathy, the concussion-related disease.

While we can't know exactly what drove Duerson to take his life, we can assume he was in both physical and psychological pain.

Recently, researchers at Washington University School of Medicine in St. Louis released the results of interviews with 644 former players, averaging 48 years of age, who played in the league between 1979 and 2006. The survey found that 93 percent of them suffered some level of pain, and 73 percent described that pain as moderate to severe.

This is not just a player concern. It concerns all fans, because as taxpayers we are all taking on the cost of medical care for ex-players who can't get health insurance and are forced to turn to Medicare.

With pre-existing conditions that were more than likely inflicted by football, a majority of them are uninsurable.

When a player needs spinal surgery, or a hip replacement, taxpayers are picking up the tab.

So maybe it would be better for us if they really were overpaid.

Rubicon: A river in Wisconsin

By Charles Krauthammer
The Washington Post
Friday, February 25, 2011

The magnificent turmoil now gripping statehouses in Wisconsin, Ohio, Indiana and soon others marks an epic political moment. The nation faces a fiscal crisis of historic proportions and, remarkably, our muddled, gridlocked, allegedly broken politics have yielded singular clarity.

At the federal level, President Obama's budget makes clear that Democrats are determined to do nothing about the debt crisis, while House Republicans have announced that beyond their proposed cuts in discretionary spending, their April budget will actually propose real entitlement reform. Simultaneously, in Wisconsin and other states, Republican governors are taking on unsustainable, fiscally ruinous pension and health-care obligations, while Democrats are full-throated in support of the public-employee unions crying, "Hell, no."

A choice, not an echo: Democrats desperately defending the status quo; Republicans charging the barricades.

Wisconsin is the epicenter. It began with economic issues. When Gov. Scott Walker proposed that state workers contribute more to their pension and health-care benefits, he started a revolution. Teachers called in sick. Schools closed. Demonstrators massed at the capitol. Democratic senators fled the state to paralyze the Legislature.

Unfortunately for them, that telegenic faux-Cairo scene drew national attention to the dispute - and to the sweetheart deals the public-sector unions had negotiated for themselves for years. They were contributing a fifth of a penny on a dollar of wages to their pensions and one-fourth what private-sector workers pay for health insurance.

The unions quickly understood that the more than 85 percent of Wisconsin not part of this privileged special-interest group would not take kindly to "public servants" resisting adjustments that still leave them paying less for benefits than private-sector workers. They immediately capitulated and claimed they were only protesting the other part of the bill, the part about collective-bargaining rights.

Indeed. Walker understands that a one-time giveback means little. The state's financial straits - a $3.6 billion budget shortfall over the next two years - did not come out of nowhere. They came largely from a half-century-long power imbalance between the unions and the politicians with whom they collectively bargain.

In the private sector, the capitalist knows that when he negotiates with the union, if he gives away the store, he loses his shirt. In the public sector, the politicians who approve any deal have none of their own money at stake. On the contrary, the more favorably they dispose of union demands, the more likely they are to be the beneficiary of union largess in the next election. It's the perfect cozy setup.

To redress these perverse incentives that benefit both negotiating parties at the expense of the taxpayer, Walker's bill would restrict future government-union negotiations to wages only. Excluded from negotiations would be benefits, the more easily hidden sweeteners that come due long after the politicians who negotiated them are gone. The bill would also require that unions be recertified every year and that dues be voluntary.

Recognizing this threat to union power, the Democratic Party is pouring money and fury into the fight. Fewer than 7 percent of private-sector workers are unionized. The Democrats' strength lies in government workers, who now constitute a majority of union members and provide massive support to the party. For them, Wisconsin represents a dangerous contagion.

Hence the import of the current moment - its blinding clarity. Here stand the Democrats, avatars of reactionary liberalism, desperately trying to hang on to the gains of their glory years - from unsustainable federal entitlements for the elderly enacted when life expectancy was 62 to the massive promissory notes issued to government unions when state coffers were full and no one was looking.

Obama's Democrats have become the party of no. Real cuts to the federal budget? No. Entitlement reform? No. Tax reform? No. Breaking the corrupt and fiscally unsustainable symbiosis between public-sector unions and state governments? Hell, no.

We have heard everyone - from Obama's own debt commission to the chairman of the Joint Chiefs of Staff - call the looming debt a mortal threat to the nation. We have watched Greece self-immolate. We can see the future. The only question has been: When will the country finally rouse itself?

Amazingly, the answer is: now. Led by famously progressive Wisconsin - Scott Walker at the state level and Budget Committee Chairman Paul Ryan at the congressional level - a new generation of Republicans has looked at the debt and is crossing the Rubicon. Recklessly principled, they are putting the question to the nation: Are we a serious people?

Thursday, February 24, 2011

Today's Tune: Steve Earle - Galway Girl (Live)

For Berra and Guidry, It Happens Every Spring

The New York Times
February 23, 2011

Photo: Edward Linsmier for The New York Times. When Yogi Berra arrived on Tuesday afternoon at Tampa International Airport, Ron Guidry was waiting for him.

TAMPA, Fla. — With all the yearly changes made by the Yankees, Yogi Berra’s arrival at their spring training base adds a timeless quality to baseball’s most historic franchise.

Berra, the catching legend and pop culture icon, slips back into the uniform with the famous and familiar No. 8. He checks into the same hotel in the vicinity of George M. Steinbrenner Field and requests the same room. He plans his days methodically — wake up at 6 a.m., breakfast at 6:30, depart for the complex by 7 — and steps outside to be greeted by the same driver he has had for the past dozen years.

The driver has a rather famous name, and nickname, as well.

“It’s like I’m the valet,” said Ron Guidry, the former star pitcher known around the Yankees as Gator for his Louisiana roots. “Actually, I am the valet.”

When Berra arrived on Tuesday afternoon from New Jersey for his three- to four-week stay, Guidry, as always, was waiting for him at Tampa International Airport. Since 1999, when Berra ended his 14-year boycott of the team and forgave George Steinbrenner for having fired him as manager in 1985 through a subordinate, Guidry has been his faithful friend and loyal shepherd.

Guidry had a custom-made cap to certify his proud standing. The inscription reads, “Driving Mr. Yogi.”

“He’s a good guy,” Berra, the Yankees’ 85-year-old honorary patriarch, said during an interview at his museum in Little Falls, N.J. “We hang out together in spring training.”

By “hanging out,” Berra means being in uniform with the Yankees by day and having dinner with Guidry by night. That is, until Guidry, who loves to cook and rents a two-bedroom apartment across the road from where Berra stays, demands a break from their spring training rotation of the five restaurants that meet Berra’s approval.

“See, I really love the old man, but because of what we share — which is something very special — I can treat him more as a friend and I can say, ‘Get your butt in my truck or you’re staying,’ ” Guidry said. “He likes that kind of camaraderie, wants to be treated like everybody else, but because of who he is, that’s not how everybody around here treats him.

“So I’ll say, ‘Yogi, tonight we’re going to Fleming’s, then to Lee Roy Selmon’s tomorrow, and then the night after that you stay in your damn room, have a ham sandwich or whatever, because the world doesn’t revolve around you and I’m taking a night off.’ ”

Berra played 18 years for the Yankees, from 1946 to 1963, and was part of 10 World Series champions. Guidry pitched from the mid-1970s through 1988, played on two World Series winners and was a Cy Young Award winner in 1978, when he was 25-3 with a 1.74 earned run average.

While Guidry was blossoming into one of baseball’s premier left-handers, Berra was a coach on Manager Billy Martin’s staff (and later became Guidry’s manager). They dressed at adjacent stalls in the clubhouse of the old Yankee Stadium. Eager to learn, Guidry would pepper Berra with questions about what he, as a former catcher, thought of hitters.

Berra would say, “You got a great catcher right over there,” nodding in the direction of Thurman Munson. But Guidry persisted, and their bond was formed.

During Berra’s self-imposed absence, Guidry saw him only on occasion, at card-signing shows and at Berra’s charity golf tournament near his home in Montclair, N.J. When Berra returned, the retired players he knew best were no longer part of the spring training instructional staff.

“There was really nobody else that he had to sit and talk with, to be around after the day at the ballpark,” Guidry said. “So I just told him, ‘I’ll pick you up, we’ll go out to supper,’ and that’s how it started. It wasn’t like I planned it. It just developed.”

Photo: Barton Silverman/The New York Times. Yogi Berra with Ron Guidry at Yankees spring training in Tampa, Fla., on Wednesday.

In offering his companionship, Guidry discovered that he was the luckier side of the partnership spanning generations of Yankees greatness.

“I never got to pitch against Ted Williams, for example,” Guidry said. “I’d say, ‘Yogi, when you guys had to go to Boston and you had to face Williams, how did you work him?’ You know, he’s like an encyclopedia, and that’s what I loved, all the stories and just being with him. If he’s not the most beloved man in America, I don’t know who is.”

Berra’s wife, Carmen, typically joins her husband in Tampa during spring training, but charity and family obligations generally limit her time here to a few days. Guidry, she said, has been “so special to Yogi, like a member of the family.”

He has asked Berra to stay with him in his apartment, but Berra prefers the hotel.

“I mean, the only time we’re really not together is when he’s asleep,” Guidry said. “But you can’t get him out of there because that’s how it’s been. You can’t change him. When he does it one day, it’s going to be that way for the next 1,000 days.”

Berra was 73 when he rejoined the Yankees family, but his rigid need for routines had little to do with his age, said Carmen Berra, his wife of 62 years.

“That’s always been Yogi,” she said. “If the doctor tells him to take a pill at 9 a.m., the bottle is open at 5 of 9.”

That is why Guidry considers his supreme achievement in their dozen years as the Yankees’ odd couple to be the day — he guessed it was five years ago — that he persuaded Berra to try a Cajun culinary staple.

Every spring, Guidry brings from his home near Lafayette, La., about 200 frog legs and a flour mix to fry them. One day, he took a batch to the clubhouse to share with the former pitching coach Mel Stottlemyre, turned to Berra and said, “Try these.”

Berra shook his head, as if Guidry were offering him tofu.

Guidry told him, “You don’t try it, we’re not going out to supper tonight.”

Berra relented, and soon a dinner of frog legs, green beans wrapped in bacon and a sweet potato at Guidry’s apartment — usually timed to a weekend of N.C.A.A. basketball tournament games — became as much a rite of spring as pitchers and catchers.

“He calls me at home this year to remind me about the frogs’ legs — ‘Did you get ’em yet?’ ” Guidry said. “I said, ‘Yogi, it’s freaking January, calm down.’ ”

Though Berra often calls Guidry during the off-season, he has never visited him in Louisiana. “He lives in the swamps, you know,” Berra said.

When Guidry was the Yankees’ pitching coach in 2006 and 2007, Berra could count on him being in spring training. Now Guidry must receive an invitation from the Yankees, which he and Berra anxiously await.

During exhibition games, they sit on the bench together, in the corner by the water cooler, studying the game. “Every once in a while, Yogi will see something about a guy and think that he can help,” Guidry said.

Last season, Berra noticed that pitchers were getting Nick Swisher out with breaking balls and mentioned to Guidry that he thought Swisher might try moving up in the batter’s box to attack the pitch sooner.

“Tell him, not me,” Guidry said.

“Nah, I don’t want to bother him,” Berra said.

After Swisher grounded out, he walked past Guidry and Berra in the dugout. Guidry stood up, pointed at Berra. “He wants to talk to you,” Guidry said. Swisher sat down, heard Berra out and doubled off the wall in his next at-bat. After he scored, he returned to the dugout and parked himself alongside Berra.

“For Yogi, that meant everything,” Guidry said. “Now who knows if that had anything to do with the great season Swisher had? But in Yogi’s mind, he made a friend and he felt, ‘O.K., that justifies me being here,’ even though everybody loves having him here anyway.

“But that’s the thing — for Yogi, spring training is his last hold on baseball,” Guidry added. “When he walks through that door in the clubhouse, sits at the locker, puts on his uniform, talks to everybody, jokes around, watches batting practice, goes back in, has something to eat, and then he and I will go on the bench and watch the game, believe me, I know how much he really looks forward to it.”

Since taking a fall outside his home last summer that required hospitalization and a period of inactivity, Berra has slowed. His voice is softer; his words are sparser.

“I know Carmen feels he’s going to be fine and occupied because I’m around,” Guidry said. “But this year may be harder than the rest because of what happened. I’m just going to have to watch a little more closely to see what he can do.”

The first item on Berra’s agenda, he said, would be to go shopping.

“He buys his roast beef, I buy my bottle of vodka,” Berra said, with a twinkle in his eye. “We get along real good.”

Wednesday, February 23, 2011

Today's Tune: Rolling Stones - Around and Around

B. N. Nathanson, 84, Dies; Changed Sides on Abortion

The New York Times
Published: February 21, 2011

Dr. Bernard N. Nathanson, a campaigner for abortion rights who, after experiencing a change of heart in the 1970s became a prominent opponent of abortion and the on-screen narrator of the anti-abortion film “The Silent Scream,” died on Monday at his home in Manhattan. He was 84.

Dr. Bernard N. Nathanson (AP)

The cause was cancer, said his wife, Christine.

Dr. Nathanson, an obstetrician-gynecologist practicing in Manhattan, helped found the National Association for the Repeal of Abortion Laws (now NARAL Pro-Choice America) in 1969 and served as its medical adviser.

After abortion was legalized in New York in 1970, he became the director of the Center for Reproductive and Sexual Health, which, in his talks as an abortion opponent, he often called “the largest abortion clinic in the Western world.”

In a widely reported 1974 article in The New England Journal of Medicine, “Deeper into Abortion,” Dr. Nathanson described his growing moral and medical qualms about abortion, writing: “I am deeply troubled by my own increasing certainty that I had in fact presided over 60,000 deaths.”[1]

His unease was intensified by the images made available by the new technologies of fetoscopy and ultrasound.

“For the first time, we could really see the human fetus, measure it, observe it, watch it, and indeed bond with it and love it,” he later wrote in “The Hand of God: A Journey from Death to Life by the Abortion Doctor Who Changed His Mind” (Regnery Publishing, 1996). “I began to do that.”

Despite his misgivings, and his conviction that abortion on demand was wrong, he continued to perform abortions for reasons he deemed medically necessary.

“On a gut, emotional level, I still favored abortion,” he told New York magazine in 1987. “It represented all the things we had fought for and won. It seemed eminently more civilized than the carnage that had gone on before.”

But, he added, “it was making less and less sense to me intellectually.”

In addition to the 60,000 abortions performed at the clinic, which he ran from 1970 to 1972, he took responsibility for 5,000 abortions he performed himself, and 10,000 abortions performed by residents under his supervision when he was the chief of obstetrical services at St. Luke’s Hospital in Manhattan from 1972 to 1978.

He did his last procedure in late 1978 or early 1979 on a longtime patient suffering from cancer and soon embarked on a new career lecturing and writing against abortion.

“The Silent Scream,” a 28-minute film produced by Crusade for Life, was released in early 1985. In it, Dr. Nathanson described the stages of fetal development and offered commentary as a sonogram showed, in graphic detail, the abortion of a 12-week-old fetus by the suction method.

“We see the child’s mouth open in a silent scream,” he said, as the ultrasound image, slowed for dramatic impact, showed a fetus seeming to shrink from surgical instruments. “This is the silent scream of a child threatened imminently with extinction.”

The film won the enthusiastic praise of President Ronald Reagan, who showed it at the White House, and was widely distributed by anti-abortion groups like the National Right to Life Committee.

Supporters of abortion rights and many physicians, however, criticized it as misleading and manipulative. Some medical experts argued that a 12-week-old fetus cannot feel pain because its brain and neural pathways are not sufficiently developed, and that what the film showed was a purely involuntary reaction to a stimulus.

Dr. Nathanson accused his critics of rationalizing. Responding to a doctor from Cornell’s medical school on the television program “Nightline,” he said, “If pro-choice advocates think that they’re going to see the fetus happily sliding down the suction tube waving and smiling as it goes by, they’re in for a truly paralyzing shock.”

He later produced another film, “Eclipse of Reason,” about a late-term procedure that critics call partial-birth abortion. In the 1980s he wrote “Aborting America” (Pinnacle Books, 1981), a memoir and social history of the abortion rights movement, and, with Adelle Nathanson, “The Abortion Papers: Inside the Abortion Mentality” (Hawkes Publishing, 1984).

Bernard N. Nathanson, the son of an obstetrician-gynecologist, was born on July 31, 1926, in Manhattan and grew up on the Upper West Side. He earned a bachelor’s degree from Cornell and a medical degree from McGill University in 1949.

After serving as the chief of obstetrics and gynecology for the Northeast Air Command of the Air Force, he established a successful practice in Manhattan.

While interning at Woman’s Hospital in Manhattan, he observed the effects of illegal abortions on the mostly poor black and Hispanic women who came under his care, and he soon became convinced that the laws prohibiting abortion must be changed. In 1967, he met Lawrence Lader, a crusading journalist and the author of “Abortion,” and soon became caught up in Mr. Lader’s plans to organize a movement to agitate for the repeal of laws prohibiting abortions.

Dr. Nathanson earned a degree in bioethics from Vanderbilt University in 1996 and that year was baptized as a Roman Catholic — he described himself up to that time as a Jewish atheist — in a private ceremony at St. Patrick’s Cathedral by Cardinal John J. O’Connor, the archbishop of New York.

His first three marriages ended in divorce. In addition to his fourth wife, Christine, he is survived by a son, Joseph, of New Jersey.

In addressing anti-abortion audiences, Dr. Nathanson often drew gasps by painting himself, in his pro-abortion-rights days, in lurid colors.

“I know every facet of abortion,” he wrote in his memoir, adding, “I helped nurture the creature in its infancy by feeding it great draughts of blood and money; I guided it through its adolescence as it grew fecklessly out of control.”

A version of this article appeared in print on February 22, 2011, on page B14 of the New York edition.



Public Unions Must Go

A death knell for government unions? If only.

By Jonah Goldberg
February 23, 2011 12:00 A.M.

The protesting public-school teachers with fake doctor’s notes swarming the capitol building in Madison, Wis., insist that Gov. Scott Walker is hell-bent on “union busting.” Walker denies that his effort to reform public-sector unions in Wisconsin is anything more than an honest attempt at balancing the state’s books.

I hope the protesters are right. Public unions have been a 50-year mistake.

A crucial distinction has been lost in the debate over Walker’s proposals: Government unions are not the same thing as private-sector unions.

Traditional, private-sector unions were born out of an often-bloody adversarial relationship between labor and management. It’s been said that during World War I, U.S. soldiers had better odds of surviving on the front lines than miners did in West Virginia coal mines. Mine disasters were frequent; hazardous conditions were the norm. In 1907, the Monongah mine explosion claimed the lives of 362 West Virginia miners. Day-to-day life often resembled serfdom, with management controlling vast swaths of the miners’ lives. Before unionization and many New Deal–era reforms, Washington had little power to reform conditions by legislation.

Government unions have no such narrative on their side. Do you recall the Great DMV Cave-in of 1959? How about the travails of second-grade teachers recounted in Upton Sinclair’s famous schoolhouse sequel to The Jungle? No? Don’t feel bad, because no such horror stories exist.

Government workers were making good salaries in 1962 when President Kennedy lifted, by executive order (so much for democracy), the federal ban on government unions. Civil-service regulations and similar laws had guaranteed good working conditions for generations.

The argument for public unionization wasn’t moral, economic, or intellectual. It was rankly political.

Traditional organized labor, the backbone of the Democratic party, was beginning to lose ground. As Daniel DiSalvo wrote in “The Trouble with Public Sector Unions,” in the fall issue of National Affairs, JFK saw how in states such as New York and Wisconsin, where public unions were already in place, local liberal pols benefited politically and financially. He took the idea national.

The plan worked perfectly — too perfectly. Public-union membership skyrocketed, and government-union support for the party of government skyrocketed with it. From 1989 to 2004, AFSCME — the American Federation of State, County, and Municipal Employees — gave nearly $40 million to candidates in federal elections, with 98.5 percent going to Democrats, according to the Center for Responsive Politics.

Why would local government unions give so much in federal elections? Because government workers have an inherent interest in boosting the amount of federal tax dollars their local governments get. Put simply, people in the government business support the party of government. Which is why, as the Manhattan Institute’s Steven Malanga has been chronicling for years, public unions are the country’s foremost advocates for increased taxes at all levels of government.

And this gets to the real insidiousness of government unions. Wisconsin labor officials fairly note that they’ve acceded to many of their governor’s specific demands — that workers contribute to their pensions and health-care costs, for example. But they don’t want to lose the right to collective bargaining.

But that is exactly what they need to lose.

Private-sector unions fight with management over an equitable distribution of profits. Government unions negotiate with friendly politicians over taxpayer money, putting the public interest at odds with union interests, and, as we’ve seen in states such as California and Wisconsin, exploding the cost of government. California’s pension costs soared 2,000 percent in a decade thanks to the unions.

The labor-politician negotiations can’t be fair when the unions can put so much money into campaign spending. Victor Gotbaum, a leader in the New York City chapter of AFSCME, summed up the problem in 1975 when he boasted, “We have the ability, in a sense, to elect our own boss.”

This is why FDR believed that “the process of collective bargaining, as usually understood, cannot be transplanted into the public service,” and why even George Meany, the first head of the AFL-CIO, held that it was “impossible to bargain collectively with the government.”

As it turns out, it’s not impossible; it’s just terribly unwise. It creates a dysfunctional system where for some, growing government becomes its own reward. You can find evidence of this dysfunction everywhere. The Cato Institute’s Michael Tanner notes that federal education spending has risen by 188 percent in real terms since 1970, but we’ve seen no significant improvement in test scores.

The unions and the protesters in Wisconsin see Walker’s reforms as a potential death knell for government unions. My response? If only.

— Jonah Goldberg is editor-at-large of National Review Online and a visiting fellow at the American Enterprise Institute. © 2011 Tribune Media Services, Inc.

Tuesday, February 22, 2011

Who Attacked Lara Logan, and Why?

The answer is obvious — but nobody talks about it.

By Andrew C. McCarthy
February 22, 2011 4:00 A.M.

In this Feb. 11, 2011, photo released by CBS, "60 Minutes" correspondent Lara Logan is shown covering the reaction in Cairo's Tahrir Square the day Egyptian President Hosni Mubarak stepped down. (AP)

For the world’s billion-plus Sunni Muslims, al-Azhar University in Cairo is the center of the theological universe, its faculty and scholars the most authoritative voice on the meaning of Islam. It is not very far from Tahrir Square, ground zero of Egypt’s revolution.

It was in Tahrir Square last Friday that the Muslim Brotherhood began shunting aside other opposition leaders, including Google executive Wael Ghonim. The million Muslims jamming the square hadn’t turned out to hear a good corporate citizen of the Left. In this nation, where a strong majority of the population desires the implementation of sharia, Islam’s legal and political system, the throng turned out to hear and hail Sheikh Yusuf Qaradawi, the Brotherhood’s top adviser — who, with his al-Azhar doctorate in Islamic jurisprudence, is sharia personified.

Tahrir Square is also the place where, in the frenzy after Hosni Mubarak’s fall, CBS news correspondent Lara Logan was seized and subjected to a savage sexual assault by an Egyptian gang. Coverage of the attack has been muted. There have been testimonials to Ms. Logan’s courage, and one anti-American leftist lost his comfortable fellowship at NYU Law School for failing to conceal his glee over the atrocity. We have heard much about the attack, but have heard next to nothing about the attackers. You are just supposed to assume it was a “mob” — the sort of thing that could have happened in any setting where raw emotion erupts, say, Wisconsin’s capitol.

Except it doesn’t happen in Madison. It happens in Egypt. It happened in Indonesia, the world’s most populous Muslim country, in the riots that led to Suharto’s fall — as Sharon Lapkin recounts, human-rights groups interviewed more than 100 women who had been captured and gang raped, including many Chinese women, who were told this was their fate as non-Muslims.[1] It happens in Muslim countries and in the Muslim enclaves of Europe and Australia, perpetrated by Islamic supremacists acting on a sense of entitlement derived from their scriptures, fueled by the rage of their jihad, and enabled by the deafening silence of the media.

As Jihad Watch director Robert Spencer has detailed, al-Azhar University endorses a sharia manual called Umdat al-Salik.[2] It is quite clear on the subject of women who become captives of Muslim forces: “When a child or a woman is taken captive, they become slaves by the fact of capture, and the woman’s previous marriage is immediately annulled.” This is so the woman can then be made a concubine of her captor.

This arrangement is encouraged by the Koran. Sura 4:23–24, for example, forbids Muslim men from consorting with the wives of other Muslims but declares sexual open season on any women these men have enslaved. (“Forbidden to you are . . . married women, except those whom you own as slaves.”) Moreover, Mohammed — whose life Muslims are exhorted by scripture to emulate — rewarded his fighters by distributing as slaves the women of the Jewish Qurayzah tribe after Muslim forces had beheaded their husbands, fathers, and sons. The prophet himself also took one of the captured women, Rayhana, as his concubine. And, as Spencer further notes, Mohammed directed his jihadists that they should not practice coitus interruptus with their slaves — they were encouraged to ravish them, but only in a manner that might produce Muslim offspring.

As I documented in an earlier column, Sheikh Qaradawi contends that women bring sexual abuse on themselves if they fail to conform to Islamist conventions of modest dress.[3] Shahid Mehdi, a top Islamic cleric in Denmark, has explained that women who fail to don a headscarf are asking to be raped, an admonition echoed by Sheikh Faiz Mohammed, a prominent Lebanese cleric, during a lecture he delivered in Australia.

In light of these exhortations, should it be any surprise that the sexual abuse of women is Islam’s silent scandal? In Europe’s expanding Muslim enclaves, it is a terror tactic to extort women — Muslim and non-Muslim — into adopting the hijab and other Islamic sartorial standards. Rape has become so prevalent, and so identifiably a Muslim scourge, that embarrassed and hyper–politically correct Swedish authorities have discouraged police in cities such as heavily Muslim Malmo from collecting data that point to Islam as the common denominator in rape reports.

We can keep ignoring it, we can hope against hope for a reformation (while continuing to pretend that the reformation has already happened). The fact, however, is that, as long as al-Azhar and figures like Qaradawi continue to be the voice of Islam — al-Azhar, the site President Obama chose for his June 2009 address to the Muslim world; Qaradawi, whom the State Department has hailed as an “intelligent and thoughtful voice from the region . . . an important figure that deserves our attention” — Islam will not change, and women will be little more than chattel.

It is a challenge we do not want to acknowledge, because the Islamic scholars have doctrine on their side. The Koran pronounces that “Allah has made men superior to women” (Sura 4:34). As documented in “Sharia Law for Non-Muslims,” a study published by the Center for the Study of Political Islam, Mohammed declared that women are inferior to men in both intelligence and religious devotion (Bukhari hadith 1.6.301), and that women will make up most of those condemned to Hell (Bukhari 7.62.132). Sexual abuse is encouraged not only by hadith but — as I related in discussing the recent case of a teenager flogged to death in Bangladesh[4] — by sharia standards that make rape practically impossible to prove and subject women to a death sentence for adultery or fornication if they come forward with an accusation but cannot prove it.

Islamic scriptures endorse wife-beating (Koran 4:34 again). Female genital mutilation is rampant in the Muslim world and scripturally based. As Caroline Glick notes, the World Health Organization reports that 97 percent of Egyptian women and girls have been subjected to this form of barbarism.[5]

This despicable treatment is fortified by standards that treat women’s testimony as inferior to men’s, permit men to marry up to four women, and deny women the right to hold many public offices — particularly those that involve the construction of Islamic law and issuance of fatwas.

The unmistakable message at the core of sharia is that women, like non-Muslims, are less than fully human. It is a message we continue ignoring at the peril of tomorrow’s Lara Logans, and our own.

— Andrew C. McCarthy, a senior fellow at the National Review Institute, is the author, most recently, of The Grand Jihad: How Islam and the Left Sabotage America.


Dave Duerson legacy: Look out

Chicago Sun-Times
Feb 22, 2011 05:12AM

Gary Fencik called old teammate Shaun Gayle after he heard about Dave Duerson.

The last two safeties from the Bears’ Super Bowl-winning team had been thinking the same thing: safety Todd Bell died at age 46 of a heart attack while driving his car, and now safety Dave Duerson had killed himself with a gunshot wound to the chest.

“I’m watching you,’’ Fencik, 56, said to Gayle.

“I’m watching you,’’ Gayle, 48, replied.

Dave Duerson holds up his driver’s license that shows he is an organ donor at Soldier Field in 1999 during a memorial for Walter Payton. Duerson, who killed himself Thursday, donated his brain for research. (Tannen Maury/Getty Images)

The chaos caused by Duerson’s suicide Thursday night in Miami, and his last wish that his brain be examined for damage possibly caused by blows he received during his long football career, can hardly be calculated.

In short: if you played a long time in the NFL, look out.

Even if you played just a short time in the NFL, or throughout college or in high school, or even if you played pee-wee ball and received concussions or numerous head-rattling blows, beware.

If Duerson’s horrible death at a young age was in part, or wholly, precipitated by chronic traumatic encephalopathy (CTE), the dementia caused by repeated head blows, who knows how far the ugly tentacles of America’s favorite violent game extend.

“I’m worried,’’ Duerson’s old Bears teammate, Emery Moorehead, said Monday on ESPN’s “Outside the Lines.’’

Report after report at the Boston University School of Medicine’s Center for the Study of Traumatic Encephalopathy shows decay in certain deceased players’ brains, a condition that causes depression, irrational behavior, declining intelligence, anger and many other emotional issues, up to and including death.

Two years ago when I looked through a microscope with head researcher Dr. Ann McKee at her lab near Boston, I was horrified at the slides of brain tissue that revealed clear and obvious CTE damage. I am no scientist, but I can tell you that the difference between healthy brain cells and the diseased ones of former football players was the difference between prime steak and rotten meat.

“It’s terrible, isn’t it?’’ McKee had said.

A stunning donation

Duerson’s brain swiftly went to the BU research center, and co-director Chris Nowinski, who also founded the Sports Legacy Institute, a nonprofit organization dedicated to concussion awareness, was stunned by the gift.

Duerson had been the man who awarded Nowinski a National Football Foundation $1,000 academic scholarship in 1997, when Nowinski was a senior at Hersey.

“He presented the award to me at a hotel in downtown Chicago,’’ Nowinski said. “I have a photo of him and me shaking hands. Yes, this shocked me. But we’re always shocked.’’

Nowinski is BU’s front man for requests for brains — indeed, in mid-January he got former Bears quarterback Jim McMahon to pledge his brain to the center — but Duerson’s shot to the chest, apparently intended to leave his brain untouched, is not what Nowinski or the staff had in mind.

“Part of the reason we created the brain registry is so that people can receive help rather than take their lives,’’ he explains. “A disease is not a sign of weakness.’’

What would he have preferred Duerson do?

“Call a suicide hotline,’’ he says. “There are people out there who can help. We’re urgently working on treatment, for ways to heal the brain. We want these players to stick around and be there.’’

When I last talked to Duerson, it was outside Soldier Field several years ago. The personal and business issues that would be dragging him down had been reported, and our talk was tainted with that unspoken awareness. Yet Dave was in seemingly good spirits.

Always smart, outspoken

He was chewing on a cigar. He was smart and driven. He was charming and outspoken. The way he always was. But maybe, I thought later, he was not as centered on his issues as he should have been.

“No, not really, I didn’t see a change,’’ says Fencik when asked if he noticed anything about Duerson’s mental state in recent years. “I think he was trying to restart his career in the fast-food franchising business when he went to Florida. But why Florida, I don’t know.

“He was smart, a lot of pride. But with Dave, I don’t know what to say. All of us, we’re replaying, ‘What could we have done?’ Not as teammates, as friends.’’

Duerson was a big hitter, a banger. Players like that don’t even know when they’ve had a concussion. They don’t rest. They’re like sharks.

Duerson’s fall from grace was a sheer one, a plummet.

“He had it all,’’ says longtime Chicago sports voice Chet Coppock. “Notre Dame, two-time All-America, third-round pick, Super Bowl champ, wife, family, beautiful house in Highland Park. Everything in the world going for him. And then the mother of all spirals.’’

Duerson had been Coppock’s presenter when Coppock was named the 1987 Man of the Year by the Chicago chapter of the Italian American Sports Hall of Fame. “So cordial, so gracious,’’ Coppock said. “Always a first-class act.’’

But when it goes, it goes.

It could have been the weight of bankruptcy and loss that caused Duerson’s depression.

Or it could have been the brain damage, something that supposedly was causing him headaches and vision and word loss.

“We won’t know for a couple of months,’’ Nowinski said. “And then it’s up to the family if it gets made public.’’

Dr. Bennet Omalu, a forensic researcher, is generally credited with being the first man to see CTE in a deceased football player’s brain.

“It takes from five to 20 years to manifest itself,’’ Omalu said of the head trauma. “And then the brain cells start dying. And that’s when the people crash.’’

Duerson certainly did that.

Out of Wisconsin, a lesson in leadership for Obama

By George F. Will
The Washington Post
Tuesday, February 22, 2011

Wisconsin Republican Governor Scott Walker speaks to the media about his proposed budget cuts reducing public employee union bargaining powers and benefits in Madison, Wisconsin February 21, 2011. (Reuters)


Hitherto, when this university town and seat of state government applauded itself as "the Athens of the Midwest," the sobriquet suggested kinship with the cultural glories of ancient Greece. Now, however, Madison resembles contemporary Athens.

This capital has been convulsed by government employees sowing disorder in order to repeal an election. A minority of the minority of Wisconsin residents who work for government (300,000 of them) are resisting changes to benefits that most of Wisconsin's 5.6 million residents resent financing.

Serene at the center of this storm sits Republican Scott Walker, 43, in the governor's mansion library, beneath a portrait of Ronald Reagan. Walker has seen this movie before.

As Milwaukee County executive, he had similar dust-ups with government workers' unions, and when the dust settled, he was resoundingly reelected, twice. If his desire to limit collective bargaining by such unions to salary issues makes him the "Midwest Mussolini" - some protesters did not get the memo about the new civility - other supposed offenses include wanting state employees to contribute 5.8 percent of their pay to their pension plans (most pay less than 1 percent), which would still be less than the average in the private sector. He also wants them to pay 12.6 percent of the cost of their health care premiums - up from about 6 percent but still much less than the private-sector average.

He campaigned on this. Union fliers distributed during the campaign attacked his "5 and 12" plan. He says his brother, a hotel banquet manager, and his sister-in-law, who works at Sears, "would love to have" what he is offering the unions.

For some of Madison's graying baby boomers, these protests are a jolly stroll down memory lane. Tune up the guitars! "This is," Walker says, "very much a '60s mentality."

He does, however, think there is sincerity unleavened by information: Many protesters do not realize that most worker protections - merit hiring; just cause for discipline and termination - are the result not of collective bargaining but of Wisconsin's uniquely strong and century-old civil service law.

Kathryn Schulze wears a message written on tape over her mouth inside the state Capitol Monday, Feb. 21, 2011, in Madison, Wis. Opponents to Governor Scott Walker's bill to eliminate collective bargaining rights for many state workers are taking part in their seventh day of protesting.
(AP Photo/Jeffrey Phelps)

"I am convinced," he says, "this is about money - but not the employees' money." It concerns union dues, which he wants the state to stop collecting for the unions, just as he wants annual votes by state employees on re-certifying the unions. He says many employees pay $500 to $600 annually in union dues - teachers pay up to $1,000. Given a choice, many might prefer to apply this money to health care premiums or retirement plans. And he thinks "eventually" most will say about the dues collectors, "What do we need this for?"

Such unions are government organized as an interest group to lobby itself to do what it always wants to do anyway - grow. These unions use dues extracted from members to elect their members' employers. And governments, not disciplined by the need to make a profit, extract government employees' salaries from taxpayers. Government sits on both sides of the table in cozy "negotiations" with unions.

A few days after President Obama submitted a budget that would increase the federal deficit, he tried to sabotage Wisconsin's progress toward solvency. The Washington Post: "The president's political machine worked in close coordination . . . with state and national union officials to mobilize thousands of protesters to gather in Madison and to plan similar demonstrations in other state capitals." Walker notes that in the 1990s, Wisconsin was a trendsetter regarding school choice and welfare reform. Obama, he thinks, may be worried that Wisconsin might again be a harbinger.

He also thinks Obama's intervention demonstrates why presidents should serve apprenticeships as governors. He says that Obama, in the Illinois Legislature and the U.S. Senate, "was a liberal among liberals," and liberals are his base, and his staff comes from it. Governors, Walker says, get used to considering the interests of broad constituencies.

Walker's calm comportment in this crisis is reminiscent of President Reagan's during his 1981 stand against the illegal strike by air traffic controllers, and Margaret Thatcher's in the 1984 showdown with the miners' union over whether unions or Parliament would govern Britain. Walker, by a fiscal seriousness contrasting with Obama's lack thereof, and Obama, by inciting defenders of the indefensible, have made three things clear:

First, the Democratic Party is the party of government, not only because of its extravagant sense of government's competence and proper scope, but also because the party's base is government employees. Second, government employees have an increasingly adversarial relationship with the governed. Third, Obama's "move to the center" is fictitious.

Monday, February 21, 2011

Sanity and Sanctity: The Ennobling Fantasy of J.R.R. Tolkien Part 1

by Leo Grin
Big Hollywood
February 19, 2011

“Oh f***, not another elf!”

Thus exclaimed English academic Hugo Dyson as his friend J.R.R. Tolkien prepared to read aloud the latest chapter in his then-unpublished “heroic romance” to a small audience of intimates in the pleasantly smoke-filled, gin-scented rooms of C. S. Lewis. Years earlier, during a fateful night of impassioned debate, it was Dyson and Tolkien who together convinced Lewis to forsake unbelief and embrace Christianity, doing such a good job of it that the future author of The Chronicles of Narnia would become the most influential Christian vindicator (I despise the word apologist) of the twentieth century.

Now Dyson was mocking the work of the man who would become the most influential purveyor of Christianized fiction of that same century, and many of Tolkien’s fellow Inklings were of the same mind. It was thus left to Lewis to spur the author of The Hobbit on to greater heights of imagination. “If they won’t write the kind of books we want to read, we shall have to write them ourselves,” he once told Tolkien, and that’s just what they did. Each used the medium known (fondly to some, pejoratively to most) as “fairy stories” to achieve the tang and ring and chime — and through them the thoughts and feelings and beliefs — that they were seeking in literature.

In between his increasingly unpopular Inkling readings, Tolkien wrote during snatches of time carved out of days filled with exhausting academic duties, and frequently only after penning worried, often melancholy letters to his sons off to war. “I sometimes feel appalled,” he admitted in one 1944 missive, “at the thought of the sum total of human misery all over the world at the present moment. . . If anguish were visible, almost the whole of this benighted planet would be enveloped in a dense dark vapour, shrouded from the amazed visions of the heavens! And the products of it all will be mainly evil….” In another he lamented that, “A small knowledge of history depresses one with the sense of the everlasting mass and weight of human iniquity: old, old, dreary, endless repetitive unchanging incurable wickedness. All towns, all villages, all habitations of men — sinks! . . . We do so little that is positive good, even if we negatively avoid what is actively evil. It must be terrible to be a priest!”

And yet, he also possessed shadowed hope: “At the same time one knows that there is always good: much more hidden, much less clearly discerned, seldom breaking out into recognizable, visible beauties of word or deed or face — not even when in fact sanctity, far greater than the visible advertised wickedness, is really there.”

Finding that quiet sanctity amidst clangorous wickedness and despair would become the defining characteristic of The Lord of the Rings.

To his youngest boy, Christopher (b. 1924, and then stationed in South Africa with the Royal Air Force), he regularly sent new chapters of his burgeoning magnum opus, along with news that, when he read each aloud to C. S. Lewis, the author of Mere Christianity and so many other kindly, bracing works would sometimes be moved to tears. “[Lewis] was for long my only audience,” Tolkien wrote later with deep appreciation. “Only from him did I ever get the idea that my ‘stuff’ could be more than a private hobby. But for his interest and unceasing eagerness for more I should never have brought The L. of the R. to a conclusion.”


All about the hills the hosts of Mordor raged. The Captains of the West were foundering in a gathering sea. The sun gleamed red, and under the wings of the Nazgûl the shadows of death fell dark upon the earth. Aragorn stood beneath his banner, silent and stern, as one lost in thought of things long past or far away; but his eyes gleamed like stars that shine the brighter as the night deepens. Upon the hill-top stood Gandalf, and he was white and cold and no shadow fell on him. The onslaught of Mordor broke like a wave on the beleaguered hills, voices roaring like a tide amid the wreck and crash of arms.


“I have not been nourished by English Literature,” Tolkien once wrote, “. . . for the simple reason that I have never found much there in which to rest my heart (or heart and head together). I was brought up in the Classics, and first discovered the sensation of literary pleasure in Homer.”

While browsing through a dusty old college library as a teen, young “Ronald” Tolkien discovered a veritable Ring of Power in the form of a book on Finnish grammar. Learning that language, he would later marvel, was “like discovering a complete wine-cellar filled with bottles of an amazing wine of a kind and flavour never tasted before.” Soon his study of other languages gave him a “sensibility to linguistic pattern which affects me emotionally like colour or music,” and he began penning stories and poems in a genuine, rigorously applied archaic mode, deeming our more garish modern idiom as possessing “an insincerity of thought, a disunion of word and meaning” whenever it was applied to tales of high romance. In Tolkien’s view, you couldn’t drink vintage spirits out of a soda pop can without it fatally marring the taste and experience.

At the same time, many old myths were missing something important. “I do know Celtic things (many in their original languages Irish and Welsh),” he once explained by way of example, “and feel for them a certain distaste: largely for their fundamental unreason. They have bright colour, but are like a broken stained glass window reassembled without design. They are in fact ‘mad’. . . but I don’t believe I am.”

He thus “set myself a task, the arrogance of which I fully recognized and trembled at: being precisely to restore to the English an epic tradition and present them with a mythology of their own.”

By the time the 1930s gave way to the 40s and then the 50s, Tolkien began to quietly despair at ever accomplishing his quest. “I have produced a monster,” he wrote to one correspondent, “an immensely long, complex, rather bitter, and very terrifying romance, quite unfit for children (if fit for anybody).” In 1953, while checking galley-proofs, he admitted that it “seems, I must confess, in print very long-winded in parts.” Over fifteen years after beginning his “arrogant” task, he was left to grimly muse that:

Hardly a word in its 600,000 or more has been unconsidered. And the placing, size, style, and contribution to the whole of all its features, incidents, and chapters has been laboriously pondered. I do not say this in recommendation. It is, I feel, only too likely that I am deluded, lost in a web of vain imaginings of not much value to others — in spite of the fact that a few readers have found it good, on the whole.

The first print run of The Fellowship of the Ring was limited to 4,500 copies. “I am dreading the publication,” he wrote, “for it will be impossible not to mind what is said. I have exposed my heart to be shot at.” Many in the entrenched Ivory Towers of academia and literary criticism did just that, offering up scathing critiques that — all too typical of such people — frequently got whole portions of the book (characters, events, dialogue) embarrassingly wrong. In response Tolkien could only sigh, having told his publisher, “It is written in my life-blood, such as that is, thick or thin; and I can no other.”

But one man above all others was determined to (I use a phrase coined by Robert E. Howard) “not be backward when the spears are splintering.” C. S. Lewis took up his pen like a Crusader and, in a review titled “The Gods Return to Earth,” shouted out to the world a written manifestation of the same tears he had shed while first hearing the story read in manuscript:

[The Fellowship of the Ring] is like lightning from a clear sky. . . To say that in it heroic romance, gorgeous, eloquent, and unashamed, has suddenly returned at a period almost pathological in its anti-romanticism, is inadequate. . . Here are beauties which pierce like swords or burn like cold iron; here is a book that will break your heart. . . .

It is sane and vigilant invention, revealing at point after point the integration of the author’s mind. . . Anguish is, for me, almost the prevailing note. But not, as in the literature most typical of our age, the anguish of abnormal or contorted souls; rather that anguish of those who were happy before a certain darkness came up and will be happy if they live to see it gone. . . . But with the anguish comes also a strange exaltation. . . when we have finished, we return to our own life not relaxed but fortified….

Even now I have left out almost everything — the silvan leafiness, the passions, the high virtues, the remote horizons. Even if I had space I could hardly convey them. And after all the most obvious appeal of the book is perhaps also its deepest: “there was sorrow then too, and gathering dark, but great valour, and great deeds that were not wholly vain.”
Not wholly vain — it is the cool middle point between illusion and disillusionment.

Over the ensuing decades Tolkien’s “long, complex, rather bitter, and very terrifying romance” sold millions of copies, and immeasurably enriched the lives of millions of souls, many of whom felt lost and alone in a mad world seemingly bereft of the sanity and the sanctity that his tale embodied. By the time the indispensable scholar and philologist Tom Shippey published his book J.R.R. Tolkien: Author of the Century (2001, with a title meant to be taken as a comment on Tolkien’s thematic essence as much as his popularity), Tolkien’s remaining detractors resembled nothing so much as the routed forces of Mordor, running “hither and thither mindless. . . wailing back to hide in holes and dark lightless places far from hope.”

And from the Cimmerian gloom of those dark, lightless places, oh how they snarl! “The Lord of the Rings is racist,” wrote John Yatt in the Guardian in 2002:

It is soaked in the logic that race determines behaviour. . . the races that Tolkien has put on the side of evil are then given a rag-bag of non-white characteristics that could have been copied straight from a BNP [British National Party] leaflet. Dark, slant-eyed, swarthy, broad-faced — it’s amazing he doesn’t go the whole hog and give them a natural sense of rhythm. . . .

[LG -- actually, it was the 1980 Rankin/Bass cartoon that did that: "Where there's a whip!" (ssss...crack!) "There's a way!"]

Begun in the 1930s, published in the 1950s, it’s shot through with the preoccupations and prejudices of its time. This is no clash of noble adversaries like the Iliad, no story of our common humanity like the Epic of Gilgamesh. It’s a fake, a forgery, a dodgy copy. Strip away the archaic turns of phrase and you find a set of basic assumptions that are frankly unacceptable in 21st-century Britain.

What gall. The Guardian is a paper, after all, that later praised fantasy author George R. R. Martin for his unflinchingly graphic portrayal of a world in which “the old are tortured and humiliated, women are raped, suffering is everywhere,” for his “unsettling passages of bracingly weird sex” and for his myriad scenes of “inventively unpleasant killing.” It’s a paper that later recommended Joe Abercrombie’s first book for its “delightfully twisted and evil” torturer “who can shorten a man’s arm from fingers to elbow in neat little slices,” and for its young hero possessing “no redeeming qualities whatsoever.” Hey, if that’s all to your taste, fair enough.

But does anyone really expect the rest of us to take that same paper seriously when it draws a courageous line in the sand against a mild-mannered Christian professor and his exquisitely rendered masterpiece? To meekly agree that Tolkien sitting on the shelf next to the books of Messrs. Martin and Abercrombie is “frankly unacceptable” in this evolved new century of tolerance and diversity, lest we be branded racists and throwbacks ourselves? Or to renounce The Lord of the Rings in favor of books overflowing like a backed-up commode with torture-porn, sadism, and nihilism? (Apparently so: the J.R.R. Tolkien Encyclopedia has a meaty entry dedicated to RACISM, CHARGES OF.)

After a lecture, Shippey (whose wonderful Author of the Century book was itself trashed in the Guardian as “a belligerently argued piece of fan-magazine polemic”) was once asked what motivated people to spit such abject nonsense onto Tolkien’s self-professed “life-blood.” A man of eloquence and erudition, he responded with exactly the tone, and at exactly the length, that such diatribes deserve.

“They’re bastards!” he said cheerfully.

Or perhaps we should translate that into words Tolkien would have ruefully recognized, and that adequately express what people with intellectual standards think whenever they open a typical newspaper these days:

“Oh f***, not another liberal critic!”

To be continued. . . .

Death to Apostates: Not a Perversion of Islam, but Islam

The case of Said Musa shows why we cannot graft democracy onto Islamic societies.

By Andrew C. McCarthy
February 19, 2011 4:00 A.M.

On NRO Friday, Paul Marshall lamented the Obama administration’s fecklessness, in particular the president’s appalling silence in the face of the death sentence Said Musa may suffer for the crime of converting to Christianity. This is in Afghanistan, the nation for which our troops are fighting and dying — not to defeat our enemies, but to prop up the Islamic “democracy” we have spent a decade trying to forge at a cost of billions.

This shameful episode (and the certain recurrence of it) perfectly illustrates the folly of Islamic nation-building. The stubborn fact is that we have asked for just these sorts of atrocious outcomes. Ever since 2003, when the thrust of the War On Terror stopped being the defeat of America’s enemies and decisively shifted to nation-building, we have insisted — against history, law, language, and logic — that Islamic culture is perfectly compatible with and hospitable to Western-style democracy. It is not, it never has been, and it never will be.

This is not the first time an apostate in the new American-made Afghanistan has confronted the very real possibility of being put to death by the state. In 2006, a Christian convert named Abdul Rahman was tried for apostasy. The episode prompted a groundswell of international criticism. In the end, Abdul Rahman was whisked out of the country before his execution could be carried out. A fig leaf was placed over the mess: The prospect of execution had been rendered unjust by the (perfectly sane) defendant’s purported mental illness — after all, who in his right mind would convert from Islam? His life was spared, but the Afghans never backed down from their insistence that a Muslim’s renunciation of Islam is a capital offense and that death is the mandated sentence.

They are right. Under the construction of sharia adopted by the Afghan constitution (namely Hanafi, one of Islam’s classical schools of jurisprudence), apostasy is the gravest offense a Muslim can commit. It is considered treason from the Muslim ummah. The penalty for that is death.

This is the dictate of Mohammed himself. One relevant hadith (from the authoritative Bukhari collection, No. 9.83.17) quotes the prophet as follows: “A Muslim . . . may not be killed except for three reasons: as punishment for murder, for adultery, or for apostasy.” It is true that the hadith says “may,” not “must,” and there is in fact some squabbling among sharia scholars about whether ostracism could be a sufficient sentence, at least if the apostasy is kept secret. Alas, the “may” hadith is not the prophet’s only directive on the matter. There is also No. 9.84.57: “Whoever changes his Islamic religion, then kill him.” That is fairly clear, wouldn’t you say? And as a result, mainstream Islamic scholarship holds that apostasy, certainly once it is publicly revealed, warrants the death penalty.

Having hailed the Afghan constitution as the start of a democratic tsunami, the startled Bush administration made all the predictable arguments against Abdul Rahman’s apostasy prosecution. Diplomats and nation-building enthusiasts pointed in panic at the vague, lofty language injected into the Afghan constitution to obscure Islamic law’s harsh reality — spoons full of sugar that had helped the sharia go down. The constitution assures religious freedom, Secretary of State Condoleezza Rice maintained. It cites the Universal Declaration of Human Rights and even specifies that non-Muslims are free to perform their religious rites.

Read the fine print. It actually qualifies that all purported guarantees of personal and religious liberty are subject to Islamic law and Afghanistan’s commitment to being an Islamic state. We were supposed to celebrate this, just as the State Department did, because Islam is the “religion of peace” whose principles are just like ours — that’s why it was so ready for democracy.

It wasn’t so. Sharia is very different from Western law, and it couldn’t care less what the Universal Declaration of Human Rights has to say on the matter of apostasy. Nor do the authoritative scholars at al-Azhar University in Cairo give a hoot that their straightforward interpretation of sharia’s apostasy principles upsets would-be Muslim reformers like Zuhdi Jasser. We may look at Dr. Jasser as a hero — I do — but at al-Azhar, the sharia scholars would point out that he is merely a doctor of medicine, not of Islamic jurisprudence.

The constitution that the State Department bragged about helping the new Afghan “democracy” draft established Islam as the state religion and installed sharia as a principal source of law. That constitution therefore fully supports the state killing of apostates. Case closed.

The purpose of real democracy, meaning Western republican democracy, is to promote individual liberty, the engine of human prosperity. No nation that establishes a state religion, installs its totalitarian legal code, and hence denies its citizens freedom of conscience, can ever be a democracy — no matter how many “free” elections it holds. Afghanistan is not a democracy. It is an Islamic sharia state.

To grasp this, one need only read the first three articles of its constitution [1]:

1. Afghanistan is an Islamic Republic, independent, unitary, and indivisible state.

2. The religion of the state of the Islamic Republic of Afghanistan is the sacred religion of Islam. Followers of other religions are free to exercise their faith and perform their religious rites within the limits of the provisions of law.

3. In Afghanistan, no law can be contrary to the beliefs and provisions of the sacred religion of Islam.

Need to hear more? The articles creating the Afghan judiciary make higher education in Islamic jurisprudence a sufficient qualification to sit on the Afghan Supreme Court. Judges are expressly required to take an oath, “In the name of Allah, the Merciful and Compassionate,” to “support justice and righteousness in accord with the provisions of the sacred religion of Islam.” When there is no provision of law that seems to control a controversy, Article 130 directs that decisions be in accordance with “the Hanafi jurisprudence” of sharia.

Moreover, consistent with the Muslim Brotherhood’s blueprint for society (highly influential in Sunni Islamic countries and consonant with the transnational-progressive bent of the State Department), the constitution obliges the Afghan government to “create a prosperous and progressive society based on social justice” (which, naturally, includes free universal health care). It commands that the Afghan flag be inscribed, “There is no God but Allah and Mohammed is His prophet, and Allah is Great [i.e., Allahu Akbar].” The state is instructed to “devise and implement a unified educational curriculum based on the provisions of the sacred religion of Islam” and to “develop the curriculum of religious subjects on the basis of the Islamic sects existing in Afghanistan.” In addition, the constitution requires the Afghan government to ensure that the family, “a fundamental unit of society,” is supported in the upbringing of children by “the elimination of traditions contrary to the principles of the sacred religion of Islam.” Those contrary traditions include Western Judeo-Christian principles.

Was that what you figured we were doing when you heard we were “promoting democracy”? Is that a mission you would have agreed to commit our armed forces to accomplish? Yet, that’s what we’re fighting for. The War On Terror hasn’t been about 9/11 for a very long time. You may think our troops are in Afghanistan to defeat al-Qaeda and the Taliban — that’s what you’re told every time somebody has the temerity to suggest that we should leave. Our commanders, however, have acknowledged that destroying the enemy is not our objective. In fact, Gen. Stanley McChrystal, the former top U.S. commander, said what is happening in Afghanistan is not even our war. “This conflict and country are [theirs] to win,” he wrote, “not mine.”[2]

It’s not our war, nor is it something those running it contemplate winning. “We are not trying to win this militarily,” the late Richard Holbrooke, President Obama’s special envoy to Afghanistan, told CNN’s Fareed Zakaria last fall.[3] Indeed, the administration had concluded — upon what Ambassador Holbrooke described as consultation with our military commanders — that the war could not be won “militarily.” So the goal now is not to defeat the Taliban but to entice them into taking a seat at the table — in the vain hope that if they buy into the political process they will refrain from confederating with the likes of al-Qaeda.

Afghanistan is not an American war anymore. It’s a political experiment: Can we lay the foundation for Islamic social justice, hang a “democracy” label on it, and convince Americans that we’ve won, that all the blood and treasure have been worth it? The same thing, by the way, has been done in Iraq. Ever since the Iraqis adopted their American-brokered constitution, Christians have left the country in droves [4], and homosexuals, similarly, have been persecuted.[5] And the Iraqis are so grateful for all the American lives and “investment” sacrificed on their behalf that, just this week, the capital city of Baghdad demanded that the U.S. apologize and fork up another $1 billion in reparations. For what? Why, for “the ugly and destructive way” the American army’s Humvees and fortifications have damaged the city’s aesthetics and infrastructure. Yes, a brief time-out from the usual serenity of life in a sharia state to chastise Americans for their “deliberate ignorance and carelessness about the simplest forms of public taste.”[6]

In 2006, promoters of Islamic democracy — having dreamed that this chimera was not merely plausible but a boon for U.S. security against terrorists — were stunned upon awakening to the reality of “democratic” Afghanistan’s intention to execute Abdul Rahman for apostasy. This was an “affront to civilization,” we at NR said at the time.[7] As Samuel Huntington explained, however, there are two senses of “civilization.” One assumes that all human beings, all cultures, are essentially the same and share the same concept of the higher form of life — that there is only one real civilization. The other holds that different cultures have very different ways of looking at the world — that there are several different civilizations, and what is an affront to one may be a convention to another.

The underlying premise of the democracy project is the former sense of “civilization.” As I argued at the time [8], the real world is the latter. And now, five years removed from the Abdul Rahman case, five more years of intensive, costly American entanglement with Afghanistan, Paul Marshall gives us the harrowing plight of Said Musa. When he told the Afghan court he was a Christian man, no Afghan defense lawyer would have anything to do with him — except the one who spat on him. He was thrown in jail as an apostate among 400 Afghan Muslims, and he has since been beaten, mocked, deprived of sleep, derisively referred to as “Jesus Christ,” and sexually abused. And just as no Afghan lawyer was willing to aid an apostate, the Afghan sharia state declined to aid him — refusing him access to foreign counsel. We think of this as an affront to civilization. They, on the other hand, think they have their own civilization, and that our civilization and Said Musa are affronts to it.

The affront here is our own betrayal of our own principles. The Islamic democracy project is not democratizing the Muslim world. It is degrading individual liberty by masquerading sharia, in its most draconian form, as democracy. The only worthy reason for dispatching our young men and women in uniform to Islamic countries is to destroy America’s enemies. Our armed forces are not agents of Islamic social justice, and stabilizing a sharia state so its children can learn to hate the West as much as their parents do is not a mission the American people would ever have endorsed. It is past time to end this failed experiment.

— Andrew C. McCarthy, a senior fellow at the National Review Institute, is the author, most recently, of The Grand Jihad: How Islam and the Left Sabotage America.