Thursday, February 18, 2016

‘Science’: Christian Bible More Bloodthirsty Than Quran


A new “data-based” study published on Yahoo News, Huffington Post, and elsewhere purports to have proven that the Bible -- including the New Testament -- is more violent than the Quran.
From Tom McKay’s article about the study: “Fifty-eight percent of Americans have an unfavorable opinion of Islam” thanks to a “laundry list of misinformation about the faith's holy text, the Quran.” He continues:
But a recent project by data analyst and research marketer Tom Anderson turns one common misconception on its head: that the Quran is more consumed by blood thirst than the Christian Bible. … Of the three books [Old Testament, New Testament, Quran], the project found, the Old Testament is the most violent, with approximately 5.3% of the text referring to “destruction and killing” -- the Quran clocked in at just 2.1%, with the New Testament slightly higher at 2.8%.
According to Anderson, the findings challenge the popular notion among Westerners that Muslims subscribe to a particularly violent faith. Indeed, he concluded, “of the three texts, the content in the Old Testament appears to be the most violent.”
So this study suggests what Islam’s apologists have long claimed: that the Bible contains more violence and bloodshed than the Quran. That said, the intelligence and sincerity of anyone -- including supposed scholars -- citing this finding as proof that the Quran cannot incite violence more than the Bible is doubtful. For starters, this argument fundamentally ignores the context in which violence appears in all three scriptures.

Comparing violence in the Bible -- both Old and New Testaments -- with violence in the Quran conflates history with doctrine. The majority of violence in the Bible is recorded as history; a description of events. Conversely, the overwhelming majority of violence in the Quran is doctrinally significant. The Quran uses open-ended language to call on believers to commit acts of violence against non-Muslims. (See “Are Judaism and Christianity as Violent as Islam?” for my most comprehensive and documented treatment of this tired apologia.)

This study also fails to consider who is behind the violence. It simply appears to count the number of times violent language appears. As a result, New Testament descriptions of Christians -- including Christ -- being persecuted and killed are treated as equally inciting to Christians as Allah’s commandments for Muslims to “slay the idolaters wherever you find them -- seize them, besiege them, and make ready to ambush them!” (Quran 9:5). This study sees no difference between the martyrdom of Stephen (Acts 7-8) and Allah’s words: “I will cast terror into the hearts of those who disbelieved, so strike [them] upon the necks and strike from them every fingertip” (Quran 8:12).

The claim behind this study -- that “fifty-eight percent of Americans have an unfavorable opinion of Islam” apparently because of “misinformation about the faith’s holy text, the Quran” -- is a strawman argument. “Islamophobia” is based less on what Americans think about the Quran and more on the violence, terrorism, and atrocities they see and hear Muslims commit in the name of Islam on a daily basis. Ironically, the whole point of appealing to a strawman argument is that the argument itself is ironclad, even if it doesn’t address the real issue. As seen here, however, even the straw argument itself -- that the Bible has more potential to incite violence than the Quran -- is full of holes.

This is to say nothing of the fact that Islamic teaching is hardly limited to the Quran. Volumes of canonical (sahih) Hadith (words and deeds of Muhammad) equally inform Muslim actions. As one Muslim cleric put it:
Much of Islam will remain mere abstract concepts without Hadith. We would never know how to pray, fast, pay zakah, or make pilgrimage without the illustration found in Hadith.
As it happens, calls to anti-infidel violence in the Hadith outnumber the Quran’s.
There are other problems with this study. For example, it doesn’t seem to take into consideration that the Bible is roughly ten times longer than the Quran. Due to the study's many shortcomings, even Anderson admits that his “analysis is superficial and the findings are by no means intended to be conclusive.” So why are several media outlets highlighting the conclusion of a study which readily admits it does not prove what its champions claim?

Apparently the politically correct conclusion -- that Islam cannot be any worse than Judaism and Christianity -- is all that matters here, gaping holes in methodology be damned.

Hurray for Tim Cook

By Kevin D. Williamson — February 17, 2016

Rough Road Sign
(Erin Willett)

A few days ago, I was bumping along a tooth-rattlingly rough stretch of interstate when I saw a sign: Rough Road. No kidding, Sparky. A mile or two of shake-rattle-’n’-roll later, another sign: Rough Road. You don’t say. Rocka-rocka-rocka-thumpa-thumpa-thump. Rough Road. Sign after sign after sign: Rough Road.

You know what they could have done with all the time and energy and resources put into erecting those Rough Road signs? Maybe — here’s a crazy notion — put some new blacktop on that sorry lunar hellscape that Uncle Stupid calls I-10. But that’s government for you: “Not only do we refuse to do our job and maintain these roads despite a $40 billion a year budget for doing just that, we’re going to pay a gang of union-goon schmucks $40 an hour to erect signs advertising the length and breadth of the shaft we are giving you.”

With that in mind: Hurray for Tim Cook.

Tim Cook is the CEO of Apple, and Uncle Stupid is leaning on his company just at the moment, demanding that the firm create some specialized iPhone code — call it “FBiOS” — that will allow it to crack the mobile phone used by one of the San Bernardino terrorists. Which is to say, with all of the power and money and other resources we put into national security, law enforcement, and counterterrorism, the Men in Black cannot defeat some yahoo’s iPhone PIN.
This is what happens when you apply the Rough Road–sign model to fighting the war on terror. Yes, of course we’d like to have some prosecutions and convictions in the San Bernardino case, inasmuch as it is clear that the jihadists there did not act without some assistance. And, yes, there probably is some useful information to be had from that iPhone. But there is something deeply unseemly about a gigantic and gigantically powerful national-security apparatus’s being stymied by ordinary consumer electronics and then putting a gun to the head of Apple executives and demanding that they do Uncle Stupid’s job for him.

You know what would be better than prosecuting those who helped the San Bernardino jihadists? Stopping them, i.e., for the Men in Black to do their goddamned jobs. An arranged marriage to a Pakistani woman who spent years doing . . . something . . . in Saudi Arabia? Those two murderous misfits had more red flags on them than Bernie Sanders’s front yard on May Day, and the best minds in American law enforcement and intelligence did precisely squat to stop their rampage. Having failed to do its job, the federal government now seeks even more power — the power to compel Apple to write code rendering the security measures in its products useless — as a reward for its failure.

There’s an argument that we shouldn’t judge our counterterrorism efforts by their failures but by their successes — all the attacks that have been prevented that we don’t know about. There is a little something to that, but not very much. The Transportation Security Administration, for example, has perpetrated a great deal of thievery and contraband trafficking, but Der Gropenführer does not seem to have prevented a single act of terrorism in all its history. We spend hundreds of billions of dollars a year on intelligence, counterterrorism, and law enforcement. In some cases, we have given these guys a license to kill American citizens. With that kind of power and those kinds of resources, it is entirely appropriate that they be judged by their failures, of which San Bernardino is a spectacular example.

From the IRS to the ATF to the DEA to Hillary Rodham Clinton’s super-secret toilet e-mail server, the federal government has shown, time and again, that it cannot be trusted with any combination of power and sensitive information. Its usual range of official motion traces an arc from indifference through incompetence to malice.

Where the federal government imagines that it gets the power to order a private firm to write software to do its incompetent minions’ jobs for them is anybody’s guess. Tim Cook and Apple are right to raise the corporate middle finger to this nonsense. Cook says that the software the FBI demands is “too dangerous to create” given the risk that it could fall into “the wrong hands.”

Perhaps he is being polite, but the fact is that the FBI is the wrong hands. Its agents have leaked secret information in live investigations to their girlfriends, engaged in various and sundry episodes of extortion and blackmail, and used federal resources to check up on their favorite strippers. (Nobody got fired, of course. Nobody ever gets fired.) And of course, as in a great many federal offices, FBI supervisors spend a great deal of time watching pornography on their office computers and masturbating. That earned one supervisor a 35-day suspension. Is that how they do it in your office?

The more you think about what the hell it is the federal government actually does, the less important it seems. About 80 percent of its activity, as measured by cash flows, consists of simply transferring money from one group of Americans to others in the form of Social Security checks and subsidized medical benefits. Its senior leaders steadfastly refuse to do their jobs: The border goes unsecured, visa controls remain nonexistent in spite of a specific legal requirement that the government address this problem, the roads and other infrastructure under the federal umbrella of responsibility are a mess in spite of the trillions of dollars thrown at them in recent years, etc. And the federal government’s answer is: “Why won’t those mean meanies at Apple do our jobs for us? So what if that means rendering many of their products entirely worthless and betraying the trust of millions of customers?”

Maybe your experience is different. In my experience, what government actually does at every level is hassle me and take my money while failing to do the basic things that we constituted it to do. The borders are a joke, the roads crumbling, the schools a sty of corruption and miseducation, and the police, as a wise man once put it, are a janitorial service that takes your body away after the deed has been done. Perhaps it is appropriate that our next presidential election may very well pit a reality-television grotesque against an antediluvian Red from Brooklyn. American politics consists of an increasingly bitter and hate-fueled fight over an increasingly irrelevant institution. If Apple disappeared tomorrow, the world would notice. You can’t say the same about the TSA or the Small Business Administration, and it is not entirely clear that you could say much better about the FBI.

Rough Road? Indeed, it promises to be.

— Kevin D. Williamson is National Review’s roving correspondent.

Wednesday, February 17, 2016

Machete Jihad in Ohio

How the police and media have whitewashed the truth about the Columbus attack.

February 17, 2016

The FBI is investigating Mohamed Barry, who took a machete to patrons at an Ohio deli. He was shot and killed by police. Screen shot from NBC4

Last Thursday there was yet another jihad terror attack on American soil: a Muslim named Mohammad Barry entered the Nazareth Restaurant & Deli in Columbus, Ohio, screamed “Allahu akbar” and began slashing at patrons with a machete. Immediately, the denial and obfuscation machine cranked into high gear. Columbus police Sgt. Rich Weiner said: “Right now there’s nothing that leads us to believe that this is anything but a random attack.” The Washington Post wondered: “Random act or Islamist terrorism?” 
It’s actually abundantly clear that this was not a random attack, but an instance of “Islamist terrorism,” and not just because Barry screamed “Allahu akbar” (which was not reported in mainstream media accounts about the attack). Barry was on the FBI’s radar for his jihadi views, although they dropped any investigation of him back in 2012 (why?). An initial report said that Barry “walked into the restaurant, had a conversation with an employee and then left,” only to return later with his machete. The owner of the restaurant, Hany Baransi, explained what the conversation was about: Barry “came in and asked where I was from.” An employee of the restaurant (Baransi was not in the restaurant at the time) told him that Baransi was from Israel.
Hany Baransi is indeed a Christian from Israel. A large Israel flag along with an American flag greets restaurant patrons at its entrance. “I am a very outspoken Israeli,” said Baransi, “and I have an Israeli flag in my restaurant. When people [from the Arab community] ask me where I am from, I tell them I am Israeli, I am an Israeli Christian Arab, it’s not like I am Palestinian, and then they start arguing and fighting with me.”
To sum up: a Muslim who was known to the FBI as having jihadi sentiments entered a restaurant owned by a proud and outspoken Israeli and prominently displaying an Israeli flag. He asked where the owner was from and then, having found out, left the restaurant and returned with a machete, which he used to slash patrons after screaming “Allahu akbar.” Yes, that’s a real head-scratcher. There is just no telling what could be going on here. Never mind that there is a global jihad being waged by Muslim groups all over the world who hate Israel, hate Christians, and tend to scream “Allahu akbar” as they commit murder. Sgt. Rich Weiner is determined not to recognize that. The Washington Post is determined not to recognize that. So to them, this can be nothing but a random act.
AP Photo/Kantele Franko
Hany Baransi knows better: “Is it a random attack? Yes, but it wasn’t a random attack like you’re walking in the street and there are 10 shops and you pick one,” he said. “It was a random attack [insofar] that I was one of the Israelis [picked] between all of the Israelis that are around here. It was a terrorist attack.”
The law enforcement and media refusal to face this obvious reality is entirely unsurprising: after every jihad terror attack now, it seems as if the first priority of the media and every official who says a thing about it is to make sure that no one gets a negative view of Islam because of the attack. It must be very nice for Islam to have so many eager PR agents, but it does create a problem for the rest of us: what if a jihad terror attack really does have something to do with Islam? What if the fact that Mohammad Barry screamed “Allahu akbar” is entirely relevant to the case as a revelation of his motives and goals? And aren’t those motives and goals useful to know so that authorities can take realistic steps to prevent this kind of thing from happening again?
I have said all this before and will no doubt say it again, unless the jihad gets around to me first. It is worthwhile to keep repeating it, but both law enforcement officials and the mainstream media appear to be absolutely unshakeable in their resolve to ignore, deny and obfuscate the true root causes of Islamic jihad. 
And so we are on our own. Hany Baransi shows us the attitude we should have in such a situation: “Actually I have another flag, and I am going to get a bigger flag, and I am going to get a Star of David necklace and put it on my chest, and I am going to get a tattoo. Honest to God, I am not kidding. They don’t scare me. We are Israelis. We are Israelis. We are resilient, we fight back.”
This is what we need: defiance and courage in the face of evil. Instead, the whole West is surrendering, and calling their surrender “respect.”

Today's Tune: Brian Fallon - A Wonderful Life


January 26, 2016
In the summer of 2014, when Noisey met up with Brian Fallon, the Gaslight Anthem singer was at the tail-end of an intense period of self-doubt. Just shy of the release of his band’s divisive fifth album, Get Hurt, Fallon was at last beginning to feel self-assured, confident even, in his artistic output. As he explained to us, the album was an undertaking he needed to endure so as to self-cleanse. Recently divorced and feeling as if he perhaps had become a one-trick pony, Fallon had been searching for meaning. On the album, he felt he had perhaps found it. “You have to go through it to get to the other side, man,” he said at the time, speaking of his personal demons. “You really do. I had to prove to myself that I was capable of more than just this one thing.”
Just shy of 18 months later, Fallon is once again at a crossroads: Gaslight Anthem is on hiatus following the critically panned Get Hurt; and, more notably, Fallon is a few weeks shy of releasing his debut solo album. Painkillers, due on March 11 and produced by Butch Walker (Weezer, Taylor Swift), finds the New Jersey native reconciling his lifelong love of tried-and-true singer-songwriters and stripped-down, acoustic guitar-anchored folk music—something hardcore Gaslight fans might not be accustomed to. While he’s previously spread his wings with side projects including the more brooding Horrible Crowes (with guitarist Ian Perkins) and the Americana outfit Molly and the Zombies, Fallon initially felt a strong hesitation to embark on a solo endeavor. “I was always dreading the concept in my mind,” he says, comparing a solo musician to a circus act flailing his arms in a desperate plea for attention. “So I had to convince myself.”
Once in the studio, though, Fallon says he felt a weight lifted off his shoulders. Namely, he had only himself to please. “That’s the awesome part,” he says. “At the end of the day, you can look at a record and say, ‘I like this and I would listen to it.’ And if no one else does, at least you have that.” Such a need to live a life sans restrictions is explicitly referenced in the album’s opener and first single, “A Wonderful Life”: “I want a life on fire, gone mad with desire/ I don’t want to survive, I want a wonderful life.” Later, on the fingerpicked “Nobody Wins,” Fallon moans, “I lost most of myself pleasing everyone/ I had to learn how to begin again.”
Despite what may appear to be a rough-and-tumble few years for Fallon, when Noisey rang up the singer on a recent morning, he was cheery, self-deprecating, and brutally honest in assessing his new album and career as a whole, how the critical reception to Get Hurt freed him to write Painkillers, and how he’s recently learned to better understand Bob Dylan’s perspective on songwriting.
Noisey: You’ve been writing songs outside Gaslight Anthem for some time now, but when did this solo project become a reality?
Brian Fallon: 
I had the idea maybe two years ago to do it. I started writing songs and just had this extra batch of songs that didn’t quite feel like they fit with Gaslight. So I just put them to the side for a minute. I did that Molly and the Zombies project and we got invited to do this Home for the Holidays show that the Bouncing Souls do. They asked me to just do it myself and I was like, “Well, I don’t really have anything.” I had those couple songs so I thought, “Let me just put together a band and see what people think of it.”
And then some time passed by and some of those songs still hadn’t been recorded? 
When Gaslight picked up again, I kind of put everything to the side. Fast forward another year and a half and Gaslight were going to go on a break. I thought, “Better pull out those songs and see what’s going on with them!”
Has the idea of releasing a solo album always been in the back of your mind?
It was something that I always had mixed feelings about. The term “solo record” has always been cringeworthy to me. You know what I’m saying? ‘Cause in a way, you don’t want to be the guy at the circus like, “Look what I can do! I know this trick that I didn’t show you before.” It never ends up being a good trick. It’s always the same trick just with different stuff. Once I knew there was going to be time off from Gaslight, I knew I couldn’t sit still. I had to do something. I thought, “Well, what about having another band or continuing Molly and the Zombies or maybe continuing the Horrible Crowes?” And then a friend of mine said to me, “You have all these other bands but then they’re gone when you go back to Gaslight. Why don’t you just do it under your name because then you can just do whatever you want?” I told her my reservations but then she brought up Nick Cave and the Birthday Party and how it can be OK. I had to warm up to it by looking at other people who had done that who I didn’t feel were doing themselves a disservice.
Who else comes to mind as an honorable band leader gone solo?
I always felt Conor Oberst was a good example because he had a bunch of projects but they all sounded different. You always could tell it was Conor because of his writing, but it was different. As for me, I was thinking my excuse was: There’s this singer-songwriter thing that I always really wanted to get to and then there’s the band thing that’s the singer-songwriter mixed with punk and whatever else. I wondered whether that purified singer-songwriter-type record was worth me going out on my own. And after I weighed it out, I thought it was.
You mentioned both Molly and the Zombies and Horrible Crowes. Is it important in your mind to compartmentalize each of your projects and keep them completely separate from each other?
Well, not all of them. I think it’s Gaslight and then everything else. That’s important for me. When the Horrible Crowes was going on, it was something I felt like I needed to do. But since I had a partner, I didn’t really feel like it was a solo record. Maybe that was just me excusing it. [Laughs] I can’t kid myself, though: I went and did a solo record so I can’t sit here and say that’s not cool. I do think it’s cool. But then there’s still the other side of me that wants to be like, no, that’s not cool! It’s just that thing that everybody has, that back and forth. Any time as a musician you have any borderline musical moral ground, you struggle internally. You see it with some of the biggest acts: Nirvana were on MTV yet they were constantly making fun of MTV. You can’t call anybody out on it though because we all do the same exact thing.
I think it gets back to the simple fact that even though musicians want to keep their audience in mind, they have to follow their intuition. 
I think you always write for yourself and then you dress it up for your audience a little bit. Everybody thinks about it. If anyone says they don’t think about their audience when they’re writing, either they’re totally on another plane, a higher plane than I know of, or they’re lying. [Laughs] Because they do! You don’t step out on a stage to be like, “Well, I hope this just bombs! I hope everything goes horribly wrong and I end up in the poorhouse and bankrupt.”
You were working with several other musicians when recording this album but at the end of the day, particularly when it came to decisions regarding the direction of the album, everything fell on your shoulders. Was that an exciting prospect? Or perhaps a frightening one?
I did think it was weird at first. Gaslight is more democratic than you would think. We have this rule where we don’t do anything that someone hates. So if someone really hates something, we just don’t do it. The card is not pulled often but when it is, it’s like, OK, we’re not doing that. With this album everything was down to me essentially: “What do you want the guitars to sound like?” “How do you want to layer everything?” Even [producer] Butch [Walker] was asking me questions and a lot of time I would be like, “Well, what do you think, man? You’re the guy that’s doing it. You’ve got the good sound. What does so-and-so use?” I never had that decision-making process before and the freedom to choose exactly how I wanted something to sound. It was nice. I’m not gonna lie… I liked it. It was cool.
There’s no one else to please.
Yeah. That’s the awesome part. It really helps when there’s no compromise at the end of the day and you can go “I’m happy with this.” And then when you put it out, even if somebody says something weird about it you can go “Well, it’s what I like.”
The album definitely takes you in new musical directions. There are elements of American folk music, early rock ’n’ roll, and a heavy helping of acoustic guitar.
I think this has always been on the plate. Even since [Gaslight Anthem’s 2007 debut album] Sink or Swim, there’s always been an acoustic song or something like that on the records. Inside my head, it would be always there: This is the music you really, really like. When you’re alone, you don’t really listen to other stuff. You listen to this kind of stuff.
It was one of those artistic things where you just have to do it at some point. I saw the opportunity and thought it might as well be now because I don’t have anything else going on. It was pretty natural to sit down and go: “Never mind anything except for the song.” If something could stand on its own with just an acoustic guitar, if I could strip it back and play it by myself, then it’s going to be good. And then we’ll dress it up however it needs to be later. That’s something that’s always been a little bit dear to me. You have to learn, I think, to do that. And I don’t even know if I’ve learned enough to really get it right quite yet.
Does playing a song that bare make an artist more vulnerable?
I think some people really get it right and some people are just woodshedding. They’re in the shop but they’re not really building anything. There’s a fine line there and it’s hard to judge yourself as to when you’re truly ready. My favorite song in the world is “Don’t Think Twice, It’s All Right” by Bob Dylan. I read an interview one time where he said, “I don’t think I’m old enough to sing that song.” It struck me that he had already recorded that song when he said that. I thought, well, then why did you say that? But as I grew older, I began to get it. He wrote a song that was beyond his capability but he still tried it anyway.
That reminds me of a song on your album, “Honey Magnolia,” one with heavy country folk influences and pedal steel. I imagine you couldn’t—or perhaps wouldn’t—have written that as a younger musician. 
I don’t think I would have been able to orchestrate that together with the harmonies and everything. The funny thing with a song like “Honey Magnolia” is the demo sounds pretty much the same way the record sounds. From start to finish, it was already conceptualized from the beginning. That’s something I learned over time. The song is from a personal place but it’s sung from a woman’s point of view. It’s an odd thing. That’s something they did in folk music. They were always singing from other people’s voices and seeing things from other people’s perspective. You have to delicately do that. You have to create the world and drop little hints. I had to work at that to get it to sound simple but make sure the meaning came across.
In terms of pushing your boundaries on Painkillers, I’m curious how Gaslight’s most recent album, the sonically adventurous Get Hurt, impacted you. Did it encourage you to continue to throw caution to the wind?
It’s weird. I felt the complete opposite. Some people were so cool about Get Hurt and I found a lot of kids at shows that really embraced it. But I feel like in another way, I got completely torn to shreds on that one. For the first time there were some things I definitely got skewered on. I was like “Whoa!” Somebody would send me something and I would be like “Why would you send that to me? That’s awful! Why did I read that?” I had to put down the computer for a second. So this was more of a reaction in the opposite way. I didn’t feel like Get Hurt freed me to do anything. Matter of fact, I felt like Get Hurt bound me a little bit. I almost felt like I got smacked for doing it.
The negative reviews stung?
It set me back a bit. I didn’t know what to do next. I thought, “I better really consider what I’m doing here and consider my options.” I had to do something that I believed in 100 percent. Because if I didn’t believe in it, and I got spanked again, it would have crushed me. It felt like a lot of the criticism wasn’t necessarily about the record, but about my sincerity as a person, which I found extremely weird. Why are these strangers criticizing me as a person when I’ve never met them?
People tend to assume musicians are immune to criticism. But at the end of the day, creatives are some of the most sensitive people. 
Exactly! Or else we wouldn’t be doing what we do. Artists, writers, musicians, any people that are in the arts, as soon as you walk into that world, they should hand you two cards: “Welcome to the Hypocrite Club” and “Welcome to the Baby Club.” Because emotionally, we’re all babies. Really, I’m a little bit shy, which is totally stupid because then why do I put myself out there on a stage? I don’t know.
It seems though that fans have been reacting positively to your new solo material when you’ve been out playing it live in recent weeks.
I’m impressed with most crowds with their patience level. You’re playing ten or 12 new songs that they haven’t heard before and then you’re mixing them in with songs they do know from Horrible Crowes and Molly and the Zombies. But essentially, they’re coming out as true music fans and just listening. It’s been really good. I expected every night for people to be like, “Play that one song!” It made me realize that, aside from the internet, people are cool.
Dan Hyman also wants a wonderful life. Follow him on Twitter.

Solo album lets The Gaslight Anthem's Brian Fallon start over

The Gaslight Anthem singer Brian Fallon has revisited his past for a new album as a solo artist.

By Vanessa Franko
February 11, 2016
Brian Fallon, singer of the Gaslight Anthem, has a new solo album, “Painkillers,” due in March. He plays tonight at Pappy & Harriet's Pioneertown Palace. (Danny Clinch)

With the Gaslight Anthem, Brian Fallon sold out venues around the world and shared the stage with icons including Eddie Vedder and Bruce Springsteen.
But by the end of 2014, he wasn’t happy about it.
“I was just mad at the world. I was mad at everything. I was mad at myself, mad at my situation. I was even mad about the band getting so big. I felt like we lost a little bit of touch with where things were going,” Fallon said in a recent telephone interview.
As the band’s members discussed a hiatus after wrapping the tour for 2014’s “Get Hurt” – a polarizing release inspired by Fallon’s divorce that broke out of the band’s Springsteen-meets-Bouncing-Souls punk shell – the singer had what he described as a “come to Jesus” with himself.
“‘You just need to start over. You need to take a break and you’re a little bit ungrateful, actually’ – that’s what I thought about myself,” he said. “‘You have all these people that come and see you and you have this successful band and you’re mad about everything. People would kill to be in your shoes.’”
Fallon resolved to embrace his good fortune, leave the anger and bitterness behind and look back to move forward.
“When you get in a wreck like that and you don’t really know where you’re at, I think the best place you can go is back to the beginning and start over,” Fallon said.
His rebirth is “Painkillers,” an intimate, honest portrait produced by Butch Walker and due March 11. Fallon starts a string of Southern California dates Friday, Feb. 12, at Pappy & Harriet’s in Pioneertown, with Jonny Two Bags (the alt-country alter ego of Social Distortion guitarist Jonny Wickersham) opening.
Fallon’s next sonic step is influenced in large part by reflecting upon his past. Growing up in New Jersey, the 36-year-old’s earliest musical memories were hymns his mother played around their home, among them redemption songs “Amazing Grace” and “Swing Low, Sweet Chariot.”
As Fallon grew older, she introduced him to folk music. He learned about Bob Dylan, Leonard Cohen and Bruce Springsteen, which led him to rock.
While Fallon has had other side projects (The Horrible Crowes, Molly and the Zombies), the solo record is a culmination of what he had wanted to do for years: craft a singer/songwriter record. Not that the concept was a radical left turn – going back to Gaslight Anthem’s debut, 2007’s “Sink or Swim,” the group’s albums included an acoustic track or two.
So between February and September 2015, Fallon went to work. He started at 9 a.m. daily, and his bedroom was his office, where he wrote about 20 songs on a Martin D-41 acoustic guitar. If a song wasn’t coming together, he would leave it for the next day.
Coming from a scenario where the members of the Gaslight Anthem bounced ideas around to mold a song into its best form, Fallon said calling all of the shots was both liberating and nerve-wracking.
Similarly, the unknown reaction of fans to the music is one of the things that has excited him as a performer. He debuted a lot of new material during a recent run on the East Coast.
“You could tell from note one that those people came there to listen. They didn’t come there to shout along. They didn’t come there to party or throw beers. They came there to listen to some music and that was clear from the moment I stepped on the stage. This is the raddest thing,” he said.
And that energy and intimacy have given Fallon a connection he had sorely missed from the old days in an up-and-coming band.
“I feel like I’m doing it for the first time again,” he said.
Contact the writer: @vanessafranko on Twitter

Tuesday, February 16, 2016

Why Antonin Scalia was a jurist of colossal consequence

February 14, 2016
Antonin Scalia, who combined a zest for intellectual combat with a vast talent for friendship, was a Roman candle of sparkling jurisprudential theories leavened by acerbic witticisms. The serrated edges of his most passionate dissents sometimes strained the court’s comity and occasionally limited his ability to proclaim what the late Justice William Brennan called the most important word in the court’s lexicon: “Five.” Scalia was, however, one of the most formidable thinkers among the 112 justices who have served on the court, and he often dissented in the hope of shaping a future replete with majorities steeped in principles he honed while in the minority.
Those principles include textualism and originalism: A justice’s job is to construe the text of the Constitution or of statutes by discerning and accepting the original meaning the words had to those who ratified or wrote them. These principles of judicial modesty were embraced by a generation of conservatives who recoiled from what they considered the unprincipled creation of rights by results-oriented Supreme Court justices and other jurists pursuing their preferred policy outcomes.
Today, however, America’s most interesting and potentially consequential argument about governance is not between conservatives and progressives but among conservatives. It concerns the proper scope of the judicial supervision of democracy.
Scalia worried more than some other conservatives do about the “counter-majoritarian dilemma” supposedly posed by judicial review — the power of appointed justices to overturn the work of elected legislators. Many Scalia-style conservatives distill their admiration into a familiar phrase of praise: “judicial restraint.” Increasing numbers of conservatives, however, reason as follows:
Democracy’s drama derives from the tension between the natural rights of individuals and the constructed right of the majority to have its way. Natural rights are affirmed by the Declaration of Independence; majority rule, circumscribed and modulated, is constructed by the Constitution. But as the Goldwater Institute’s Timothy Sandefur argues, the Declaration is logically as well as chronologically prior to the Constitution. The latter enables majority rule. It is, however, the judiciary’s duty to prevent majorities from abridging natural rights. After all, it is for the securing of such rights, the Declaration declares, that “governments are instituted among men.”
Scalia’s death will enkindle a debate missing from this year’s presidential campaign, a debate discomfiting for some conservatives: Do they want a passive court that is deferential to legislative majorities and to presidents who claim untrammeled powers deriving from national majorities? Or do they want a court actively engaged in defending liberty’s borders against unjustified encroachments by majorities?
This is an overdue argument that conservatism is now prepared for because of Scalia’s elegant mind. He was crucial to the creation of an alternative intellectual infrastructure for conservative law students. The Federalist Society, founded in 1982, has leavened the often monochrome liberalism of law schools, and Scalia has been the jurisprudential lodestar for tens of thousands of students in society chapters coast to coast.
Students of the court understand that, given Sen. Harry Reid’s demonstrated disdain for Senate rules, if Republicans had not won Senate control in the 2014 elections, the Nevada Democrat as majority leader would very likely now extend the institutional vandalism he committed in 2013. Then he changed Senate rules, by a simple majority vote and in the middle of a session, to prevent filibusters of judicial nominees other than Supreme Court nominees. This enabled President Obama to pack the nation’s second-most important court, that of the U.S. Court of Appeals for the District of Columbia Circuit. Were Reid still majority leader, the Senate’s only rule would be the whim of the majority of the moment, and his caucus would promptly proscribe filibusters of Supreme Court nominees.
One consequence would be this: The United States today is one Supreme Court vote away from a radical truncation of the First Amendment’s protection of freedom of speech. A Democratic president in 2017 would nominate to replace Scalia someone pledged to construe the amendment as permitting Congress to regulate political campaign speech, which would put First Amendment jurisprudence on a slippery slope to regarding all speech as eligible for regulation by the administrative state.
Scalia lived 27 years after the person who nominated him left office, thereby extending the reach of Ronald Reagan’s presidency and reminding voters of the long-lasting ripples that radiate from their presidential choices. A teacher, wrote Henry Adams, attains a kind of immortality because one never knows where a teacher’s influence ends. Scalia, always a teacher, will live on in the law and in the lives of unnumbered generations who will write, teach and construe it.
Read more from George F. Will’s archive or follow him on Facebook.

Remembering Scalia

A monumental jurist and great American

By James R. Copland
February 15, 2016

With the passing of Antonin Scalia, the legal world lost a titan. Justice Scalia was the sixteenth justice in U.S. history to begin his thirtieth year of service, and his impact on the Court was vast. In jurisprudential philosophy (his legal textualism and originalism), in real constitutional impact (his fierce fidelity to the written Constitution’s libertarian principles), and in legal craftsmanship (his remarkable writing), Scalia left a singular mark on American constitutional history.

Scalia was a quintessential American—and a quintessential New Yorker. He was the first Italian-American to serve on the Supreme Court: his father was an immigrant from Italy and his mother the child of Italian immigrants. Born in New Jersey, Scalia grew up in Queens; his father taught romance languages at Brooklyn College.

After starring at Xavier, an all-boys Jesuit high school in Manhattan, Scalia enrolled at the nation’s oldest Jesuit university, Georgetown, where he finished as class valedictorian. From there it was off to Harvard Law School, where he graduated magna cum laude. While at Harvard, he met Maureen McCarthy, whom he married. The Scalias had nine children, two of whom, Eugene and John, followed their father into the law. (Eugene, an acquaintance, is perhaps the highest-regarded administrative lawyer in the nation’s capital.)

Following law school, Scalia worked in private practice before going on to teach law at the University of Virginia and the University of Chicago, with an interregnum as assistant attorney general in the Office of Legal Counsel for the Nixon and Ford administrations. In 1982, Scalia turned down President Reagan’s offer of appointment to Chicago’s Seventh Circuit Court of Appeals, angling instead for the influential D.C. Circuit, to which the president appointed him later that year. In 1986, after Associate Justice William Rehnquist was elevated to chief justice, Reagan nominated Scalia to fill Rehnquist’s seat on the Supreme Court. The Senate confirmed Scalia’s appointment 98-0, which is remarkable in hindsight: little more than a year later, the Senate rejected the president’s appointment of Robert Bork in a fierce battle that set off the modern confirmation process.

If Bork brought “originalism” into the public discussion through his confirmation hearings, Justice Scalia brought it to life on the Supreme Court. In 1997, Scalia gave the Manhattan Institute’s annual Walter B. Wriston lecture. (Other speakers have included sociologist James Q. Wilson, economists Milton Friedman and Thomas Sowell, author Tom Wolfe, playwright David Mamet, businessman Rupert Murdoch, Secretary of State Condoleezza Rice, and—following Scalia—two of his fellow justices on the Supreme Court.) In his 1997 talk, Scalia remarked, “I am now something of a dodo bird among jurists and legal scholars. You can fire a cannon in the faculty lounge of any major law school in the country and not strike an originalist.” That may still be true on law campuses, but in 2010, Scalia’s future colleague Elena Kagan told the Senate in her confirmation hearing, “we are all originalists.”

Scalia’s philosophy of law was that words matter. In interpreting statutes, that meant that judges should look to the words legislators enacted rather than trying to discern their intent. If ambiguities arise, as they inevitably will, judges should turn to neutral principles—time-honored judicial canons—to figure them out. Scalia’s last book, Reading Law: The Interpretation of Legal Texts, published in 2012 and coauthored with Black’s Law Dictionary editor Bryan Garner, made just this point. In interpreting the Constitution, Scalia’s originalism meant that judges should look to the document’s text and apply its often open-ended provisions according to their original public meaning—what the words meant at the time the provisions were adopted. At its root, this theory holds that unelected judges should not overturn the will of elected representatives absent a clear constitutional mandate.

Thus, though he was an observant Catholic with libertarian leanings, Scalia endeavored never to impose his will over that of the people. If the Constitution was silent on an issue, then the Court should leave it to the people and their elected representatives to decide. Scalia’s pointed dissents in cases involving abortion or homosexuality may have won plaudits in the Vatican, but he was no less fervent in arguing that the Constitution’s Eighth Amendment prohibition on “cruel and unusual punishments” could not possibly apply to the death penalty, given the document’s contemporaneous invocation of “capital crimes.”

Where the Constitution spoke, however, Justice Scalia vigorously enforced its limitations on the states and the other branches of the government. In one of his earliest Supreme Court opinions, he wrote for a narrow five-justice majority in Nollan v. California Coastal Commission, in which the Court turned back, under the Fifth Amendment’s Takings Clause, the state’s attempt to condition a building permit on a property easement. The decision would presage Justice Scalia’s landmark opinion five years later in Lucas v. South Carolina Coastal Council, which forms the basis for the Court’s modern position on regulatory takings. The Takings Clause, which protects private property owners from having their land unduly seized by the government, remains a staple conservative issue—as seen in the GOP primary debates, where rival candidates have attacked Donald Trump’s commandeering of state and local eminent-domain powers to acquire property for his development projects.

In 2000, in Apprendi v. New Jersey, a case overturning a judicial sentence imposed on a criminal defendant based on facts that the jury never considered, Scalia wrote a concurrence invoking the Sixth Amendment right to a jury trial, in which he pointedly observed that said right “has never been efficient; but it has always been free.” This opinion laid the groundwork for his monumental opinion for the Court four years later in Blakely v. Washington, in which another five-justice majority—including Scalia’s fellow conservative Justice Clarence Thomas but also liberals Ruth Bader Ginsburg, John Paul Stevens, and David Souter—overturned the application of the state’s mandatory sentencing guidelines on the same rationale. Writing in dissent, Justice Sandra Day O’Connor rightly regarded the decision as “a Number 10 earthquake.” In short order, after a follow-on case, the federal sentencing guidelines were merely “advisory.”

Justice Scalia was an equally vigorous defender of the First and Second Amendments. He was of course in the five-justice majority affirming rights to political speech in Citizens United v. FEC (2010), but his defense of First Amendment free-speech rights was consistent and broad: he joined Justice William Brennan’s 1989 opinion in Texas v. Johnson, which overturned flag-desecration statutes as unconstitutional, and he wrote for the Court in Brown v. Entertainment Merchants Association (2011), which struck down a California law banning the sale of violent video games. And Scalia revivified the Second Amendment with his 2008 opinion for the Court in District of Columbia v. Heller, which struck down a handgun ban in the nation’s capital.

Scalia’s impact on our constitutional jurisprudence is thus hard to overstate. Much as the three-decade period of liberal constitutional expansion from the mid-fifties through the mid-eighties—with the Warren and Burger Courts—was dominated by the brilliant civil libertarian Brennan, so Scalia was the driving force behind the last three decades’ Rehnquist and Roberts Courts.
Beyond doctrine and decisions, Scalia will best be remembered for his writing, which is among the best in the Court’s history. In his introduction of Justice Scalia for the Wriston Lecture, my colleague Peter Huber observed that “he writes with the strength and passion of the great dissenters of the Court’s history, in the noble tradition of John Marshall Harlan and Oliver Wendell Holmes.” In a 1993 article in the Harvard Journal of Law & Public Policy, former U.S. solicitor general and Harvard law professor Charles Fried described Scalia’s “natural talent” for writing as of “the kind which distinguishes a Mozart from a Salieri.” Huber’s and Fried’s observations have now been justified empirically. In a 2014 computer analysis of judicial opinions’ vocabulary, University of Chicago law professors Adam Chilton and Eric Posner found that Scalia’s word usage surpassed not only that of all other then-current justices but also that of the great former justices on the Court—save the noted belletrist Holmes. Scalia’s vocabulary even came out “ahead” of Shakespeare’s.

When writing for Court majorities, Scalia had an atypical literary flair. In his opinion in Entertainment Merchants Association, rejecting the state’s notion that its video-game regulation should be upheld because it was intended to protect children, Scalia observed:
California’s argument would fare better if there were a longstanding tradition in this country of specially restricting children’s access to depictions of violence, but there is none. Certainly the books we give children to read—or read to them when they are younger—contain no shortage of gore. Grimm’s Fairy Tales, for example, are grim indeed. As her just deserts for trying to poison Snow White, the wicked queen is made to dance in red hot slippers “till she fell dead on the floor, a sad example of envy and jealousy.” Cinderella’s evil stepsisters have their eyes pecked out by doves. And Hansel and Gretel (children!) kill their captor by baking her in an oven.
But Scalia’s true mastery lay in the dissent. Objecting to the Court’s 1992 reaffirmation of abortion rights in Planned Parenthood v. Casey, Scalia noted, “The Imperial Judiciary lives.” In rejecting the Court’s decision to read “state” as “federal” in last summer’s Obamacare challenge, King v. Burwell, Scalia accused the majority of “interpretive jiggery-pokery” that was “pure applesauce.” Reacting to Justice Kennedy’s majority opinion recognizing a right to same-sex marriage in the Constitution, Scalia caustically proclaimed, “The Supreme Court of the United States has descended from the disciplined reasoning of John Marshall and Joseph Story to the mystical aphorisms of the fortune cookie.”

Belying the acerbic tone of his written opinions, however, Scalia—Nino to his friends—was personally close with the colleagues with whom he most regularly sparred. With his New York roots, Scalia was unsurprisingly a longtime theater and opera aficionado, and he regularly attended with Ginsburg, who had been his friend dating back to their time on the D.C. Circuit. After Elena Kagan joined the Court, he regularly took her hunting—a sport he enjoyed on Friday, his last day. A monumental jurist and a great New Yorker and American, Antonin Scalia will be deeply missed.