“Right-to-Work” vs. the Right to Work

Wisconsin is a pen stroke away from becoming the nation’s twenty-fifth “right-to-work” state. A bill bearing that Orwellian appellation will arrive on Scott Walker’s desk this Monday.

“Orwellian” because so-called right-to-work laws don’t establish employment as an affirmative right. Wisconsin’s Tea Party legislature hasn’t passed a job guarantee. Rather, the new law would provide workers in unionized shops the right to enjoy the higher wages that come with union representation, without having to pay the dues that sustain that representation.

The original right-to-work law was passed in 1947, under the less propagandistic title of “The Taft-Hartley Act.” A common misconception about contemporary “right-to-work” laws is that they prohibit unionized businesses from conditioning employment on union membership. In fact, that prohibition has been in place for nearly 70 years. Taft-Hartley ended the “closed shop” era of American labor.

But the law also required every union to provide all the benefits of union membership to the non-member employees of its shop. And these aren’t limited to contractual benefits secured through collective bargaining. As Forbes’ Rick Ungar explains, the union is also required to provide full legal representation for a non-member who alleges wrongful termination:

“So rock solid is this obligation that should the non-union member employee be displeased with the quality of the fight the union has put forth on his or her behalf, that non-union member has the right to sue the union for failing to prosecute as good a defense as would be expected by a wrongfully terminated union member.”

To compensate for this remarkable obligation, Taft-Hartley required these non-members to pay “agency fees,” which were defined as that portion of a union’s dues devoted to the costs of workers’ services, as opposed to the fraction invested in political action.

Walker’s law frees non-member employees of all obligations to their employer’s unions, while maintaining the obligations those unions owe to non-members. The result is a free-rider problem, and with it, diminishing union membership and political power.

Capital chases weak labor like lions chase arthritic gazelles. Or so the data most frequently cited by advocates of right-to-work seems to suggest. Between 2003 and 2013, the 24 right-to-work states added 2.1 million more jobs than the 26 others, and boasted a 12.3 percent greater increase in manufacturing GDP.

But there are costs to courting corporate investment through the cultivation of a pliant workforce. Right-to-work states have a higher concentration of low-wage jobs and lower median household income than other states. They also invest 31.3 percent less in education and experience a staggering 54.4 percent higher rate of workplace deaths.*

For decades, the Republican Party’s pitch to American workers has been that unions and government programs actually work against their own self-interest. By introducing uncertainty into the economy, these inefficient interlopers stymie the one true source of higher living standards: economic growth.

If America’s political system weren’t so thoroughly corrupted by the influence of moneyed donors, that pitch would need to be revised for 2016.

For the past four decades, American workers’ wages have failed to keep pace with their productivity. Since the year 2000, productivity has risen 23 percent while inflation-adjusted wages have essentially stagnated.

The financial crisis has made this divorce between workers’ productivity and wages more apparent. Virtually all the income gains produced by the last two years of economic growth have accrued to the one percent. Justin Wolfers of The Upshot finds that the average income of a one-percenter rose from $871,100 in 2009 to $968,000 by the end of 2013. That same period saw the average income of the remaining 99 percent of workers fall from $44,000 to $43,900.

The reality of elite economic domination has become so stark, even Larry Summers can see it.

A leading intellect behind the Democratic Party’s shift away from aggressive fiscal and regulatory policy, Summers now says that only government intervention can lift the struggling middle class.

At a panel on “the future of work” hosted by the centrist Democratic think tank The Hamilton Project, Summers dismissed the idea that unemployment could be reduced through education and job training programs, arguing that the nation’s core economic problem is that:

“there aren’t enough jobs, and if you help some people, you can help them get the jobs, but then someone else won’t get the jobs. And unless you’re doing things that are affecting the demand for jobs, you’re helping people win a race to get a finite number of jobs, and there are only so many of them.”

Summers argues that, rather than training workers for jobs that don’t exist, the government could more effectively reduce unemployment by simply paying people to do all the valuable work that the private sector lacks financial incentive to perform, like taking care of old people or repairing decaying bridges.

The Tea Party and the Teamsters may not agree on the economic ramifications of right-to-work laws, but there’s no controversy about their political consequences; by undermining unions, the laws undermine liberal political organizing and fundraising.

If Summers’ analysis is correct, and full employment can’t be achieved without increasing government investment, then the most vital function of organized labor for the American working class may be its political one. At a time when the top one percent own more than a third of the country, and a couple of objectivist oil tycoons plan to spend nearly a billion dollars on our next election, workers need organized labor to counter the political dominance of organized capital.

The G.O.P. goes into 2016 seriously handicapped by its slavish commitment to the interests of America’s most sociopathic oligarchs. A Pew Poll released this week found that 62 percent of Americans believe “the economic system unfairly favors powerful interests.”

Whether Democrats are able to take advantage may depend on the capacity of an emaciated labor movement to counter the influence of the party’s own plutocratic (but pro-choice!) contingent.

On Monday, Scott Walker will make that movement a little less capable.

*All these statistics need to be taken with a dollop of soy sauce.

For one thing, right-to-work states proliferate through the South and Midwest, where wages have long lagged behind those of the more heavily industrialized Northeast. Such states also boast lower land and living costs, which entice business investment irrespective of labor policy. The stronger growth of jobs and productivity in these states is less impressive when one considers the possibility that they’re merely catching up with their more urbanized peers.

Still, the data seems to broadly support the logical supposition that undermining unions weakens labor, which encourages capital investment, which creates a high number of low-paying jobs.


The Moral Logic of Harm Production

My last post argued that welfare dependence and drug-use are linked less by empirical evidence than by the just-world bias of conservative ideology.

But that link isn’t the only premise of Scott Walker’s plan to drug-test recipients of state aid that merits scrutiny.

Walker has said that denying drug users access to food stamps is “not a punitive measure” but a way of “getting people ready for work.” Thus, the plan is premised on the idea that by providing impoverished drug-users basic nutrition, the state enables them to remain intoxicated and unemployed.

This runs counter to the logic of the modern GOP’s most successful anti-poverty program.

One of the few bright spots for progressives in the legacy of Bush II was a sharp decline in the nation’s rate of chronic homelessness. That decline was widely attributed to the administration’s adoption of a “housing first” policy.

For a long time, the paradigm in homeless care was to target those behaviors that seemed to perpetuate the helplessness of the homeless individual. This was done by conditioning the provision of housing on enrollment in sobriety and job-training programs.

“Housing first” disrupted this model, suggesting that a more effective and affordable approach to reducing homelessness was to provide immediate, unconditional housing to the most dysfunctional sector of the homeless population. The theory’s proponents argued that it was cheaper to house these individuals than to keep them on the street, where they were racking up irredeemable debts to state shelters, hospitals and jails. Further, they argued that once comfortably housed, these individuals would be more receptive to self-improvement programs offered as elective opportunities, rather than coercive obligations.

The second Bush administration adopted the approach in 2002. Between 2005 and 2007, chronic homelessness fell by an unprecedented 30 percent, and continued to decline through the recession, buoyed by the Obama administration’s $1.5 billion investment in “Homelessness Prevention and Rapid Re-Housing.”

Chronic homelessness and malnutrition are discrete deprivations. The fiscal costs imposed by the homeless are greater than those imposed by the hungry. Still, the success of “housing first” challenges the idea that the most effective and thrifty way for the state to reform dysfunctional poor people is to exacerbate their poverty.

But what makes Walker’s idea truly dangerous is that it’s immune to empirical challenge, because it proceeds from moral principle.

One of the first fights Walker picked after being elected Milwaukee County Executive in 2002 was against harm-reduction approaches to drug policy. In his first budget, he eliminated $230 million in funding for a needle exchange program run by the AIDS Resource Center of Wisconsin. Defending his position against unanimous opposition from the county board’s Finance and Audit Committee, Walker said, “I have a hard time believing that a majority of Milwaukee County taxpayers want their tax dollars going to pay to continue the habits of illegal drug users.”

Walker didn’t base his opposition on (non-existent) scientific data linking needle-exchange programs with increased rates of heroin use. He didn’t attempt to grapple with overwhelming evidence of the efficacy of needle-exchange programs in combating the spread of HIV. His opposition was premised on the same moral principle that compels him to deny food assistance to drug users: The state mustn’t insulate individuals from the consequences of their illegal behavior.

That principle has broad political resonance and policy implications.

Facing an epidemic of heroin-related deaths, legislators in Kentucky and Maine are trying to expand access to an overdose antidote called naloxone. They are butting up against other legislators who are compelled by Walker’s principle to oppose any measure that would reduce the incidence of heroin fatalities without reducing the incidence of use.

In a recent longform investigation, The Huffington Post showed how an iteration of this principle may be at the root of that very epidemic. In the piece, reporter Jason Cherkis documents how rehab facilities across the United States have resisted the medical consensus that replacement therapy via drugs like methadone and Suboxone is the most effective treatment for heroin addiction. These facilities are ideologically committed to an “abstinence-only” paradigm that can’t accommodate such findings.

Their paradigm is reinforced by that of many drug court judges, who prohibit addicts from pursuing recovery via methadone. The jurisprudence of Kenton County Judge Gregory Bartlett illustrates their ideology’s proud indifference to medical science:

“Bartlett thinks one solution to the heroin epidemic might be a mandatory stint in a detox facility…But when it was suggested that detoxing without medication can lead to overdoses, Bartlett came up short. “I’ll take your word on that,” the judge replied. “I’m not an expert on what works and what doesn’t work.”

The cost of prizing ideology over “what works” can be measured in the lives of countless addicts who were desperate to get well. As Cherkis explains, abstinence-only treatment has a habit of killing those it doesn’t cure:

“A sober addict leaves a treatment program with the physical cravings still strong but his tolerance gone. Shooting the same amount of heroin the addict was used to before treatment can more easily lead to a fatal overdose.”

Following Cherkis’ report, the Obama administration announced that they would withhold federal funding from any drug court that denied addicts access to methadone.

It’s difficult to imagine President Walker putting public health ahead of moralism.

It’s difficult to imagine President Walker.

On Scott Walker’s Unintelligent Designs for Welfare


Last Wednesday in London, a reporter asked Scott Walker if he believed in evolution. Walker replied, “I’m going to punt on that one,” and down every hallway of left-wing media, the “gaffe” alarms did blare.

But there’s actually nothing irrational about the Wisconsin Governor’s refusal to endorse rationalism.

A Pew report published in January found that a full 55 percent of Americans believe in some form of intelligent design. That percentage is undoubtedly higher among the heavily evangelical Republican primary electorate, and undoubtedly lower among the highly educated corporatists of the Republican establishment. Considering that the central premise of Walker’s bid for his party’s nomination is his capacity to satisfy these disparate camps, it makes sense for him not to take sides on the origin of species.

But while Republican primary voters and donors are divided on biblical literalism, they are united by a faith in the infallible righteousness of our economic hierarchy. Among the tenets of this shared creed is the belief that America’s poor arrive at their deprivation through their own bad choices and/or the bad choices the social safety hammock seduces them into. And to this unscientific worldview, Walker has pledged his full support.

In his State of the State Address earlier this month, Governor Walker proposed a law requiring every applicant for unemployment benefits or food stamps in Wisconsin to pass a drug test. Twelve states have already instituted mandatory drug testing for those seeking benefits from Temporary Assistance for Needy Families (TANF), the program most synonymous with “welfare.” But even in the context of existing state laws, Walker’s proposal is extreme.

No state has attempted to impose mandatory drug testing on the beneficiaries of unemployment benefits or food stamps, because states don’t actually have the authority to do so without federal approval. But Walker’s proposal doesn’t just pick a fight with the Executive branch; it also runs afoul of the Judiciary.

In December of 2013, the 11th Circuit Court of Appeals ruled that a Florida law requiring all TANF applicants to submit to drug testing violated the Fourth Amendment’s prohibition against unreasonable searches. The Court argued that the state had failed to demonstrate a “more prevalent, unique or different drug problem among TANF applicants than in the general population.” Therefore, Florida lacked the reasonable suspicion necessary to search the urine of its most desperate citizens for narcotic residue.

The majority of states that drug test TANF applicants avoid this pitfall by only testing those they have independent reason for suspecting of drug use.

Thus, the constitutionality of Walker’s proposal is premised on the idea that if a Wisconsinite is in need of unemployment benefits or food assistance, it is reasonable for the state to suspect he has a drug problem. This idea has no more empirical support than that of a Universe created in seven days by an all-powerful, white-bearded misogynist.

This week, Think Progress reported that in Tennessee’s first six months of drug testing TANF applicants, 0.2 percent tested positive. Granted, Tennessee first screens its applicants with a questionnaire on past drug use and only tests those whose answers provoke suspicion. Still, within that carefully selected sample only 13 percent tested positive.

Utah invested $30,000 into a similar testing model to find that exactly 12 of its neediest citizens were verifiable drug users. Which was 11 more than Arizona’s program dug up in its first three years of existence. And during Florida’s short-lived experiment with indiscriminately drug testing TANF applicants, just 2 percent tested positive, which meant TANF applicants were 4 times less likely than average Floridians to have illicit substances in their bloodstreams.

It’s possible that mandating drug tests preempts some drug users from applying for benefits to begin with. But even if that were the case, there’s no reason to think the tests would be in anyone’s best interest.

For one thing, the drug most easily identified by these tests is marijuana, a substance that is legal for recreational use in four states, and demonstrably less harmful and habit-forming than alcohol. Starving impoverished children to punish their mothers for smoking joints is not a rational public policy.

Even in the case of problematic drug users, it’s not clear what withholding aid is meant to accomplish. Heroin addicts aren’t known for prioritizing nutrition over a fix when resources run low.

In Florida’s case, drug testing didn’t even serve the narrowest conception of the taxpayer’s self-interest; administering the tests cost the state $45,780 more than it saved in withheld benefits.

In other words, drug testing welfare recipients proved to be a costly government program that violated the constitutional rights of individuals, while benefiting no one save the bureaucrats and cronies who profited off its administration.

And yet 12 statehouses are currently considering similar legislation. Why does this policy command energetic conservative support despite mounting evidence of its inefficacy?

One answer is that wage stagnation is a bipartisan phenomenon. Conservatives in the middle and working classes aren’t any better served by the status quo than their liberal peers.

Forbidden by faith from attributing their plight to an increasingly powerful and unaccountable economic elite, they blame their declining living standards on the turpitude of the drug-addicted poor.

In a saner world, it would be scandalous for a presidential candidate to endorse such superstitions.

On the Non-Indictment in the Killing of Eric Garner

[Photo: Garner protesters block the West Side Highway in New York City]

The following is a guest post by UWS_DAD43. The views expressed are those of its author and do not reflect the views of “The Guilt of a Liberal” blog or its imaginary advertisers.

The failure to indict Daniel Pantaleo in the choking death of Eric Garner has once again exposed the idea of a “post-racial America” for the lie that it is.

While traffic itself does not discriminate on the basis of color or creed, the uncomfortable truth is that the portion of New Yorkers who commute via automobile are predominately Caucasian. And so whenever an unarmed black man is killed with impunity, the weight of protester-induced traffic falls disproportionately on the backs of white and beige Americans.

The fact is, white people have been inconvenienced by the repercussions of violence against black people since the dawn of our nation. If you’re unfamiliar with this troubling aspect of our history, there’s an excellent PBS documentary on the subject called “The Civil War.”

But for me, white inconvenience is not some abstract, academic subject. The experience of being an inconvenienced white man in America is one I live with every day.

I left my law office in Nolita on the night of November 25th with a spring in my step. After a long day of helping expropriators evade taxation, I was fifteen minutes away from my wife’s famous veal piccata, and the last three episodes of Mad Men’s fifth season (a superlative slice of premium cable that dramatizes the terrible inconveniences faced by white men in the advertising world of the 1950s and 60s).

It was not until I had merged onto the West Side Highway that I remembered: It had been less than 24 hours since a Ferguson Grand Jury had declined to indict Darren Wilson.

I would not arrive home that night until after 10 p.m.

The veal piccata had to be reheated, and was thereby rendered noticeably dry. Despite its propulsive narrative momentum, I was so exhausted by the end of a single episode of Mad Men that I quickly retired to my bedroom, made hasty love to my half-asleep wife, and passed out. It would be several days until I found out whether Joan’s big gamble would be enough to land the Jaguar account.

In the weeks since that awful night, some “friends” have said to me, “You should have checked the traffic report. If you had just done what the update advised, this never would have happened.” Such comments do not make me angry so much as sad: It is truly a pity that the propensity to blame victims for their own misfortunes is such a resilient feature of human psychology.

Others have suggested my experience couldn’t have been that bad. After all, was I not eager to catch up on the Serial podcast?

What those who have never been white and inconvenienced don’t understand is that even the battery of an iPhone 6 can sometimes die. It’s true that for the first two hours of my ordeal, I was able to make significant progress in Serial (a superlative podcast about the terrible inconveniences faced by white reporters trying to solve complicated mysteries). But for the last 45 minutes of my traffic jam there were no podcasts to be heard. I had become so reliant on my iPhone for in-car entertainment that I no longer kept a book of CDs in my BMW’s glove compartment.

With no diversion save satellite radio, I was left to stare at the multi-ethnic coterie of young people and their signs. Eventually, the psychedelic virtuosity of Coldplay was not enough to divert my attention, and I was left to contemplate my complicity in an exploitive economic system built on white supremacy.

Which is something that no American should ever have to do, regardless of their skin color.

In the coming weeks, the protests will likely thin. Gradually, we will think less and less of the disturbances of the past month. We will give ourselves over to the mundane rhythms of daily life.

Until it happens again.

And rest assured, it will happen again, so long as we indulge in such forgetting.

As the father of a young white man who will be 18 in January, I do not have the luxury to forget. Last week, I had a painful conversation with my son that few non-white parents will ever have to experience.

I told him never to leave the house with his iPhone uncharged, because the sad reality is that any time my son gets behind the wheel of a car, somewhere in this country, an unarmed black man could be killed for no reason, and my little Tyler could be confronted by an awareness of systemic injustice that I’ve spent my whole adult life trying to protect him from.

I will not be complacent while the oppression of black Americans continues to inconvenience my son, and other young men who look like him.

While I am not normally one to spend my free time on social media, it is critical that inconvenienced white people let their friends and families know that #WhiteDrivesMatter.

The day after my ordeal, I composed a Facebook status articulating many of these grievances. That status has since accrued over 20 likes, and sparked a lively debate in the comment thread beneath it. I will conclude with just a couple of the insightful observations my activism has already generated:

[Screenshot of the Facebook comment thread]

Amen, April, amen.

Revealing Camouflage

[Photo: Ferguson, Missouri, 2014]

Above is a scene from Ferguson, Missouri in 2014. 

If you don’t know the context for this picture, and assume there must have been a minor terrorist attack in “fly-over” America that you somehow managed to miss, witness testimony suggests you’re right. Ferguson, Missouri has been terrorized.

For details on what the killing (/almost certain murder) of 18-year-old Michael Brown has wrought, Greg Howard’s write-up at The Concourse, “America is Not for Black People,” is an excellent primer.

Put in the simplest terms: Another unarmed black kid was executed by an agent of the state without charge or trial. This time in broad daylight, fleeing from police, with his arms raised above his head. This time the kid was college-bound. This time the local police will not reveal just how many bullets were pumped into his body. This time his community was reminded that neither the achievement of secondary education, nor the universal gesture of unthreatening submission can guarantee their sons exemption from sacrifice on the altar of white fear. This time non-violent protests insisting there could be no peace without justice gave way to riots.

And now cops dressed as soldiers stalk the streets with assault rifles blazing and when residents voice their displeasure from the confines of their own backyards, they are told over megaphone to return to their homes. When they reply that these are their homes, they are rebutted by tear gas.

Howard writes of the militarization of the American police:

“The worst part of outfitting our police officers as soldiers has been psychological. Give a man access to drones, tanks, and body armor, and he’ll reasonably think that his job isn’t simply to maintain peace, but to eradicate danger. Instead of protecting and serving, police are searching and destroying.”

It is possible to imagine arguments for the utility of domestic drones, tanks, and body armor. Not strong arguments, but at least logically coherent ones. We live in a country with exceptional rates of gun ownership, where military-grade assault rifles are a leisure item. We live in a world with poorly guarded nuclear stockpiles and nihilistic terrorist cells. But what is the purpose of patrolling the streets of Ferguson, Missouri, in outfits designed to blend into a Vietnamese jungle?

There is no utility to these uniforms. They are not worn for comfort. They are a fashion statement. They tell the residents of Ferguson that the United States government is prepared to engage them on the same terms it engaged the indigenous of Southeast Asia. They say that the moment their community fails to bear systemic oppression with stoicism, the moment one of their number breaks a storefront window, their suburb will no longer be treated as the home of democratic citizens but as a hotbed of insurgency.

In the most generous assessment, the police are only projecting this message because it was more affordable than buying traditional police uniforms, the bulk of these SWAT outfits being donated army surplus. But when small towns in Indiana can afford their own armored SWAT vehicles, the fact that municipalities across the nation feel it extravagant to spend grant money on a few extra police uniforms conveys to community residents the same message as the camo itself: Your government is more concerned with maintaining the established order than even a semblance of democracy.

Camouflage fatigues have never been so revealing.

 

 

Before They Were Famous: 5 Celebrities’ Shocking Former Lives

When we see our favorite stars lighting up the big screen or strutting down the red carpet, it’s easy to forget that before all the glitz and glamour, celebrities started out as regular single-celled organisms, struggling to metabolize energy in the primordial soup of ancient seas. For most, it took billions of years of hard work and reincarnation to get where they are now. And while some celebs took conventional paths to stardom, others made a few odd detours along the way. We promise, once you read our list of celebrities’ shocking former lives, you’ll never look at these icons the same way again:

1. Helen Hunt


In 2000, Helen Hunt had Mel Gibson wondering What Women Want. But in the late Cretaceous period, all Helen wanted was to grasp the head of her mate between her spiked forelegs and crush his skull with her mandibles. Feeling her partner’s flesh become nutrients inside her, as his hot fertile sperm gushed over her egg sac, Helen experienced a sense of well-being that she would have described as As Good as It Gets.

 

2. Taye Diggs


Long before he was helping Stella Get Her Groove Back, Taye Diggs was a middle-aged fisherman living off the coast of Southern China in the last days of the Ming Dynasty. Living on a junk ship made from softwoods, Diggs (or “Li Wan” as he was known then) lived a simple life, netting enough crabs and codfish to support a wife and three children. One day in the summer of his thirty-second year, Taye Diggs’ ship was overtaken by Portuguese raiders, who broke his arms and abducted his eldest daughter. For years after, he would hear her terrified screams almost nightly.

In one dream, Diggs would bolt from his bed to the ship’s bow, where he’d look down to find his first-born child gasping and flailing in the moonlit water. Taye Diggs would reach down to grab her hands, but as soon as he got hold of her, she’d become heavy as stone, dragging him down into the water, down and down, to parts of the sea the moonlight never reached. In another dream, he’d discover her in the hull of a raider’s ship, her limbs tied to bedposts, and he would stab the swarthy man looming over her until the whole world turned to blood. But even in his wildest dreams, Diggs never imagined that he would one day star as Dr. Sam Bennett in ABC’s Private Practice.

 

3. Jon Bon Jovi


In 1986, Jon Bon Jovi may have been “Living on a Prayer,” but in 1896, he was living as a Yorkshire terrier in a clothing mill in Northern England. There, young Jon distinguished himself as an excellent hunter of rats, earning the esteem of the mill’s Scottish laborers, who kept him fed through a harsh winter that claimed four of his siblings.

That January, a 19-year-old worker named Thomas Murdoch had his hand crushed in a spinning mule. Murdoch spent most of the following afternoon seated on the floor of the mill, running his remaining fingers through Jon Bon Jovi’s matted fur, while weeping at the pulsing pain in his left hand. Through choked sobs, the boy whispered that his life was now worth less than a dog’s. Bon Jovi found the whimpering noises unsettling in an abstract way, but it was so nice to have a warm hand scratching gently at a belly filled with rats.

 

4. Lana Del Rey


A lot of people know that Lana Del Rey released her first EP in 2008, under the name “Lizzy Grant.” But did you know she also used to be a field of corn?

For the first three decades of the 20th century, Lana’s spirit was reborn every spring through 100 acres of bright green stalks. Back then, the death-obsessed songstress felt no anxiety about her own mortality, though like all corn, she was deeply aware of it. Del Rey found solace in the notion of eternal return, in the understanding that each fall’s withering was but one phase in a cycle that promised spring’s infinite recurrence.

The Howard family had lived on Del Rey’s farm for over a century. In 1932, Mary Howard began to suffer the symptoms of a degenerative muscle disease for which there was no cure. Much like Lana, Mary found solace in her protracted withering by understanding herself as one link in a chain of generations. She watched her sons grow into men and imagined the simple joys of their rural childhood reproduced through an endless succession of future children. In October of ‘36, when her first granddaughter was dead of malnutrition and there were still no signs of rain, Mary convinced her sons to provide her a fatal dose of laudanum. Months earlier, as her last living roots coughed up dust, Lana Del Rey conceived the title of her debut album.

 

5. Kevin James


Kevin James was born on the floor of a forest in Northern Spain, in the last decade of the 18th century. His mother was a 24-year-old Basque woman with a cleft lip and no husband. She lived in a small house with her widowed stepfather, who’d been drunk every day since he’d raped her. For all her terrified sorrow and exhaustion, there was also the relief of one fewer unspeakable reality to carry with her, as she walked away from the dirt where he lay wailing.

Back then, leaves scratched Kevin James’ wet body as he shivered. Sunlight stung his eyes, birdsong and the rustling of tree branches vibrated against his eardrums. Two hundred years before his breakout appearance on Everybody Loves Raymond, Kevin’s world was an inferno of unintelligible information. All of which receded as his thirst and hunger grew, over four days that felt like centuries, until he found himself somehow above his own convulsing body, rising to a point outside of Spain and space and time, from where he understood the horror of his life as the horror of all life, as the terrifying vulnerability of being a creature with a stomach and a nervous system and a conscious mind shaped by the cruel combination of a desire for significance and awareness of mortality, alone at the hollow center of the self, in the silence that follows another failed one-liner, a creature who waits tables and donates sperm and keeps trying only because making people laugh helps him to forget the things that happy people can’t remember, who sweats bullets in Jay Leno’s green room while whispering observations about how god damn fat he is, who watches his sitcom debut at a bar in his hometown of Mineola surrounded by friends and family who make him feel like the one that Everybody Loves, a creature who stops for the young blonde woman from the E-Channel outside the premiere of Paul Blart: Mall Cop, savoring the warmth of an April evening, of his wife’s hand against his back, of living the kind of life that other people die for, as his tiny heart stopped beating, Kevin James felt himself dissipate in waves of unexpected gratitude and said a silent prayer for every fly writhing on a web, every canary coughing in a coal mine, every failed novelist clicking refresh again and again on a porn clip that only ever buffers, a prayer he’d echo centuries later, telling ‘E’ that he was proof that anyone could make it, as long as they believed.

 

 

Ayaan Hirsi Ali and Her Holy Warriors


Last week, liberal fascists claimed another victim, as students at Brandeis University compelled their school to withdraw an offer of an honorary degree to Ayaan Hirsi Ali. Ali was born in Somalia, where, at the age of five, she was forced to undergo female genital mutilation. She was granted asylum from the Somali Civil War by the Netherlands in 1994. She later served in the Dutch parliament, until revelations that she’d actually been residing in Kenya at the time of her asylum request forced her resignation.

She wrote the screenplay to Theo Van Gogh’s 2004 film Submission, which juxtaposed passages from the Qur’an with images of an Islamic woman being abused. The film got Van Gogh assassinated by a Muslim extremist, and sent Ali into hiding. She now runs a foundation that aims to protect American Muslim women from abuse by their religion’s traditionalists.

The change.org petition that cost Ali her honorary degree acknowledges the legitimacy of her grievances with Islam, but condemns the “hate speech” through which she expresses them. The petition quotes her as saying:

“Violence is inherent in Islam – it’s a destructive, nihilistic cult of death. It legitimates murder…the battle against terrorism will ultimately be lost unless we realise that it’s not just with extremist elements within Islam, but the ideology of Islam itself….”

Ali told Reason Magazine in 2007, “There are Muslims who are passive, who don’t all follow the rules of Islam, but there’s really only one Islam, defined as submission to the will of God. There’s nothing moderate about it.”

Ironically, Bill Kristol’s piece accusing Brandeis of applying an outrageous double standard, one that allows for hateful criticism of Judaism but not a fair critique of Islam, exemplifies the very opposite hypocrisy. Kristol notes that the University bestowed an honorary degree on playwright Tony Kushner in 2006, despite the fact that Kushner had “called the creation of Israel as a Jewish state ‘a mistake’” and attacked Israel for “ethnic cleansing.” He also notes that Brandeis had seen fit to provide Desmond Tutu an honorary degree, despite Tutu having characterized the “Jewish lobby” as “too powerful.”

Tellingly, Kristol does not excerpt and explicitly defend Ali’s comments. Likely because placing Ali’s condemnations of Islam as “a nihilistic death cult” next to Kushner’s and Tutu’s critiques of AIPAC and Zionism would reveal how categorically different the two sets of statements are.

Yet it’s not just Bill “I grind Iraqi bones to make my bread” Kristol who obscures the violence of Ali’s rhetoric by equating it to Kushner’s. Even the sporadically reasonable Andrew Sullivan complains of Brandeis’ supposed hypocrisy:

“Kushner was challenging his own ethnic group just as powerfully as Hirsi Ali is challenging her own. But here is the question: why is he lionized and Hirsi Ali disinvited? Why are provocative ideas on the “right” less legitimate than provocative ideas on the left?”

If our national discourse didn’t privilege the voices of religious Jews and Christians, while vilifying those of Muslims, it would be obvious that no political bias is required to distinguish Kushner’s statements from Ali’s.

Whatever one’s opinion on the necessity of a Jewish state, it is a fact that a portion of the Jewish community has been opposed to state Zionism for centuries. Whatever one’s feelings on Israel, it is a fact, confirmed even in the work of Zionist historians like Benny Morris, that hundreds of thousands of Palestinians were forced from their homes by Israeli soldiers in 1948. Thus Kushner’s statements align him with a minority position in the Jewish community, and assert an historical fact. Ali’s statements assert that no form of Islam deserves our tolerance, because inherent to the religion is a violent fascism that must be defeated. Far from equivalent, the statements are hardly comparable.

Ali argues that Islam is incompatible with pluralistic democracy, because inherent to its ideology is an absolute “submission to god’s will.” And it is certainly possible to observe the pathology of the faith’s emphasis on submission to authority, in the acts of misogyny and violence perpetrated by its most extreme adherents. But it seems to me as simple to argue that Judaism is fundamentally immoderate, because inherent to the religion is an idea of ethnic supremacy, its ideology built around the notion of a “chosen people,” an in-group defined by matrilineal bloodline, endowed by God with a higher moral status than all others. One could confirm such a view by pointing to the treatment of Palestinians in the occupied territories, or that of African migrants seeking refuge in Israel.

This claim is, of course, absurd. Great numbers of Jews find in their faith an obligation to honor the moral worth of all human beings. The same is true for a great many Muslims.

I am not arguing that in our present moment, fundamentalist Judaism presents as great a threat to egalitarian democracy as fundamentalist Islam. Rather, I submit that every ideology, whether religious or secular, is capable of inspiring violence and oppression. It’s plausible that Islam is especially vulnerable to such interpretation, but not that it is uniquely so.

In her past rhetoric, Ali exceptionalizes Islam, defining the faith by the violence it inspires. According to the petition, when an interviewer noted the role Christianity had played in bringing about social progress, exemplified by its role in the American abolition movement, and then asked Ali whether Islam could ever inspire such change, she replied, “Only if Islam is defeated.” When asked if she meant radical Islam, she responded, “No. Islam, period.”

By defining every sect of the world’s most popular faith as an enemy of progress that must be destroyed, Ali expresses precisely the sort of violent, absolutist ideology she seeks to condemn.

In his book “I Don’t Believe in Atheists,” Chris Hedges writes, “The danger is not Islam or Christianity or any other religion. It is the human heart—the capacity we all have for evil. All human institutions with a lust for power give their utopian visions divine sanction.”

Hedges’ book warns that in positing religion as the one true source of human atrocity, new atheists like Sam Harris “externalize evil.” And once evil is no longer understood as a tendency inside all of us, which must be stifled through scrupulous self-reflection, it becomes a quality peculiar to them, whom we must destroy by any means. This externalization of evil is what allowed Harris to defend torture as a tool for the protection of human rights, and Hitchens to defend the illegal invasion of Iraq as a way of protecting a liberal world order.

Ali’s rhetoric articulates the logic that allows our government to routinely violate the civil rights of its Muslim citizens, and to name every brown boy killed in a drone strike an enemy combatant. She argues that because some of the Qur’an’s rhetoric legitimates violence, everything associated with the book is poisoned. It seems only fair then for her opponents to apply the same logic to Ali herself. Because when neoconservatives like Bill Kristol and John Podhoretz defend Ali, they aren’t defending the prerogative to call a violent ideology what it is. They are defending the sanctity of their own jihad.