Signal and Noise: A Response to the Newtown Tragedy

As the nation continues to reel from the news and aftershock of yet another mass shooting, many of us are wrestling with our own thoughts and emotions. We review the facts and details as they become available, and from there we try to create a rational explanation for a monstrous and irrational act. It’s a fool’s errand, really, like trying to negotiate with a madman, and yet we make the attempts. And when we fail, as reasonable people are most likely to do, we look for ways out of the conundrum.

Mostly, we focus on ways to fit this communal tragedy into our own personal understanding of the world, however short-sighted or narrow-minded that understanding may be. In the libraries of our minds, we try to shelve this latest tale of terror under psychology or religion or politics or feminism. “Experts” from all of these disciplines have appeared on a range of talk shows and news broadcasts lately, variously depicting the shooter as a godless gun nut, an insecure young white male, an autistic videogame addict.

I’m normally drawn to literature for explorations of human madness, but this time around, I’m finding more appropriate parallels and metaphors in music, specifically in the science of sound. Maybe this is due to the title of Nate Silver’s recent book, The Signal and the Noise, which looks to mathematics and statistics as they relate to pundits and would-be prognosticators. More specifically, Silver discusses attempts to predict future events, such as elections, and wonders why most of these attempts fail.

At this moment, here’s the future-oriented question most of us struggle to answer definitively: How can we keep something like the Newtown killing from ever happening again?

Such a complex question inevitably inspires a great deal of “noise,” the static and chatter that surrounds and distorts a sensible discussion, however tangential its relevance. For example, we’ve heard many conversations about “mental illness” despite the lack of any confirmed diagnosis in the case of the Newtown shooter. Others have attributed his motives to celebrity-seeking, without any evidence to that effect, and thereby condemned the media. One person ironically hijacked the persona of a media celebrity (Morgan Freeman) in order to make his own anti-media point more popular in the world of social media.

All of this is noise. When we listen to it and perpetuate it, we lose track of the signal: the clear message buried inside.

Signal and noise have been core components of electronic music for decades now. I grew up during a time in which music was transformed by the appearance of the synthesizer. These electronic devices use oscillators to generate a source tone (a discrete sound like a sine wave, say, or a random signal like white noise) that is subsequently shaped by a series of filters and envelopes to create a final note or sound (helicopter rotors for a Pink Floyd album, for example). The resulting signal can then be amplified and heard through headphones or stereo speakers.
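Curious readers can see that signal chain in miniature. Below is a minimal sketch in Python (assuming numpy is installed; the sample rate, the 440 Hz tone, and the half-strength noise mix are my own illustrative choices, not the settings of any real synthesizer) that generates the two kinds of source material mentioned above:

```python
import numpy as np

SAMPLE_RATE = 44100  # samples per second; a common audio rate

def sine_wave(freq_hz: float, duration_s: float) -> np.ndarray:
    """A pure tone: the cleanest possible signal."""
    t = np.linspace(0.0, duration_s, int(SAMPLE_RATE * duration_s), endpoint=False)
    return np.sin(2.0 * np.pi * freq_hz * t)

def white_noise(duration_s: float) -> np.ndarray:
    """Random samples with energy spread across all frequencies."""
    return np.random.uniform(-1.0, 1.0, int(SAMPLE_RATE * duration_s))

# A concert-A tone (440 Hz) mixed with noise at half its amplitude:
source = sine_wave(440.0, 1.0) + 0.5 * white_noise(1.0)
```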

The science of synthesizers may be complicated, but I want to focus on two concepts related to signal and noise: filters and amplifiers.

The goal of a filter is to block and withhold unwanted material, such as a particular range of sound. An equalizing filter, for example, allows you to lessen (or conversely boost) the amount of treble or bass you might hear through your stereo system. Some physical filters, such as a coffee filter, are quite refined and prevent small particles from passing through. In a weird way, a bulletproof vest is a crude sort of filter; its function is to block the passage of ammunition through to the wearer.
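And here, continuing that sketch under the same assumptions, is one of the simplest filters imaginable: a moving average, which passes the slow swing of a low tone while smoothing away much of the fast, random jitter of white noise. It is an illustration of the principle, not production audio code:

```python
import numpy as np

SAMPLE_RATE = 44100
t = np.linspace(0.0, 1.0, SAMPLE_RATE, endpoint=False)
tone = np.sin(2.0 * np.pi * 440.0 * t)                     # the signal
noisy = tone + 0.5 * np.random.uniform(-1.0, 1.0, t.size)  # signal + noise

def moving_average(samples: np.ndarray, window: int = 25) -> np.ndarray:
    """A crude low-pass filter: each output sample is the mean of its
    neighbors, so fast random fluctuations cancel while slow ones remain."""
    kernel = np.ones(window) / window
    return np.convolve(samples, kernel, mode="same")

filtered = moving_average(noisy)
# The 440 Hz tone survives largely intact; much of the high-frequency
# noise energy is averaged away, raising the signal-to-noise ratio.
```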

Throughout our lives, we develop multiple filters to help us process and understand the world around us. Otherwise, we might feel completely overwhelmed 24/7/365—as if some of us don’t feel that way already. Some of these filters come prepackaged—in academic or religious instruction, for example. We construct and employ others through our own experience, the lessons learned in life. Like the bulletproof vest, these filters shield and protect us; they give us the strength and courage to venture into potentially dangerous situations, both physically and mentally. They help us face up to monstrosities.

At this point, perhaps a line of poetry is in order:

The heart knows no filter.

If you’re like me, the initial news of the Newtown massacre came as a shock to the system. My first responses were all raw emotion: grief, rage, fear. Heartache, of course, because the heart, not being bulletproof, was wounded.

I might add disbelief to the list, but I can’t. For starters, it would be a lie; Columbine and Aurora and all the previous mass shootings should have already prepared us for this. On top of that, disbelief is a secondary, filtered response. The prefix “dis” serves as the filter, processing the initial state of belief. When we said we couldn’t believe the news of the shooting, what we were really saying was that we didn’t want to believe. The filter acted to block the truth, to deny its passage and place in our world.

Pressing further, I would argue that, due to the intensity of our emotional responses, many of us were rendered temporarily insane by news of the Newtown shooting. This would explain our disbelief, our denial of reality, our futile grasp at adjectives such as “unthinkable,” “unimaginable,” and “inconceivable.” By the very definition of insanity, we “lacked reasonable thought.”

And so began the noise, the procession of unreasonable responses to the tragedy. Despite a nearly total lack of evidence, analysts began offering possible motives and explanations for the violence, many of them tailored to fit their particular areas of expertise or concern. Conversations about gun control gave way to discussions of mental illness and health care, all based on conjecture. One viral essay, a blog post from the mother of a difficult and sometimes violent child, claimed such a deep level of understanding that she equated herself with the shooter’s mother. It was an absurd reduction based on an unknown and complex situation. Such extreme filtration of the facts resulted in one of the most highly illogical and self-centered responses to the shooting, and yet it continues to dominate some discussions today.

Many of these filtered responses were amplified via the media and social networking sites like Facebook and Twitter. Immediately after the event, many of us reached out to family members and friends via phone calls, e-mails, and status updates and expressed our deepest, most sincere thoughts and emotions. Subsequently, we began to pass along the thoughts and feelings of others, mostly those that corresponded closely to our own beliefs. In other words, we filtered out the rest. We began to process the signal and, in some of the more extreme cases, distorted and completely lost it.

I thank our President Barack Obama for providing a relatively measured and balanced response in the wake of this tragedy. In doing so, he reminded me of the kind of leader the country needs at a time like this, someone more like Mr. Spock than the Incredible Hulk. He did not indulge the vigilante superhero fantasies that preoccupy so much of the American mindset and perhaps contribute to this sort of violence in the first place. (I urge you to read Steven Pinker’s book The Better Angels of Our Nature: Why Violence Has Declined for a much more thorough exploration of this.)

As Nate Silver reminds us, logic is a tough, impartial judge. Like the rules of law, facts and statistics are indifferent to your or my personal opinions or beliefs. As filters go, logical analysis is our most reliable option, difficult though it may be. That’s why, in the wake of extreme tragedies like Newtown, we must continue to educate one another and ourselves in the facts of the matter. It’s also why we need to place a renewed emphasis on logic in the education of our children. No tool will be more useful to them in the course of their lives.

With that in mind and by way of example, I offer some attempts at a logical analysis of various reactions to the Newtown massacre below, keeping in mind the question: How do we keep something like this from ever happening again?

We need more, not fewer, weapons. Pro-gun people often claim that an armed teacher or principal (or mall shopkeeper, movie theater attendant, member of the clergy, bystander…all of us, really, I guess) could take down hostile trespassers and prevent deaths in situations like this. Studies have shown otherwise. A more fully armed populace might prevent deaths, but it wouldn’t prevent the situations themselves, so this is at best only a partial solution to the question above. It might also have the opposite effect of increasing the number of such incidents, since more weapons would be in circulation and potentially available to would-be shooters.

Let’s say we pass a law (and some politicians have proposed such laws) that mandates weapons for school officials and teachers (and, by extension, shopkeepers and movie theater attendants and so on and so on). By that reasoning, the willingness to carry a weapon and undergo extensive training in its use would become a prerequisite for anyone applying for these jobs. You would need to feel ready and able to shoot and kill another human being when called upon. Pacifists, whether on religious or just plain moral grounds, no longer need apply, at least not until all the discrimination lawsuits have been settled.

Also, if we truly want to prevent any future bloodshed by arming ourselves even more, we have to trust in our ability to act fast and first—probably without time to fully assess a situation, and certainly without time to attempt a diplomatic or talk-down resolution—which means responding with immediate gunfire to any threat, real or perceived. This is the rationale behind the “stand your ground” laws in some states, which, as we have seen in the case of Trayvon Martin, can sometimes result in the death of innocent children.

These pro-gun arguments also conveniently ignore (filter out) the fact that more and more of today’s mass shooters wear bulletproof vests or combat armor on their sprees. Unless we’ve all been trained to be reliable sharpshooters, our chances at taking down an armed maniac before he takes us down are therefore slim. Shooting at him becomes more like kicking at a hornet’s nest.

In a “good-versus-evil” shooting match, there’s also a high probability that innocent bystanders will be wounded and/or killed by “friendly fire,” especially in dark or smoke-filled spaces such as the Aurora movie theater. I suppose that in order for all of us to feel completely safe from potential gunfire, we should be wearing bulletproof armor at all times. Consequently, our country begins to sound less and less like “the land of the free and the home of the brave.”

We need to restore God in the classroom. For starters, this argument always seems to focus first on the victims, not the shooter himself. Secondly, it baits an argument about whose God we should all bow down to. Thirdly, this argument conveniently neglects (i.e., filters out) the large number of violent historical events, some described in gruesome detail in the Christian Bible, in which God supposedly commands others to kill on his behalf. These people, some of them respected figures in religious history, dutifully obeyed the voices they heard in their heads. Chew on this: What if the all-knowing God spoke to the Newtown gunman and told him, “There is one and only one way to wake up Americans to the need for gun control. If you go into a kindergarten and kill dozens of small children, I guarantee you that gun laws in your country will change and that thousands more lives will subsequently be saved.” Religious philosophers, discuss.

We need to prevent the mentally ill from owning guns. For starters, the buyer and owner of the guns used in the Newtown killing was the shooter’s mother, and she wasn’t mentally ill. By all accounts, her son wasn’t officially diagnosed as “mentally ill,” either. Therefore, this restriction would not have prevented the slaughter in Newtown.

Restrictions that focus on mental illnesses also seem to assume that such conditions are both evident and permanent, that they manifest themselves from birth and remain consistently visible throughout a person’s lifetime. Anyone with even a slight knowledge of psychology knows this is hogwash. Do you feel the same amount of mental stability each and every hour of each and every day? Did you feel 100% mentally stable when you first heard the news about the Newtown killing?

If we feel that we must prevent the “mentally ill” from owning guns, then we must establish an “acceptable” level of illness when it comes to owning guns—that level at which a person is at risk of harming either the self or another living creature. One could argue that nearly everyone feels capable of harming the self or another living creature at some point in his or her life, either during a case of severe depression or in a moment of stress-related rage. With that in mind, the “no guns for the mentally ill” proposal could mean, in its most preventative application, no guns for anyone.

Speaking of acceptable levels of mental illness, let’s consider all those people who heard about Newtown and felt compelled to rush out to buy more guns. Some felt sure that they needed these weapons to protect themselves and/or their loved ones from future incidents. Others were afraid that their personal gun rights were about to be stripped away, so it was best to stock up now and hope for a grandfather clause that would protect their stash. Such extreme and self-centered responses to this national tragedy suggest a form of delusional paranoia, itself a mental illness, yet these are the supposedly reasonable folks intent on protecting us all from the irrational gunmen.

Here’s another inconvenient fact related to mental health: In many gun-related incidents (though not, it seems, Newtown), a major factor in the shooting is the consumption of alcohol. If we expand our scope to consider all alcohol-related fatalities, we find many, many more deaths of innocent people and children caused by drunk drivers each year. Grouping these deaths together sends a clear signal; spacing them apart renders them more like noise. That may explain why we see few people either suggesting or rallying around a proposal to ban alcohol in order to prevent deaths, either by gun or by car. Another argument for another time, perhaps.

Stricter background checks will prevent these killings. This is an argument of hindsight, which, as we all know, is 20/20. The purchase of a gun will always precede a criminal’s first armed robbery, rendering a background check ineffective. What we really need is a foreground check. Absent psychic powers of prognostication or the development of software that aggregates all your personal data and predicts whether or not you’ll become a mass murderer, we’re still a few miles shy of the finish line with this proposed solution.

Ban assault weapons. This one makes sense, though once again it’s a partial solution. Even if it can’t stop shooters from doing harm, it can at least limit the amount of damage done. In other words, it saves lives.

Think of it this way: an assault weapon is an amplified version of a single-shot gun. It makes a weak signal stronger. It allows one lonely, unstable voice to send out a disproportionately loud message. Even though that message may be unclear or irrational, it echoes throughout the culture, filtered and amplified by our own voices as we try to make sense of it, voices that are in turn amplified by the many forms of media at our disposal.

What we’re left with is chaos and cacophony. In the world of music, it’s called feedback.

Put more simply, it’s noise.

Unimaginable

The 9/11 Memorial always reminded me of a double Bat-signal rising up above Gotham City. (Photo from United States Government Works, public domain)

New York City, September 11, 2001.

After an invigorating morning swim and a walk to the office under clear blue skies, I’m selecting a muffin to go with my morning coffee when a cafeteria worker rushes in from the dining area. Nearly hyperventilating, he tells everyone that a plane has just crashed into one of the World Trade Towers.

We all hustle toward the windows and look to the south. Dense smoke billows out from the irregular shape now punched into the side of the north tower. We grasp at explanations. Our collective imaginations reach this consensus: the pilot of a small plane must have suffered a heart attack and lost control of his aircraft.

Shortly thereafter, in my own department eleven stories higher, I join my coworkers along the south-facing bank of windows. Speculation continues even as a second aircraft appears to the west, coming in low across the Hudson River at high velocity. We theorize that it must be a press plane rushing to the scene, but it is coming in too fast and too close to the building, and, seconds later, it slams into the south tower. A fireball erupts as it crashes through the building. The visual force of it shoves us back from the window.

After our many attempts to grasp what has been happening, we now face a more likely yet discomforting fact: We are under attack.

Throughout the morning, as we watch television broadcasts, monitor the Internet, contact friends and relatives, and eventually recoil in horror as the towers collapse one after the other, two dominant yet contradictory themes emerge:

“This is unimaginable.”

“It looks like a scene from a movie.”

Later that afternoon, in our attempts to navigate home through the paralyzed city, my husband and I crowd onto a bus heading uptown, away from the disaster. People sit and stand in silence, some marked as close witnesses by dust, soot, and ash. If we speak at all, we speak in broken whispers. Mostly, we settle into a state of numbness that will haunt us for days, weeks, generations.

When the bus stops to let passengers on and off, I look back toward the plumes of smoke rising above Manhattan. Already I’m exhausted by a sight that will dominate the view from my office window for a period beyond all expectation. I look down at the street, where I see a crumpled and dusty costume being run over repeatedly by buses, cabs, fire trucks, and police cars. I recognize the red-and-blue paneling and its black web motif. It is a Spiderman suit.

Where were the superheroes today? I ask myself.

The answer is obvious. Superheroes are make-believe. They’re fictions, fantasies. Yet somehow, even in the context of such stark realities, they seem entirely imaginable.

• • •

This is unimaginable.

It looks like a scene from a movie.

These comments echoed once again throughout the coverage of the shooting at the recent “Batman” movie premiere in Aurora, Colorado. In this instance, however, the comic-book costume in question was inhabited, come to life.

James Holmes, his hair dyed red like the villain the Joker, a gas mask concealing whatever crazed expression might have been on his face, his body encased in bulletproof armor, fired randomly and repeatedly into a smoke-filled theater and struck 70 people.

In Aurora, the police were quick to apprehend Holmes. With a mixture of luck and skill, they were also able to prevent any further damage that might have resulted from entering his booby-trapped apartment.

Even after capture, Holmes acted like someone “playing a role,” according to one police officer. In other words, he continued to engage in make-believe, pretending to be someone he wasn’t. He was acting out a fantasy.

He was imagining things.

• • •

At an early age, I decided that I could and would make a career out of imagining things. I read the fiction of Ray Bradbury, Italo Calvino, and Donald Barthelme with both appreciation and envy. In order to achieve true and lasting greatness, I would have to dream bigger things than my literary idols and spin from those dreams far greater works of fiction. To do that, I would have to create characters, settings, and situations that no one else had thought of before.

In short, I would have to imagine the unimaginable.

And so, after years spent crafting short stories, novels, and poems, the events of 9/11 left me feeling uneasy and strangely complicit. Hadn’t I created similarly sinister plots for my own antagonists? Truth be told, I saw nothing “unimaginable” about flying an airplane into a building as an act of terrorism. Quite the opposite: It seemed rather simplistic.

In the days and weeks after the attack, I grew annoyed when people in both the media and our government used the word “unimaginable” to describe the events. It seemed a sloppy description, as much a dodge as a denial.

What I found truly unimaginable was that the people in charge of defending against such things hadn’t done their jobs and anticipated this scenario. (In fact, some intelligence agents had, but few people believed enough in their reports to act on them.) I wanted to suggest that the soon-to-be-formed Department of Homeland Security should employ fiction writers like Tom Clancy and Robert Ludlum—any writers, really—to help them out a bit in the imagination department.

The Aurora shooting incident, however, represents the flip side of this argument. Here, a work of imaginative fiction inspired the assailant. In his twisted state of delusion, James Holmes latched onto the Joker, a character created by comic-book writers in the 1940s, and erased the border between fantasy and reality. He obsessed over the Batman mythology but followed the wrong role model.

Authorities made easy comparisons between Holmes’s behavior and Heath Ledger’s portrayal of the Joker in the film “The Dark Knight,” which earned him a posthumous Academy Award. Critics praised the depth of Ledger’s commitment to the role. Sources on the movie set expressed their astonishment at the intensity of Ledger’s investment in the character. In an interview at the time, Ledger himself described the role as “fun,” adding: “There are no real boundaries to what the Joker would say or do. Nothing intimidates him, and everything is a big joke.” (Empire magazine, January 2008)

“Well, I warned him.” According to one report (New York Daily News, January 24, 2008), this was Jack Nicholson’s response on learning about Heath Ledger’s death by overdose a short while later. (Nicholson had taken on the Joker role in Tim Burton’s 1989 film “Batman.”) Nicholson knew the emotional and psychological toll involved in playing the Joker, and Ledger’s subsequent health issues suggested that he had been haunted by the role well after the film had wrapped.

Ledger’s death raises uncomfortable questions for those in the creative arts. Are there some places in the imagination that we shouldn’t go? And even if there are, how do we prevent ourselves from going there? Just because we may refuse to imagine something, does that make it unimaginable?

• • •

“With great power there must also come–great responsibility!”

This insight, currently attributed to Stan Lee and the Spiderman franchise but thought to originate with the French thinker Voltaire, provides a guiding light whenever I have my doubts about writing and the imaginative impulse. It reminds me that any ability, like any tool, can be used for good or bad, to create or destroy. A person can use a hammer to build a home or break a window. Radioactive material can be used to destroy cancerous cells or bring an empire to its knees.

With so much potentially at stake, it becomes tempting to sit out the debates that we need to have—with each other and with ourselves—in the wake of events such as 9/11 and Aurora and countless other acts of violence. In postponing action—or, in some cases, shrugging and offering up a feeble, defeatist “Well, what can you do?” attitude—many of our appointed leaders deny their own power and, as a result, shirk their responsibilities. They fail us, plain and simple.

In the resulting void, opportunists rise to the debate like sharks to bloody chum. Some amplify their pre-existing fears and hatreds, which in turn can incite other acts of violence. (Witness the recent shooting in Wisconsin.) Others cultivate a false sense of security or superiority with cynicism or trendy ironic detachment, a kind of toxic snark. Neither suffices for the level of serious discourse required. Both, in their own ways, risk being irresponsible.

The day before the Aurora incident, I began work on a fictional story about guns and violence inspired by Ishmael Beah’s book A Long Way Gone: Memoirs of a Boy Soldier. The day after the shooting, my progress stalled. I could imagine what might happen next in the story, but part of me simply didn’t want to anymore. The subject felt too overwhelming, and my efforts to address it seemed insignificant. Like young Peter Parker after his uncle’s murder in the Spiderman origin story, I questioned my own powers and abilities. That superhero suit? An empty costume after all.

Slowly, the paralysis subsided. Silence was no longer an option, nor was surrendering to self-doubt or cynicism. I began to write again, and, in doing so, attempted to create the empathic connection that explores all sides of an issue, not just the easy or most comfortable side.

I could not expect a superhero to swoop in and write that next difficult scene for me, just as we cannot leave our hopes for a more just and peaceful world in the hands of some mythical caped crusader. In the real world, our daily challenges remain ours and ours alone. Luckily, we have the power to respond to those challenges in creative and positive ways. If we can imagine ways to channel that power responsibly, then together we can save the world.

Dark Feed

In honor of Earth Day and National Poetry Month, here’s a change of pace, a poem about where we might find ourselves if the 24/7 news cycle shut down.

Dark Feed

A snap of light, a blight of static,
and the dimming plasma screen stares back
at them dumbly in their darkened den.

Perhaps some small animal has crawled
into the basement and gnawed through the cable
to render them disconnected this early spring night.
TV, wifi, phone: all dead. How will they deal
with this deprivation, this break in their continuous
loop of news and commentary?

They sit for a moment in silence, worry about
the freshness of batteries in flashlights, the locations
of candles and where they might find
a book of matches.

She comments on the recent string
of unseasonable storms.
He reminds her of last week’s solar flares.
Together, they circumnavigate what they falsely call
unimaginables: the terrorist plots and hacker conspiracies,
the evolving world they struggle to ignore.

In the dark outside, the crickets rub
their legs together and maintain their trill.
From that prompt she recalls
a bit of country wisdom
rediscovered in an almanac,
a formula to determine the temperature outside.

They count and do the math in their heads:
chirps per 14 seconds plus 40—
75, 75, 76, 77.
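A side note for the curious: the almanac formula in the closing stanza is a folk version of Dolbear’s law, and its arithmetic holds up. Here is a minimal sketch in Python, with the chirp counts simply inferred from the poem’s final line:

```python
def cricket_temp_f(chirps_in_14_seconds: int) -> int:
    """Folk version of Dolbear's law: chirps counted over 14 seconds,
    plus 40, approximates the air temperature in degrees Fahrenheit."""
    return chirps_in_14_seconds + 40

# Counts implied by the poem's closing line:
for chirps in (35, 35, 36, 37):
    print(cricket_temp_f(chirps))  # -> 75, 75, 76, 77
```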

Book Value

When I was young, I borrowed most of the books I read from the local library. My public high school provided dog-eared copies of the classics in my college-prep literature classes. To further satisfy my growing reading appetite, I would ride my bicycle downtown to a local hardware store that kept a few ramshackle shelves of coverless remaindered paperbacks near the front door. For forty cents each (or three for a buck), I could pedal home with a combination of Mark Twain’s Adventures of Huckleberry Finn, Ray Bradbury’s Something Wicked This Way Comes, and the not-quite-latest collection of greatest Mad magazine gags (stupid answers to stupid questions, anyone?).

To this day, I continue to haunt second-hand bookshops, yard sales, and the remainder bins at big-box stores in search of bargains. At the end of the semiannual book sale at our local library, I came home with two brown paper shopping bags full of wonderful finds, including the collected poems of Stanley Kunitz, Anne Lamott’s Traveling Mercies, Chang-Rae Lee’s The Surrendered, Tab Hunter’s Confidential, and a couple of titles by Bill O’Reilly (stupid answers to stupid questions, anyone?). I am not exaggerating when I tell you that the bookcases in which those books now reside cost substantially more than the books themselves.

In each one of the instances noted above, my reading experience provided absolutely no compensation to the authors of the books themselves.

This is why I wince a little every time fellow lovers of literature rhapsodize about the beauty and allure of used bookshops. I cringe a bit when poets, novelists, and journalists extol the value and necessity of public libraries. I wince and cringe with the heart of a co-conspirator, for I, too, value and support these institutions. And yet by patronizing them, readers rob the arts to a greater extent than they ever would by purchasing from the Amazons and Barnes & Nobles of the world. Access to free, remaindered, or used books may boost readership, but each such book deprives its writer of compensation for the work. Seen another way, these bargains literally lower the value of the written word.

I learned at an early age that writers should expect little, if anything, in the way of recognition or remuneration for their efforts. Mostly one should expect rejection—either in the form of “thank you, but not for us” letters from potential publishers or “nice try, but not for me” comments from readers. Many literary writers still subscribe to the somewhat romantic “starving artist” archetype, envisioning themselves in tiny, unheated lofts subsisting entirely on grilled cheese sandwiches, flat soda, and watered-down tomato soup as they churn out page after page of unappreciated brilliance.

One might expect that things would have improved over the past few decades, but they haven’t, at least not in the publishing industry. In fact, the situation has become even more psychologically unhealthy. These days, many magazines and journals require hopeful writers to pay a “reading fee” for the opportunity to be considered for publication. In a number of instances, specious contests have replaced the normal submission process. The literary world has become more like a lottery in more ways than one. What was once called the “table of contents” in some books and journals might now be better referred to as a “list of winners.”

Editors and publishers will tell you that reading fees and contests are necessary evils in order to meet their financial demands. After all, there are bills to pay: editorial staff, office assistants, designers, printers, distributors, and so on. On top of that, one must pay for electricity, water, heat, telephone, and so on.

Now consider that for many literary journals, the person who provides the actual content, the writer, is often paid very little, if at all. Despite having potentially invested in college tuition, workshop or residency costs, and all those reading/entry fees, publication rarely means “hitting the jackpot.” A poem that took fifteen to twenty hours to write might net ten dollars. A historical piece that required months of research, writing, and revising might reward you with three or four copies of the obscure journal that accepted it.

With that in mind, you might wonder why anyone in his or her right mind would pursue a literary career in the first place. Perhaps this explains why so many writers place their full and earnest faith in the metaphorical value of writing. In this scenario, writers are akin to mystics who channel universal truths onto the page. This probably explains why some writers’ conferences can seem like cults to non-bookish outsiders—and why some attendees can come across as desperate martyrs. I’m reminded of a lighthearted quip a fellow writer once made while sharing his latest piece: “I suffer for my art; now you can suffer, too.”

I worry that this kind of attitude—this self-indulgent elevation of “literature”—further devalues the entire creative enterprise. It fosters a disconnect between the writer on high and the lowly reader, which in turn leads to dwindling book sales and meager showings at readings. Many of us who remain committed to the written word nonetheless felt a twinge of recognition and understanding when someone posted on Facebook: “April is National Poetry Month, that time of year when we should all force ourselves to attend dozens of readings and pretend to be enjoying ourselves.”

So you see, in some ways I’m still a terrible supporter of the arts. I read the latest articles about the financial struggles in the publishing world and can’t escape the sinking feeling that, in some ways, we brought this on ourselves. For decades, writers have accepted disproportionate compensation for their work without much complaint. As readers, we became more and more accustomed to paying little—if anything—for the works we’ve consumed. And in a bizarre twist of hypocrisy, we voiced our strongest support for those institutions that hurt writers the most. With that in mind, the crumbling nature of the publishing world today shouldn’t surprise anyone.

Still, I hold onto a shred of optimism about the future. Here are a few final thoughts and suggestions, and I welcome others:

  • Don’t be deluded—and I mean that literally, not figuratively. Stop buying into harmful myths like the “starving artist” I mentioned earlier. Demand that writers be paid for their work, and if it has to be a small amount, at least ensure that it’s not an insulting or demeaning amount. If a magazine or journal can come up with a business plan that covers the costs of printing and production, it should also account for the value of the content itself—the author’s compensation.
  • Do what you can to help publishers meet these increased financial challenges by subscribing to those magazines and journals whose work you respect and admire. (I know, I know: Once again, it’s up to the writers to cough up more money in the hopes of supporting their own potential paychecks. Maybe you can get friends and family members to subscribe as well, or convince some rich benefactor to bequeath millions to the journal in question.)
  • Support your local library not only by checking out books but by getting out your checkbook. That way they can purchase more books and journal subscriptions.
  • Realize that free public readings aren’t really “free”; support both the presenter and the host bookstore by purchasing a book (or two).
  • Enjoy your bargain book from the remainder bin, but support the writer with a full-price purchase of another title if you liked his or her work.
  • Most of all, value what you do, whether you read or write. If literature does indeed have worth, then we should all be ready and willing to pay for it.

Stronger

Preface

No, this piece is not about Kelly Clarkson’s current Top 40 hit, “Stronger,” though I’m gladdened by its success, especially because of Clarkson’s popularity among gay, lesbian, bisexual, transgendered, and questioning youth. This essay is about my father and me, how our relationship changed over time, and about the prevailing power of love.

But if that all sounds rather sugary and sweet, then maybe we should go back to that Kelly Clarkson song for a moment and track down the source of its anthemic chorus, “What doesn’t kill you makes you stronger.” Kids and pop culture connoisseurs of America, welcome to the world of Friedrich Nietzsche. Yes, the German philosopher who introduced us to the dark concept of “nihilism” provided the original inspiration for Clarkson’s current hit with this quotation: “That which does not kill us makes us stronger.” Before Nietzsche went insane toward the end of his life, he penned some of the most revolutionary remarks in modern philosophy, including this one: “To live is to suffer, to survive is to find some meaning in the suffering.”

I’ve been meditating on those two statements after reading Sabrina Rubin Erdely’s astonishing “School of Hate: One Town’s War on Gay Teens” in Rolling Stone Issue 1150 (Feb. 16, 2012; link below), continuing to think about the Rutgers suicide (referenced in my last blog), and struggling to come to terms with the murder of Trayvon Martin in Florida. I was also inspired by New York Times columnist Frank Bruni’s op-ed recognition of the founding of ACT UP by Larry Kramer 25 years ago this month (“The Living After the Dying”; link below). During the rewriting of this piece, another of Bruni’s columns, “Rethinking His Religion,” likewise moved me deeply, as did Maureen Dowd’s piece on fathers and sons, “How Oedipus Wrecks.” (Links to both are also posted below.)

I mention these sources to provide context in advance of sharing something that is quite deeply personal. Only a handful of people have heard some of these stories, and I continue to wrestle with their influence and consequences today. I withheld some details while my parents were alive but find that now, in grieving for them, pathways to the past that were once blocked off are now open for travel once again.

Suicide, murder, death by disease: How do we confront and survive such horrors in the world, especially those that are motivated and/or perpetuated by forces that are sometimes close to home, such as racism and homophobia? For me, the answer has always been clear. One at a time, we share our stories and experiences, both real and imagined. We listen, and then we respond. Together, we learn about lives different from our own—through fiction, poetry, music, painting, and every other creative endeavor. In developing our senses of empathy and understanding, we do what we can to make the world a better place.

This, then, is my personal response to many of the tragedies we have heard about in the news recently. It is a true story of survival and redemption, an exploration of how one person (me) found the strength to endure in such difficult and often dangerous times.

"Encounter/Exposure" — Multiple-exposure self-portrait taken back in the day.

What Didn’t Kill Me

“They should all be rounded up, taken into a field, and shot.”

That was my father back in the late ’70s, responding to a news story about Dade County, Florida, and the ordinance it had passed prohibiting discrimination on the basis of sexual orientation. Citrus Queen Anita Bryant was no doubt on the television stirring up homophobic sentiment among the growing crowds who had come to hear her rally against the ordinance under the guise of the innocent-sounding group “Save Our Children.” The homosexuals, Bryant shouted into her microphone, must be stopped. My father obviously agreed.

At the time, I didn’t fully understand what homosexuality was, even though I had some inkling that it was manifesting itself within me. “Fag,” “gay,” and “queer” were insults regularly hurled at me in school, not because I displayed any overt sexual interest in men, but because I was younger, weaker, and smarter than most of my peers. “Fag,” “gay,” and “queer” were also insults regularly sneered by Archie Bunker on the classic sitcom “All in the Family.” Archie was something of a hero for my father, and to this day, I doubt that he fully understood that the character was intended as a parody, his prejudices a cause for ridicule. When I would come home crying after occasional fights and beatings, my father would sternly advise me to “fight back like a real man.” It was clear to me that, in his eyes at least, I was at risk of becoming something “other” than a man.

I worked on developing a tough outer shell during my school years, mostly by immersing myself in music and shutting out the rest of the world. A quick look at my cassette mixes of the period reveals an obsession with punk rock and goth music. The glittering gaiety of disco held no interest or sway with me. I wanted rhythms and melodies that were darker and more complex, more in keeping with my shaded heart. I favored love songs in which the pronouns and protagonists were gender-neutral.

In my blue-collar hometown, there were no “gay, lesbian, bisexual, transgendered, and questioning youth” clubs or alliances in school. (N.B.: I’ll be shortening that list of identifiers to the commonly used “GLBTQ” throughout this entry, with apologies to those who question the arrangement of those letters.) There was no “welcoming congregation” in any of the local churches. I knew only one openly gay person, a disco-obsessed coworker with crude mannerisms who often showed up at work either drunk or stoned. There was no Web or Internet to provide any additional information about homosexuality or to engage in any social networking. There was mostly the television, with its occasional news story about gay-bashing and raids on “deviant sex clubs,” and my father’s vocal support of killing all the queers that the cops rounded up. Though I suspected that she disagreed, my mother remained silent.

I would often retreat to the basement or my bedroom, playing strange and obscure albums I had heard and read about during my job at a local music store. These would become the soundtrack to my adolescent dialogues with God, the ones in which I prayed for a change in my character while simultaneously praying for the cute guy in calculus class to notice me, the ones in which I would ask God why He had made me the way I was if my existence supposedly offended Him. If, as my mother had taught me, God was the source and sum of all human love, why was my love not a valid part of that equation?

One night, during my senior year, I concluded that there was no possible reconciliation between who I was and who I should be. Despite all the high grades in school and my teachers’ reports of exemplary conduct, I was more evil than good. I was a blot in God’s perfect world, an aberration, a mistake. Maybe, I thought, my father was right. Maybe people like me were better off dead.

And so, on a sudden impulse, I reached for the nearest weapon I could find—in this case, a dagger hanging on my bedroom wall. To be honest, it was a decorative dagger, purchased during a school-sponsored trip to Spain a year earlier. Even so, it was metal and its point was sharp. It would certainly do the trick. Without any second thoughts, I closed my eyes, held the dagger out with both hands, and plunged it toward my heart.

The heels of my hands punched into my chest. There was pain, but not the deep stab I’d expected. It felt more like a tiny bite, a bee-sting.

I looked down to see the blade bent flat and harmless against my chest. The tip had drawn blood, but left little more than a nick in the skin.

Inside, I heard a voice: “No, not this. Not you.” God was speaking directly to me. In that instant, I had found a faith deeper and truer than any I had previously known. My beliefs no longer relied on old-world hand-me-downs from parents and priests. I had just witnessed an actual miracle, a real-life experience imbued with divine grace and meaning. (Some years later, a college physician would explain the “miracle” in more clinical terms: I have an unusual bone spur located directly beneath my sternum, an abnormality that had been either absent or at least undetected throughout my entire life. Make of that what you will.)

To this day, I don’t know who or what had prevented my suicide—and yes, let’s be honest; that’s the word for what I had attempted. Even so, whatever happened that day left me with a renewed awe and reverence for this extraordinary gift of life.

In other words, it made me stronger.

Getting Stronger

Fast-forward to the University of Montana in the early ’90s. A lesbian couple and I had been invited into a sociology classroom to discuss whether GLBTQ people should be allowed to adopt children. At the time, I was a graduate student in the English department, an “out” gay man to most of my friends and colleagues yet not to the students in the composition classes I taught.

A woman near the back raised her hand. “I don’t have any problem with you all being gay and what not,” she said. “But I just don’t think it’s fair to do that to the kids.” She paused, perhaps waiting for a murmur of support from her classmates. “I mean, the other kids would just tease them and make fun of them. It just wouldn’t be fair to the children.”

Ah yes; the children. Recollections of Anita Bryant sparked a prickly heat in my chest, but I tried to remain cool and calm on the outside. In the most even tone I could muster, I asked the woman, “Would that be your kids doing the teasing? Because I fail to see how someone else’s bad parenting skills should suddenly become my problem and prevent me from raising my own children with the decent values I intend to teach them.”

The woman squirmed in her seat. I could imagine her offense at my remarks: This class was supposed to be about queer people and their homosexuality, not straight people and their homophobia. Just as I was tired of straight people talking like experts about homosexuality, I was also tired of being a homosexual asked to talk about homophobia. Why did the task of dealing with gay-bashing—and other examples of bias-based violence, to some extent—always seem to get shoved onto the backs of the targets of such violence? Hadn’t we been burdened with enough in our lives?

Yes, we had. But then, as I witnessed time after time among a number of GLBTQ friends and allies, we were that much stronger because of it. Through mutual support and sheer determination, we had weathered multiple storms and faced the next ones head-on. We had the courage of our convictions and the strength of our beliefs—convictions and beliefs that had been tested repeatedly, with all of that resulting in a hard-won faith in our God-given characters and abilities.

I recalled that sociology-class encounter as I read a Rolling Stone article about a recent string of teen suicides in the Anoka-Hennepin school district in Minnesota. The story received national attention primarily because of the district’s representative, former presidential candidate Michele Bachmann, and her Bryant-like crusade against homosexuality. She and her cohorts had promoted policies within the school district that prohibited any mention or discussion of homosexuality, even when it was crucial to addressing any instances of bullying or harassment. Sensitive GLBTQ students understood the underlying message: homosexuality was a sin, an evil in society, a sickness so despicable that it should not even be discussed, in public or in private. In a twisted logic reminiscent of rape cases in which the victims are blamed for provoking the attack, these students were advised to tone down their own behaviors and just be more careful around bullies. One victim noted that while teachers reprimanded students who used racial slurs, homophobic insults went unchallenged.

Recent protests around the Anoka-Hennepin suicides, the Trayvon Martin murder, and other tragedies show that many people in society—perhaps even the majority, at long last—are sick of suffering such losses in silence. In calling for an end to toxic mindsets like homophobia and racism, justice-seeking people are now standing their ground and demanding that the law protect them from real threats such as armed vigilantes and over-zealous politicians. Community organizers and activists continue to work to create a safer, saner, and more supportive world for the next generation—not one in which ignorance masquerades as authority in order to diminish or extinguish the lives of others.

For years after coming out as a gay man, I had been working to achieve similar peaceful goals. In addition to organizing support groups for GLBTQ youth, I worked with the leaders of church congregations to make them more welcoming places. I wrote articles and edited a statewide GLBT newspaper. (“Q” wasn’t in the masthead at the time, and the arrangement of the other terms was always a hot-button topic.)

As a balance to activism, I engaged in some subversion as well. Together with a group of gay and lesbian cohorts, I infiltrated a local chapter meeting of noted homophobe Phyllis Schlafly’s Eagle Forum. At the end of that event, the chairperson came by to thank us for attending. “Your presence here tonight has really opened my eyes,” she said. I noted that all of us had recited the Pledge of Allegiance together at the start of the meeting, and that I for one proudly believed each and every word. “Liberty and justice for all,” I repeated, placing extra emphasis on “for all.” She nodded; she understood. Despite our initial fears of one another, we could connect after all.

In the weeks leading up to my father’s death last June, I told him many of these stories that I am telling you now. (I did not, however, tell him about the suicide attempt, nor about how his own behavior had played a major role in that.) I had been spending as much time with him as I could, trying my best to tolerate his constant cigarette smoking even as cancer, heart disease, and numerous other ailments waged war within his body.

It came as no surprise to me that in addition to football and baseball, some of his favorite television programs were courtroom reality shows: “Judge Judy,” “Judge Joe Brown,” “Judge Jeanine Pirro.” Every so often, a gay or lesbian plaintiff or defendant would take the stand. Their appearances on these shows seemed quite normal and never elicited comment, positive or negative, from my father.

But then, just days before his death, my father delivered one final judgment.

We had just spent hours talking about love, sharing stories about the people who had mattered most to us in our lives while a Red Sox rebroadcast droned on in the background. After confessing our secret boyhood crushes, he glanced at the clock. “My God,” he said. “It’s way past midnight!”

Instead of wrapping up the conversation and saying our goodnights, my father raised his hand to hold my attention. “One more thing,” he said. He looked down at the floor, then back up at me. “You know, you’re a lot stronger than I ever gave you credit for. Don’t you ever let anyone take that away from you.”

His words came as a shock. I had never expected them. In fact, I had long ago convinced myself that I did not need them. Because of that, he and I had often kept our distance—both figuratively and literally. And yet, at long last, my father had looked deep inside me and approved of what was there.

This growing approval had been evident over many visits made by my husband and me. Together, the three of us had shared all sorts of stories, from my dad’s wartime adventures in the Philippines to our relatively bucolic tales of life in Vermont. His happiness for us was genuine, especially when he heard that our friends and neighbors had fully accepted us into their small-town community.

No doubt, my father had feared it would be otherwise. He knew firsthand how strong a force prejudice and homophobia could be in the world. And yet now, at last, he knew another truth: a person could be stronger than that. The fact that I could be such a strong person is, I now realize—despite its tough and tortured origins—my father’s legacy to me.

Epilogue: “It Gets Better”

In response to the numbers of GLBTQ youth who are harassed and bullied in schools around the country, the writer Dan Savage recently utilized the power of YouTube to create inspirational videos in which speakers tell stories focused on the theme “It Gets Better.”

Though the story I have told above might sound like something you’d find there, I have to express my reservations about the project. I don’t mean to say that the thousands of stories told on the site aren’t helpful or valuable. They are, and I applaud each and every person who has added his or her positive voice to the site.

Even so, the phrase “It gets better” sounds so, well, weak to me. As a teenager, I was savvy to simplistic platitudes like that. They sounded like so much wishful thinking, Hallmark cards for the marginalized. What is the pronoun “it” referring to, anyway? And what kind of action verb is “get”?

I know, I know…the writer/editor in me shouldn’t be so fussy when lives are at stake.

But here’s the deal: Sometimes it doesn’t get better. Sometimes the boy doesn’t get the boy, the girls don’t get the adoption approval, or the operation doesn’t turn out the way someone had hoped. Hearing the words “it gets better” just makes it sound like we should all wait patiently and ride out the storm. It’s a passive construction, and, as the stories above should illustrate, I’ve had it with passive. In crisis situations, I crave more control of my life, not less.

So, in addition to (but not in place of) “It Gets Better,” I offer these words of support: “You Grow Stronger.” And if you don’t believe me, just listen to that Kelly Clarkson song again. Or read Friedrich Nietzsche.

Better yet, I am proud to report, you can take my father’s word for it.

UPDATE:

Since the publication of the original article in Rolling Stone, the Anoka-Hennepin School District finally developed a better (though not perfect) policy in order to “promote a respectful learning environment” in its schools. You can read about the new policy here in their newsletter:

http://www.anoka.k12.mn.us/education/components/whatsnew/default.php?sectiondetailid=233410&itemID=45742

LINKS

“School of Hate” in Rolling Stone link:

http://www.rollingstone.com/politics/news/one-towns-war-on-gay-teens-20120202

Frank Bruni links:

“The Living After the Dying”

http://www.nytimes.com/2012/03/18/opinion/sunday/bruni-the-aids-warriors-legacy.html?_r=1&scp=2&sq=frank%20bruni%20ACT%20UP&st=cse

“Rethinking His Religion”:

http://www.nytimes.com/2012/03/25/opinion/sunday/bruni-a-catholic-classmate-rethinks-his-religion.html?_r=1&hp

Maureen Dowd: “How Oedipus Wrecks” link:

http://www.nytimes.com/2012/03/25/opinion/sunday/dowd-how-oedipus-wrecks.html?ref=opinion

“It Gets Better” link:

http://www.itgetsbetter.org/

“The Normal Worldview”

Print by Catherine Rondthaler. More of her stunning work can be seen at http://catherinerondthaler.com/_MG_6169.html.

This week, two separate news stories caught my attention along with the rest of the country’s: The retraction by “This American Life” of Mike Daisey’s fabricated account of Chinese laborers making Apple products, and the conviction of Rutgers student Dharun Ravi in relation to the suicide of his gay roommate, Tyler Clementi. Both stories, it seems, ignited quick responses in those who heard/read/saw them, and yet both stories have deep, deep backstories that complicate them well beyond the comfort zones of most casual media consumers. Each has the capacity to be told as an epic novel, and yet many people have formed their opinions—some quite strong and unshakable—based on the equivalent of a short story or a Wikipedia synopsis.

(At the risk of hypocrisy, I feel obliged to provide a brief overview here for those who are unfamiliar with the cases. If you are familiar with them, my apologies; sorry to interrupt; skip ahead to the next paragraph, please. Links also appear at the bottom of this entry for those seeking more information. 1) Mike Daisey, a theatrical monologuist, had appeared on Ira Glass’s public radio show “This American Life” to offer his supposedly first-hand accounts of the terrible working conditions at a massive Chinese factory that produced Apple’s iPad. After the show received its strongest listenership to date, the producers discovered that many details in Daisey’s account were false, at which time they retracted the story. They devoted a follow-up episode to the incident, during which Daisey offered an awkward defense of his position. 2) In the Rutgers case, Ravi was convicted of bias-motivated invasion of privacy—a hate crime, in other words—after using a Webcam to spy on his roommate, Clementi, during a romantic encounter and sharing some of the images and his thoughts with friends. Clementi later jumped from the George Washington Bridge, though his reasons remain either unknown or undisclosed. The suicide note he left behind has not, as of this writing, been released. In subsequent accounts, details of the story were altered to portray Ravi as a homophobic bully who served as the catalyst for Clementi’s suicide.)

Without question, these two stories are charged. They touch nerves, raise passions, and spark debate. When you look closely, there is no easy way to “read” either one, and yet their appeal seems to stem from just that: quick and often preconceived judgments. The stories have been appropriated and exploited in many ways for a number of agendas: corporate negligence, workers’ rights, cyber-bullying, gay harassment. In some ways, they have been revised and rewritten to fit theme-based narratives, and somewhere along the way, for whatever reasons, the facts gave way to fiction.

At the same time, two essays about writing recently appeared in The New York Times: Annie Murphy Paul’s “Your Brain on Fiction” and Jhumpa Lahiri’s “My Life’s Sentences” (links below). The first draws its inspiration from recent studies of the human brain and how it responds to figurative language (not necessarily fiction, despite the emphasis in the title); the second is a more rhapsodic account of the writing process by a Pulitzer Prize-winning author. The first seeks insight and evidence gathered through scientific research; the second finds them in mystical experience relayed through a series of often contradictory metaphors (“Sentences are the bricks as well as the mortar, the motor as well as the fuel. They are the cells, the individual stitches. Their nature is at once solitary and social.”). Both tend toward an elevation of creative writing as something beyond the realm of the ordinary or commonplace, a glorification that was subsequently approved and forwarded by my many literary Facebook friends. Both were also listed among the top ten e-mailed articles on the day of and the day after their publication.

To provide some context, I’ve been thinking about these four examples (Daisey/Ravi/Paul/Lahiri) as I read through hundreds of fiction and nonfiction manuscripts submitted as applications to the Bread Loaf Writers Conference, a task I’ve undertaken for well over a decade now. Much of this writing comes from published authors; some are current teachers of creative writing at prestigious colleges. There is always an implicit tension between the fiction and nonfiction camps that can best be summed up in two competing bits of wisdom. The first, penned by the poet Lord Byron in Don Juan, states, “’Tis strange—but true; for truth is always strange; Stranger than fiction.” The second, included in a moment of critical reflection within Dostoevsky’s novel The Idiot, muses, “Authors, as a rule, attempt to select and portray types rarely met with in their entirety, but these types are nevertheless more real than real life itself.”

The two editorial pieces in The New York Times would seem to support these statements. I have also heard both statements used in reference to accounts of the Daisey and Rutgers stories. In both instances, political themes apparently trumped a proper respect for the facts of the matter.

Truth can be stranger than fiction. Fiction can be more real than real life itself. These two statements create something of a logical conundrum. Can truth be stranger than truth itself? No wonder the fake television talk-show host Stephen Colbert felt compelled to popularize the concept of “truthiness.” We should remember, however, that each of these thoughts and ideas arose within the creative realm. They are the musings of fictionalized characters created by a poet, a novelist, and a comedian. Many might dismiss them as simple aphorisms and adages, yet they are often treated as gospel within the literary worldview.

The uncomfortable thing about aphorisms and adages is that society tends to be two-faced about them. We favor one when it suits us, then favor the opposing view when it seems more appropriate—or, in this context, “truer.”

Consider the following: Absence makes the heart grow fonder. Ah yes, true. But then: Out of sight, out of mind. So true. A stitch in time saves nine./Haste makes waste. Actions speak louder than words./The pen is mightier than the sword. You get the idea. You see little sayings like this posted all the time on Web sites and Facebook updates, and more often than not, a chorus response of “So true!” will follow.

A reliance on simple sayings like these—as well as simplistic sayings disguised as profound wisdom, of which there are many more examples—tends to create a rather tenuous and fragile worldview. That’s why it shocks us so deeply when someone like Mike Daisey comes along and throws rocks inside the glass house. (Sorry for that pun. I couldn’t resist.) Daisey offered up a fiction as fact, and people wanted to believe him so badly that they willingly suspended their disbelief. (Yes, that “willing suspension of disbelief for the moment, which constitutes poetic faith,” as Samuel Taylor Coleridge once wrote. More literary gospel.) Ira Glass even admitted that because the details of Daisey’s story seemed so true, the fact-checkers gave him a pass when they couldn’t corroborate his claims with the interpreter/character mentioned in his monologue (whose name he had lied about and who he claimed could not be reached, when in fact she was easily contacted by another journalist).

Likewise, a piece of investigative journalism by Ian Parker in The New Yorker (link below) calls into question any simplistic understanding of the Ravi/Clementi case. As a piece of “literary” writing (which I’ll simplify here to mean writing that includes effective characters, setting, plot, and theme), the article surpassed nearly all of the fiction I have read over the past few years. It stood as an example of why, over the past decade, my own “poetic faith” in literature has been eroding and I have been unwilling to suspend my disbelief in most (though assuredly not all) fictional contexts.

I have, in essence, become a literary skeptic. Inasmuch as our artistic culture constantly blends the two (consider the current use of the term “creative nonfiction”), I teeter on a tightrope when I read. I look to the author to provide a steady, unswerving wire along with a pole for balance, and when I feel that I have these things, I am able to walk between the towers of fact and fiction, a feat accomplished to great effect in Colum McCann’s extraordinary novel Let the Great World Spin. (Just read the opening pages and see for yourself.) Mostly, perhaps, I look to the author for honesty and authority, two traits that don’t always play well with one another.

In searching for the most powerful voice—for a “totality” of fabricated details when the truth was more than sufficient to drive his point home—the writer Mike Daisey created a narrator/character, also named Mike Daisey, who twisted the facts. Then, like many contemporary authors before him, he dared us, the audience, to tell the two apart. The only problem here was that he did not tell his audience that he was playing a game of “Truth or Dare.” In fact, he never hinted that we were playing a game at all.

As the adages above demonstrate, you can’t have it both ways. Attempts to do so only demean both sides of the equation: the essential integrity of journalism (here comes more fodder for the “lamestream media” claim) and the transformative power of creative literature (here comes more fodder for the “academic elitists” claim). No doubt the dialogue between the two camps (literature and journalism, art and reality, fact and fiction) can and should continue, but it needs to be honest. It should welcome scrutiny rather than resist critical efforts. In that way, we can preserve spaces both for the brain-based analysis of figurative language and for the mystical creation claims recently published in The New York Times. Otherwise, they risk coming across as mere parlor tricks promoted by charlatans.

Below is an excerpt from the transcript of the retraction episode of “This American Life” (link also below). I don’t include it as a way of providing closure, however. This discussion needs to continue beyond its most recent manifestations, just as it continued beyond the now historical cases of James Frey’s “creative memoir,” A Million Little Pieces, and the falsified news reports of another Glass, Stephen, in the 1990s. For me, however, the takeaway from the excerpt below is the title of this blog entry itself, “The Normal Worldview.” What great fodder for a million more little blogs…

 Ira Glass: Like, you make a nice show, people are moved by it, I was moved by it, and if it were labeled honestly, I think everybody would react differently to it.

Mike Daisey: I don’t think that label covers the totality of what it is.

Ira Glass: That label – fiction?

Mike Daisey: Yeah. We have different worldviews on some of these things. I agree with you truth is really important.

Ira Glass: I know, but I feel like I have the normal worldview. The normal worldview is when somebody stands on stage and says ‘this happened to me,’ I think it happened to them, unless it’s clearly labeled as ‘here’s a work of fiction.’

RELATED LINKS:

“This American Life” original episode (transcript):

http://www.thisamericanlife.org/radio-archives/episode/454/transcript

“This American Life” retraction episode:

http://www.thisamericanlife.org/radio-archives/episode/460/retraction

News article about the Dharun Ravi conviction:

http://www.nytimes.com/2012/03/17/nyregion/defendant-guilty-in-rutgers-case.html

“The Story of a Suicide” (Parker):

http://www.newyorker.com/reporting/2012/02/06/120206fa_fact_parker

“Your Brain on Fiction” (Paul):

http://www.nytimes.com/2012/03/18/opinion/sunday/the-neuroscience-of-your-brain-on-fiction.html?src=me&ref=general

“My Life’s Sentences” (Lahiri):

http://opinionator.blogs.nytimes.com/2012/03/17/my-lifes-sentences/?src=me&ref=general

Liar, Liar, Pants on Fire, Hang Them Up on a Telephone Wire

"2 Men Looking On" from the "Perched" series by artist Jodi Chamberlain. Her multimedia work can be found at http://www.jodichamberlain.com (just click on the painting). Copyright by Jodi Chamberlain; used with permission of the artist.

By way of an introduction, I did some Internet research on the origin of the expression “Liar, Liar, Pants on Fire, hang them up on a telephone wire” and found this reference to an 1810 poem by the English writer William Blake:

 The Liar

Deceiver, dissembler
Your trousers are alight
From what pole or gallows
Shall they dangle in the night?
 
When I asked of your career
Why did you have to kick my rear
With that stinking lie of thine
Proclaiming that you owned a mine?
 
When you asked to borrow my stallion
To visit a nearby-moored galleon
How could I ever know that you
Intended only to turn him into glue?
 
What red devil of mendacity
Grips your soul with such tenacity?
Will one you cruelly shower with lies
Put a pistol ball between your eyes?
 
What infernal serpent
Has lent you his forked tongue?
From what pit of foul deceit
Are all these whoppers sprung?
 
Deceiver, dissembler
Your trousers are alight
From what pole or gallows
Do they dangle in the night?

If you do your own Internet research (and of course you do; you’re probably verifying this on Google right now), you’ll find numerous references to this Blake poem, some of them in fairly reliable places. There’s only one small problem: William Blake didn’t write it.

Additional research has yet to turn up the actual author of the poem or its place and date of origin. I did find, however, that the word “alleged” had at one time been attached to the claims of Blake’s authorship. Some of those who later reposted the poem and its origin myth carefully edited out the word “alleged,” perhaps to boost their own authority. I’ll leave it to the Blake scholars to hash out why this could or could not possibly be an obscure verse from the poet who gave us “Tyger tyger, burning bright”—which, when you say it aloud, does share some strong poetic similarities with “Liar, liar, pants on fire” after all.

• • •

I am constantly amazed and dismayed by the number of people who both post and spread unproven statements and “facts” on the Internet. Quite a few of these people are close friends, many of them remarkably intelligent people. Even so, they often clamor to be the first ones to expose some new controversy or conspiracy via Facebook or Twitter. Maybe it’s the journalist in all of us, desperate for a scoop, but if there’s one thing I learned while staying with my ailing father and watching endless episodes of Judge Judy, Judge Jeanine, and Judge Joe Brown, it’s that hearsay is irrelevant in a court of law.

Let’s stop a moment to take a look at that word, hearsay. We hear something, then we say it. From ear to mouth in an almost direct line that bypasses the brain. It’s a key ingredient of soap operas and celebrity gossip, yet it also sadly infects the serious discussions of the day’s major issues. Perhaps in this technological age we need new words for hearsay, such as readshare or seetweet. “I read those statistics on someone’s Facebook wall, so I immediately ‘shared’ them by reposting on my wall.” “I saw a sports commentator on television talking about that big trade that might happen, and so I tweeted that it actually had because he sounded so convincing.”

Another classic example is the Reverend Martin Luther King, Jr. quote that went viral immediately after the killing of Osama bin Laden. Here’s the sentence in question: “I mourn the loss of thousands of precious lives, but I will not rejoice in the death of one, not even an enemy.” I have no problem with the quote itself, but King never said it. All credit is due to Jessica Dovey, who prefaced an actual King quote with that sentence. In its original form, she even set her own thought outside of the actual quotation, clearly distinguishing her words from his. (The teacher in me can’t help but point this out: Punctuation matters!) As the quote ricocheted around the Internet, however, the quotation marks got knocked off and the two speakers’ sentiments became one. Since the Internet never forgets, there are now thousands—if not millions—of lingering misattributions.

After this incident, I began to wonder just how many of those inspirational/motivational “posters” that people share contain actual quotes from the pictured sources. With today’s technology, anyone with a basic knowledge of photo editing can attribute just about anything to anyone and fool millions in the process. Just look at all those often-shared fake Ryan Gosling quotes. It’s all in good fun…until someone starts attributing hateful or hurtful language to him. (Chances are probably pretty high that someone already has.) Then what started as parody or satire slides uncomfortably across a blurry line into libel and slander.

I know I come across as a killjoy here, but on matters like this, I reveal my true colors as a classic skeptic. I reserve judgment until I can ascertain the facts of the matter, and I remain open to the possibility that objective reasoning may lend credence and support to multiple sides of an argument, even those that might contradict my own previously held beliefs. Like William Blake and many other artists of the Romantic era, I seek a direct link to the primary source—though, devout skeptic that I am, I question his subsequent claims to a mystical intimacy with a supremely divine being. Perhaps his position in literary history—at the crossroads of the Enlightenment (or Age of Reason) and the Romantic era (or Age of Emotion)—mirrors my own inner identity crisis as a writer. I would also argue that it describes the manner in which America currently teeters as a political force in the world, and our casual (not causal) relationship with the truth seems to be a reliable measure of that tension.

As an educational writer, I am often governed by a remarkably strict and lengthy set of mostly objective state standards and guidelines, even as I am asked to write fairly sentimental pieces, such as articles about developmentally challenged children overcoming adversity. Age of Reason, meet Age of Emotion. For a recent series of writing assignments, I was required to provide two credible and reliable sources for each sentence (yes, each sentence) in nonfiction articles. When and where possible, I was to cite the primary source—in other words, an eyewitness account or an original text, not some second-hand or third-hand citation or discussion. After all, elementary school standards (i.e., grades one through six) stress the importance of primary sources in research. This is not “elitist, ivory tower” graduate-school-level academia stuff.

In my own college-level classrooms, I likewise impressed upon students the necessity of primary sources in validating claims, especially as we grappled with sensitive and controversial issues such as abortion, capital punishment, and religious freedom. By extension, I reminded them that primary sources were crucial in evaluating the claims made by advertisers and politicians, both of which would often seek to twist the truth to suit their own agendas. By further extension—and hoping to avoid turning those young people from skeptics into cynics—I reminded them that they needed to evaluate the primary source itself and take into account any conflicts of interest. After all, the makers of Brand X are likely to promote studies that show that their product is more effective than Brand Z, especially when those studies have been planned and paid for by the makers of Brand X.

I had hoped that this kind of education, common in most schools and colleges, would serve the students well in their adult lives and help them to make objective, reasonable, and justifiable decisions on all matters great and small, from the election of presidents to the selection of a new toothpaste. (Is it more important to remove plaque, avoid gingivitis, eliminate bad breath, or have whiter teeth? Honestly, a skeptic can spend hours analyzing the endless varieties of toothpastes offered by each brand these days.) I wanted students to realize the great potential and power they had to sort through all of the distractions and clamor around them in order to get at the truth, even if that earned truth challenged their preconceptions and ideological dispositions. In other words, I wanted them to be open to new information and perspectives, even if this led them to change their minds. After all, the human capacity for change and adaptation is a key aspect of our forward-moving evolution. Yes, I said it—and if you don’t believe in change or evolution, enjoy the view from your rut.

In the end, however, it appears that preconceptions and ideological dispositions still get the best of most of us. On social networks, people on the left post and spread unverified stories and statistics that seem to prove their political points; people on the right post and spread unverified stories and statistics that seem to prove their political points. Some of these are easily proven to be hyperboles, misstatements, or outright lies; others are carefully twisted and redirected versions of the truth (the process we all know as “spinning”). In nearly all instances, however, the posters (and re-posters) risk looking naïve, misinformed, ignorant, or just plain stupid when the claims are analyzed and the facts made clear.

That risk—and the willingness of so many people to accept it for the most trivial of reasons—fascinates me. Checking the validity of some of these claims and stories takes less than a minute, especially if you’re clued in to sites such as snopes.com and politifact.com. (Disclaimer: These sites are good places to check first, but on their own they are not definitive arbiters of right and wrong. They can, however, provide valuable leads in a search for those coveted primary sources I mentioned earlier.) Despite that, there is still the temptation to hit that “share” button, which takes but a split second and provides that little hit of adrenaline that information junkies seem to crave. After all, if they wait too long, someone else might scoop them and ruin their status as the first (if not primary) source of breaking news. Take that, Fox News and CNN!

(Topical side note: The death of Whitney Houston on February 11 was a prime example of this. Even before reports had been confirmed, dozens of people raced to post the story as fact on their Twitter and Facebook feeds. Perhaps they all wished to be, if I may twist the New York Times’s catchphrase slightly, the “tweeters of record.”)

In many instances, perhaps, the risk of being wrong is offset by a firm belief that we are right. Even if the statement is later shown to be false, the underlying “truthiness” of it remains. (Thank you, Stephen Colbert, for having introduced that word into the public lexicon. We clearly needed it.) To explain this more plainly, the truth doesn’t need to actually be true as long as it truthfully supports what we truly believe to be true. This is why politicians can make wild claims in our legislative chambers (such as the “fact” that Planned Parenthood’s primary mission is to provide abortions) and later claim “no harm no foul” because the easily disproven “fact” was simply alluding to a basic “truth” anyway. (If I had to allow such reasoning in my classes…well, I couldn’t and wouldn’t. It undermines the entire premise of education, not to mention making a travesty of logic and reason. Civilized people simply don’t entertain this kind of idiocy.)

When it comes to believing the unbelievable (or giving credence to the incredible), it’s not just the lies that people tell that intrigue me—it’s also the lies that they hear and subsequently believe or let pass unchallenged. This takes us back to the crux of the reposting problem: people believe and spread untruths because, in some strange way, they WANT the untruths to be true. In this mindset, testing such a claim would be evidence of weakness or a lack of confidence: If it feels or sounds right to you, it probably is—no further research necessary. In other words, if you’re a cynic, pointing out hypocrisy justifies your cynicism. If you’re paranoid, spreading conspiracy theories justifies your paranoia. If you’re (liberal/conservative), propagating the latest (conservative/liberal) scandal gives you that higher moral standing that you just know your side deserves. If you hate Company A or Team B, then that damning news about Company A or Team B simply MUST be true.

So let’s bring this full circle. Over the years, I’ve come to believe that you can learn a whole lot about a person from the lies he or she tells. Further, you can tell even more about that person from the lies he or she is eager to believe, especially at the risk of reputation and social standing. The more entrenched the belief, the less likely the person is to respond to reason. As a result, dissonance becomes an unavoidable consequence, and with that dissonance comes discomfort, anger, and sometimes violence. In all the hullabaloo, reason and logic are often the first casualties.

Perhaps a first-hand case study would help to illustrate the point. At a town-hall-style meeting a couple of years ago, Senator Bernie Sanders was taking questions from the crowd. One man, most likely a member of the Tea Party, asked him to explain why the government keeps raising his taxes. Sanders pointed out to him that President Obama had just signed tax cuts into law for the majority of wage-earners, but this man was insistent: His taxes had gone up. Sanders tried to reason with him, pointing out that he appeared to fit the demographic of people who received a tax cut, but no, there was no reasoning with the man. His taxes had gone UP. The discussion, as Sanders subsequently remarked in his typically brusque style, had reached an impasse and could not continue in that forum.

Had the man’s taxes gone up? Probably not. But here’s a pet theory I have about all of this. Had the amount of money being taken out of his weekly or monthly paycheck gone up? That seems likely, especially since many companies have been increasing the employee co-share of medical benefits as health insurance costs continue to rise. The disgruntled man might easily have lumped all of these paycheck deductions under the simple heading of “taxes,” giving him evidence, however misinterpreted, for his claim and belief. And what legislation had been introduced to try to deal with the problem of rising medical insurance costs? Why, the very legislation that members of the Tea Party are dead-set against.

The man at the town hall meeting was angry, and he wanted that anger to be heard and validated. When the facts didn’t serve that purpose, he was faced with a dilemma. He could change his mind, or he could stand his ground.

His final decision: ignore the facts. Therein lies the root of the word ignorance: an ignorant person is someone who can be shown the truth yet insists on the lie, someone who can be told that something is fiction and yet still believes that it’s fact. In this respect, ignorance is a worse trait than naïveté or stupidity.

Just consider the synonyms for the feeble-minded: words like dull and dim. Ignorant people are hardly dull or dim. They carry the full force of their beliefs behind them. Their faith in one unwavering, indisputable view of the world can be both violent and catastrophic. After all, when a person runs through the village with pants on fire, the flames can spread to every surrounding building. That makes the practice an emergency worthy of public concern.

Luckily, we are all able firefighters. With all this “Information Age” technology around us, the truth is closer to each and every person than it ever has been in the history of mankind. And yet, the most powerful tools can often be used both to create and to destroy. A hammer can help build a hope chest for our dearest possessions, but it can also break a window or kill a man. One post can spark the overthrow of dictators and the rise of democracy, but it can also ruin careers and destroy a country’s economy.

I’ll end this section with a paraphrase of Tim Gunn from the reality show “Project Runway”: “Use the wall thoughtfully.” He was talking about shelves of fashion accessories, but he might just as well have been talking about Facebook. Whatever we do, we should do it thoughtfully, with care and consideration. To do otherwise would be, quite simply, uncivilized.

(NOTE: There is a second and perhaps even third part to this post already underway…)

The Protestant Work-Study Ethic


Republican presidential candidate Newt Gingrich has recently been getting a lot of attention—and a standing ovation at a debate in South Carolina—for his insistence that students in low-income schools be given jobs to encourage their work ethic and to discourage them from lives of crime. Here is part of the original quote:

 “You have a very poor neighborhood. You have students that are required to go to school. They have no money, no habit of work. What if you paid them in the afternoon to work in the clerical office or as the assistant librarian? And let me get into the janitor thing. What if they became assistant janitors, and their job was to mop the floor and clean the bathroom?”

Now, even though Gingrich’s most recent religious affiliation is Catholicism (he has changed denominations over the years), his proposal reflects the Protestant Work Ethic as famously described by the German sociologist Max Weber in 1904. Yes, despite its later association with the Puritans who came to America, the idea is a relatively recent European import. Simply put, the Protestant Work Ethic connects hard work with earthly gains that reaffirm one’s spiritual salvation. It is also, in Weber’s understanding, the foundational core of capitalism.

Gingrich’s remarks led me to recall several experiences from my own past that shaped how I would subsequently perceive issues such as labor and class. Since many of these relate to the idea of work in school, or “work-study,” I thought they might be useful in exploring the topic from another angle. (Note that these are primarily anecdotal responses; they are not meant to be definitive studies of the sociological or economic theories at play here.)

The hard work paid off...

I was relatively naïve about issues of class (among other things) all the way into my late teens. The people in my neighborhood, a somewhat distant suburb of Boston, all seemed fairly equal in terms of family income and social status, as did most of the students in my public school. In my limited experience, “private school” meant “Catholic school,” and so seemed more a religion-based option than one based on economic opportunity or educational ideology.

That all changed when I went off to college. I had worked hard in school, and my diligence paid off in the form of scholarships and financial aid awards. The prospects of having to work while I studied were nothing new; I had worked part-time as a dishwasher and record store sales clerk while attending high school. I understood the character-building significance—and financial reward—of earning my way, if only partially. After all, college wasn’t cheap, and I had my heart set on Middlebury College, one of the more expensive ones.

While at Middlebury, I held down various jobs: dining hall food-slinger (and later shift manager), professorial assistant, music librarian, data processing keypuncher, residential house director, janitorial helper, student tour guide, reunion host, and term paper typist (to pay for the typewriter I had purchased for school). As you might imagine, many of these were simultaneous, so my schedule was a patchwork of classes and work commitments. Thank God I wasn’t on a sports team, because there wasn’t much time left in the day or on weekends for practices and games. At the end of most days, while many of my peers headed downtown to drink or off to parties on campus, I finally had a chance to hit the books and do my homework. Around midnight, it was not uncommon for a last-minute “rush” typing job to come in, and so I could often be found at 2 or 3 in the morning, typing another student’s paper while they slept. We had no Red Bull back then, and I hadn’t yet begun my coffee addiction. For me, the work ethic boiled down to this: Work hard; then work some more.

Before you conclude that this is a “woe is me” tale, let me state this: I loved being so busy. Sure, I complained about some of the long hours and overcommitments, but I also met some of my best friends (both students and staff members) on the job, learned how to prioritize and multi-task, and got an insider’s education on how an American college operated. Many of these new friends were from backgrounds completely unlike my own, which broadened my perspectives in all directions. Plus, there was the pride of the paycheck at the end of each two-week period, even though most of it went right back to the college. (I guess this truly distinguished me from the “elites,” who, according to Gingrich the other day, are the only ones who “despise earning money.” Perhaps he meant that they prefer inheriting it? Like many others, I’m still scratching my head over that comment.)

At first, I was under the impression that everyone worked this hard at college. (Yes, Mr. Gingrich, the Protestant Work Ethic values had been instilled in me from an early age.) Then I learned what a “legacy” was: someone who got into college based, in no small part, on previous family ties. I learned what a “trust fund child” was: someone who had seemingly unlimited income from their parents, none of it earned through hard work. I came to understand more about private schools and how they functioned, to some extent, as a feeder system for the Ivy League and other top-quality colleges—if one’s family could afford private school tuition in the first place. This was all new to me, and so it took a while to adjust to the idea that friends of mine had multiple homes, routinely flew to exotic locations, or had jobs waiting for them the moment they stepped out of college.

That said, I often look back and wonder how much better I could have done in college if I hadn’t been scooping out mashed potatoes and ice cream for three to four shifts a week. I wonder how many more networking connections with influential people I could have made if I had been hanging out at the local bar or fraternity at night rather than typing up those fellow Midd Kids’ term papers. Even though I managed a fair number of extracurricular activities, I wonder how many more I could have undertaken were my schedule not so riddled with the responsibilities of the jobs I held. I also wonder what post-graduation prospects a final, senior-year semester might have yielded. (I graduated a semester early to save some money on tuition.) And when Mr. Gingrich suggests that low-income students should be saddled with similar responsibilities while simultaneously trying to get an education, I wonder if he realizes what an unlevel playing field he is proposing, and the extent to which such a field favors the already-rich.

This was never more apparent to me than the day I was asked to sit down with Middlebury’s financial aid director to address a problem with my assistance package. Her reason for seeing me was short and sweet: I had to stop working. I had reached the upper cap for my work-study aid, and so could no longer receive any paychecks from the college. Slightly confused, I asked if I could continue working outside of the auspices of the financial aid office, as some other students did. The answer: No. Why? Because I was on financial aid.

“So let me get this straight,” I said to her, struggling to comprehend the conundrum. “Because I don’t have much money, I’m limited in the amount of money I can make. But if I did have money and didn’t need financial aid, I could make as much as I want.”

That was correct. But, of course, she pointed out, if I didn’t need financial aid, why would I work in the first place?

Hmmm. As Spock, the intergalactic king of logic, would say, “Fascinating.”

(As it turned out, the director’s paycheck was, in some ways, reliant on my own. Remember that job I had in data processing? Part of it was the computer entry of faculty and staff time sheets and salaries so that their checks could be printed on schedule. As my supervisor in the computer center pointed out to the financial aid director, I couldn’t be let go without jeopardizing the entire college payroll system. Somehow, over the course of the next few days, an “adjustment” was made to my financial aid work-study limit. Interesting side note: one of the other student keypunchers, likewise entrusted with some rather hefty responsibilities as part of her work-study award package, still works in that department to this day.)

Since that encounter, I have remained keenly aware of how work-study programs affect the college community. It didn’t take long to witness a casualty of the program.

Based on how well my work-study experience had qualified me for post-college positions (another plus of the system), I was hired as something of an apprentice dean at Middlebury. I was mainly in charge of student housing, but my job description also required that I serve as an academic adviser to members of the incoming freshman class.

That was when I met Rachel (not her real name), a smart and vibrant first-year student with the instant likeability of a budding movie star. She was also a financial aid recipient and holding down a work-study job while meeting the demands of a rigorous academic schedule. Rachel had been incredibly nervous about starting out at Middlebury. She had also been accepted to a state school, and her family felt certain that they could afford four years there. Middlebury, however, with its higher prestige and much higher tuition, was a gamble, especially since financial aid at the time was awarded on a year-to-year basis. Who knew what assistance, if any, Rachel might receive in years two, three, and four? (I liken this, in some regard, to all those corporations demanding that tax rates and government regulations remain the same for years on end so that they can engage in long-term planning. Seen from this perspective, you can understand their point.)

Fast-forward three years from my one-year position at Middlebury. Back in Massachusetts, a friend and I had just seen a movie and were settling into a booth at a fast-food restaurant for a post-film discussion. Who should come up to take our order but Rachel.

I was incredulous. This was the middle of the school year; why wasn’t she up in Vermont attending classes? As Rachel told me, she hadn’t received nearly enough financial aid assistance to afford a second year of Middlebury. Her grade-point average had slipped due to her work-study overload, which in turn made her ineligible for a merit-based scholarship. As a result, she had dropped out of school completely. Her family didn’t have enough money left over from the first year at Middlebury to cover the tuition at the state school. Now Rachel was living back home, working multiple jobs, and trying to make enough money as quickly as possible so that her Middlebury credits wouldn’t expire before they could be transferred to another college. Simply put, her earlier financial gamble hadn’t paid off.

As you might imagine, the entire college financial aid process has changed a great deal over the years. Some of it has no doubt improved upon the earlier models, while other changes (such as the relationship to need-blind admissions policies) have led to greater uncertainty and unpredictability.

Even so, Newt Gingrich’s comments reminded me of another, subsequent episode at Middlebury College. This time, I had returned to the campus to become administrative director of the Bread Loaf School of English, a graduate program of the college. Based on my earlier experience as a “baby dean,” I was invited to join the college’s Diversity Committee. At one of those meetings, I raised the matter of work-study and how it reinforced some rather negative stereotypes and expectations on campus with regard to minority students. One of my former colleagues from the Dean of Students office disagreed with my observations, and so I invited her to join me on a quick tour of the campus.

Together, we walked into the college snack bar and the main library. I asked her to look around and tell me what she saw. It took only a moment for the shock to appear on her face. Behind the counters and scrambling about the stacks, foreign and minority students were working hard, taking orders and reshelving books. Their “customers” were mostly white and affluent-looking students. There on campus, the intersection of diversity initiatives and work-study programs had created a microcosm of the American service industry. White, privileged students enjoyed the luxury of free time between academic and/or athletic commitments, while nonwhite students labored to meet those same demands in addition to burdensome economic challenges.

This, too, has probably changed over time, or at least I hope that it has. Mr. Gingrich’s recent comments, however, raised concerns that some of those lessons remain unlearned.

College costs continue to rise; student debt grows more difficult to manage. As our country’s income inequality expands, the very idea of a level playing field for all, at least in the educational context, remains at risk. With it, some of our nation’s best and brightest—perhaps those most capable of envisioning and implementing solutions—may never rise to their full potential. For them, despite what politicians like Gingrich believe, the selective application of the Protestant Work Ethic just compounds the problem—and in the end, doesn’t really work well at all.

American Anger, Part One

Preface: This is Part One of what I hope will be an ongoing, potentially year-long exploration of this subject. The topic seems well-suited to the “blog” format, serving more as a catalyst for conversation than as a definitive treatise. I look forward to continuing the conversation in hopes of reaching some constructive insights, conclusions, and potential remedies.

As you’ll no doubt quickly note, my take on American anger is a rather personal one; your approach to the topic will surely differ. Despite that, I’ll be using terms like “Americans” and the first-person-plural pronoun “we” rather liberally throughout the entries. I do this merely as shorthand, fully aware that it’s literary sleight of hand, both a contrivance and a conceit. I don’t intend to suggest that there are absolute universal truths here, especially since the insistence on universal absolutes in society tends to generate the very anger I’ll be analyzing.

As always, thanks for reading, and even more thanks to those whose responses provoke or inspire further insight.

 1. Use Your Words

American anger fascinates me.

Here we are, billing ourselves as the “best, greatest, richest, most powerful” nation in the world, and yet people all over the country claim to be angry. Watching the growth of the Tea Party movement in 2010 was like watching the now-famous scene in Sidney Lumet’s 1976 film “Network” in which unstable talk-show host Howard Beale inspires his viewers to lift up window sashes across the country and shout out into the night: “I’m mad as hell, and I’m not going to take it any more!” Everyone was mad as hell for different reasons, but there was a feeling that bringing all that rage together into one unifying cry might make it either coherent or effective. (Spoiler alert: it didn’t.) In many ways, it echoed a couple of the poet Walt Whitman’s famous lines from “Song of Myself”:

            I, too, am not a bit tamed, I too am untranslatable,

            I sound my barbaric YAWP over the roofs of the world.

It was not a specific word or words that Whitman called out into the night; it was not an intelligible phrase or clause. It was a sound, an utterance, savage and undomesticated, more animal than human. In a way, Whitman was suggesting, people had been making those sounds for years and would continue for many more, well beyond his own eventual death. We might never come to know who he was or what he meant, but discussion about it “shall be good health to you nevertheless.”

In this election year, 2012, we are hearing quite a few YAWPS across the political landscape, some less tamed and translatable than others.

In addition to all the contemporary social and political dissent, there is perhaps an even more powerful undercurrent of dissonance—the lack of a rational link between one’s beliefs and one’s reality, however either one is perceived. It’s the feeling we get when we pay top dollar for something only to find that it’s cheaply made or ineffectual. We vote for a candidate based on his or her promises only to find those promises later ignored. (To provide some continuity between this blog and an earlier entry on football’s “Tebow Time” phenomenon, dissonance was that sickening feeling the hyper-religious quarterback’s more fanatical fans experienced when the Denver Broncos were humiliated by the New England Patriots in a recent playoff game. For the sake of divisional fairness, it was also the sickening feeling the Green Bay Cheeseheads felt when Aaron Rodgers and the nearly-perfect Packers succumbed to the New York Giants the very next day.)

I’ll be talking much more about dissonance and its relation to anger later on, but it’s worth mentioning here just to keep the idea in mind as the discussion of anger progresses.

As Americans, we see anger glorified throughout our culture, from movies to music, sports to politics. Despite our supposed Judeo-Christian foundation, we have movements in the country that promote violence and greed over diplomacy and charity. As our young people’s generation comes to define itself (or, to put it in the passive voice, lets itself be defined by others) as “ironic,” it also grows indifferent to irony’s cousin, hypocrisy. Sarcasm provides an easy segue from skepticism to cynicism, providing many a political pundit on both ends of the political spectrum with the equivalent of sniper’s bullets.

When anger wears us down into a numbed state of depression, anger’s inward-turned doppelganger, we shrug our shoulders and try to focus our attention elsewhere. For some, this may translate into another glass of wine, another dose of Xanax, another marathon session watching the Real Housewives of Whatever County spit their venomous barbs at one another. Other folks may start in on the next level of “Angry Birds,” one of the highest-grossing games in our country. Or perhaps you want to take a virtual trip around the world—killing people and blowing things up along the way—in America’s top game of the Christmas season, Call of Duty: Modern Warfare 3. What a wonderful gift to commemorate the birth of the Prince of Peace. (See how easily the sarcasm comes?)

Many players of these games claim that such pastimes are cathartic—that they help “release tension” and “blow off steam” at the end of a stressful day. If that were truly the case, violent movies and first-person shooter games would leave viewers and players in states of blissful repose. Instead, they ramp up the emotions and boost the adrenaline. (Full disclosure: I play an occasional hour or two of “World of Warcraft” myself at the end of a busy day, so I know that to be successful as a warrior, you need to “generate rage.” It’s right there in the game manual.)

So maybe the term cathartic is a misnomer when we choose violence-based entertainment as a relief or release of our internal anger and frustration. I’d argue that the proper word is indulgent. Pressing further, I’d express concern that a more appropriate adjective might be catalytic. America seems to like things super-sized and hyper-accelerated, so it’s no surprise that when it comes to anger, amplification isn’t just acceptable; it’s preferable.

An admission: cathartic, indulgent, and catalytic are big words. I’m a writer, so I sometimes use big words. That’s because language, like anger, fascinates me. They’re both acts of expression that have rich, sometimes hidden, roots and origins. Example: I wrote a poem about one such instance, the word decimate. Many people think it means “to destroy completely and indiscriminately.” In fact, the word is based on the Latin root for the number ten and originally meant a methodical act of slaughter in which exactly one victim in ten was killed. (Ironic, eh?) The meanings of words may evolve over time, but the origins of their species are there for all to comprehend and appreciate.

But I digress. Let’s return to the notion of anger as a cathartic force and set forth a little thought experiment. Imagine that you’re a parent dealing with a red-faced child whose inexplicable rage has sent cereal, milk, and orange juice flying across the kitchen. To calm the child, would you—

  1. put on some soothing, New Age music and send the child into the corner for a five-minute “time out” period of self-reflection?
  2. tell the kid to march off to his/her room and go the f*ck to sleep?
  3. tell the child to imagine having an automatic weapon in his/her hands during a stressful, high-stakes combat mission whose outcome will determine the fate of all mankind?
  4. ask the child, “Why are you so angry?”

Now imagine America as a red-faced child.

Modern child-rearing gurus recommend option 4. Many advise parents to respond to their children’s extreme behaviors with the expression “Use your words.” This doubles as both an encouragement of self-expression and a redirection of energy. It’s a graceful dance step that moves the child away from visceral reaction toward more cerebral creation. Emotions, meet intellect. Intellect, say hello to emotions.

To some, however, “use your words” is just so much poppycock. To quote the blogger MetroDad, a rather literate and opinionated New Yorker: “I think it’s a bullshit mantra that only helps raise the next generation of pussies.” Like it or not, that’s using your words.

In some ways, “use your words” promotes a form of therapy. It seeks to replace the outburst with what we might call the “inburst,” a breaking-and-entering of the psyche in order to see what secrets are hidden in the closets or nailed beneath the floorboards. We ask a child “What’s really bothering you?” expecting to hear about stolen snacks or missing pets, but sometimes the answer shocks and surprises. I’d argue that this is true even when we as adults ask the question of ourselves.

It’s no surprise that many people view creative expression as a form of therapy. Just read the inexhaustible output of writers writing about writing, a quite profitable if overindulgent niche market. We’ve even “verbed” the word “journal.” Did you know that people who journal frequently are able to reduce their stress and manage their anger more efficiently? I could say the same thing about blogging, but then there’s that quote up above from MetroDad. (I kid MetroDad. His blog entries are actually quite amusing, entertaining, and even insightful.)

Too often these days, when it comes to using our words, people settle for quick fixes rather than deep introspection. It’s the 140-character Tweet of the daily pet peeve versus Plato’s lifetime of examination. I’m not suggesting that everyone sign up for therapy sessions, but I do ask friends and colleagues to strive for clarity and honesty in their communications. That often requires work. True expression isn’t effortless.

Even as I write this, I am surrounded by reference materials. As a writer, it often isn’t enough simply to “use your words.” As you’ve noticed, I often rely on the words of others, be they expressed in song or psalm, poetry or prose, book or blog. I would be lost without the dictionary, the thesaurus, the atlas, the encyclopedia, and the patient guidance of my editor/husband—even though all of those things can tempt me along time-consuming tangents with their fascinating insights. Likewise, I am inspired and guided by the works of scholars like Geoffrey Nunberg, whose books and NPR spots on language have both educated and entertained me. Honestly, how many of you get excited when you see an essay entitled “The Politics of Polysyndeton”? Hands? Hands? Hello?…

My own fascination with language started in second grade, when my wonderful teacher Miss Burke introduced me to bookmaking with the simplest materials, and it has grown deeper ever since. Even so, one catalytic instant stands out. (Please, if you still don’t know what catalytic means, either look it up on your iPad’s dictionary or ask your car mechanic. After all, these elite, ten-dollar words aren’t reserved for professors holed up in their ivory towers. If you truly love your country, learn the English language. Have I made my appeal clear in both liberalese and conservatese?)

Elie Wiesel, winner of the Nobel Peace Prize in 1986 and the holder of a Congressional Gold Medal, is another humanitarian hero of mine. Wiesel spent most of his life coming to terms with the violence, anger, and despair he witnessed as a concentration camp prisoner during the Holocaust. I heard him speak about his experiences shortly after he received the Nobel Prize. One of his responses during a question-and-answer session has haunted me ever since.

“Americans,” he stated matter-of-factly, “have one of the most violent languages in the world.”

The truth of that comment struck me. No…it hit me in the face. No…it blindsided me. No…it knocked me out. No…it fell on me like a ton of bricks. No…it blew my mind. No…it bowled me over.

Everywhere I went and everyone I talked to—suddenly, I was keenly aware of the insidious presence of anger and violence in everyday American language. On one occasion, I felt compelled to alert a pacifist minister to her repeated use of violent idioms and imagery in a sermon on compassion. She stood there dumbstruck (as we say), amazed by the horrible truthfulness of the comment.

For a while after hearing Elie Wiesel speak, I too felt dumbstruck, “made silent by astonishment” (to quote Webster). As a writer, I also felt aware in a way I had never felt or experienced before. The Buddhist in me smiled silently. Mindfulness, after all, is one of the key concepts of the practice, summed up simply in the popular mantra “Be here now.”

And so here I am, now, in an American culture defined (in part) by its reactionary anger toward so many things—including each other. I’m struggling to understand that anger, both in myself and in others, and to use my words to describe it. But what do we talk about when we talk about anger?

Defining anger, as I hope to demonstrate in the forthcoming part two, is no easy task, but it’s well worth the effort. Our fate as a nation, if I can ramp up the election year rhetoric, may actually depend on it.

• • •

Playlist for “American Anger”

“Music is food,” says my artist-friend James “Mayhem” Mahan, and so this post comes with a playlist for the full multi-media experience. These are songs that fed my mind as I considered this post and its upcoming parts. It’s also collaborative, so if you’re on Spotify, I encourage you to contribute as well as to listen. Mostly it’s for fun…testing once again how all of this interactive interconnected technology works. Enjoy.

  1. Green Day, “American Idiot”
  2. Public Image Ltd., “Rise”
  3. Nine Inch Nails, “Terrible Lie”
  4. Kanye West, “Monster”
  5. Florence and the Machine, “Kiss with a Fist”

(You can listen to and help build this playlist on Spotify here:

American Anger)