Unimaginable

The 9/11 Memorial always reminded me of a double Bat-signal rising up above Gotham City. (Photo from United States Government Works, public domain)

New York City, September 11, 2001.

After an invigorating morning swim and a walk to the office under clear blue skies, I’m selecting a muffin to go with my morning coffee when a cafeteria worker rushes in from the dining area. Nearly hyperventilating, he tells everyone that a plane has just crashed into one of the World Trade Towers.

We all hustle toward the windows and look to the south. Dense smoke billows out from the irregular shape now punched into the side of the north tower. We grasp at explanations. Our collective imaginations reach this consensus: the pilot of a small plane must have suffered a heart attack and lost control of his aircraft.

Shortly thereafter, in my own department eleven stories higher, I join my coworkers along the south-facing bank of windows. Speculation continues even as a second aircraft appears to the west, coming in low across the Hudson River at high velocity. We theorize that it must be a press-related plane rushing to the scene, but it’s coming in too fast and is too close to the building and, seconds later, slams into the south tower. A fireball erupts as it crashes through the building. The visual force of it shoves us back from the window.

After our many attempts to grasp what has been happening, we now face a more likely yet discomforting fact: We are under attack.

Throughout the morning, as we watch television broadcasts, monitor the Internet, contact friends and relatives, and eventually recoil in horror as the towers collapse one after the other, two dominant yet contradictory themes emerge:

“This is unimaginable.”

“It looks like a scene from a movie.”

Later that afternoon, in our attempts to navigate home through the paralyzed city, my husband and I crowd onto a bus heading uptown, away from the disaster. People sit and stand in silence, some marked as close witnesses by dust, soot, and ash. If we speak at all, we speak in broken whispers. Mostly, we settle into a state of numbness that will haunt us for days, weeks, generations.

When the bus stops to let passengers on and off, I look back toward the plumes of smoke rising above Manhattan. Already I’m exhausted by a sight that will dominate the view from my office window for a period beyond all expectation. I look down at the street, where I see a crumpled and dusty costume being run over repeatedly by buses, cabs, fire trucks, and police cars. I recognize the red-and-blue paneling and its black web motif. It is a Spider-Man suit.

Where were the superheroes today? I ask myself.

The answer is obvious. Superheroes are make-believe. They’re fictions, fantasies. Yet somehow, even in the context of such stark realities, they seem entirely imaginable.

• • •

This is unimaginable.

It looks like a scene from a movie.

These comments echoed once again throughout the coverage of the shooting at the recent “Batman” movie premiere in Aurora, Colorado. In this instance, however, the comic-book costume in question was inhabited, come to life.

James Holmes, his hair dyed red like the villain the Joker, a gas mask concealing whatever crazed expression might have been on his face, his body encased in bulletproof armor, fired randomly and repeatedly into a smoke-filled theater and struck 70 people.

In Aurora, the police were quick to apprehend Holmes. With a mixture of luck and skill, they were also able to prevent any further damage that might have resulted from entering his booby-trapped apartment.

Even after capture, Holmes acted like someone “playing a role,” according to one police officer. In other words, he continued to engage in make-believe, pretending to be someone he wasn’t. He was acting out a fantasy.

He was imagining things.

• • •

At an early age, I decided that I could and would make a career out of imagining things. I read the fiction of Ray Bradbury, Italo Calvino, and Donald Barthelme with both appreciation and envy. In order to achieve true and lasting greatness, I would have to dream bigger things than my literary idols and spin from those dreams far greater works of fiction. To do that, I would have to create characters, settings, and situations that no one else had thought of before.

In short, I would have to imagine the unimaginable.

And so, after years spent crafting short stories, novels, and poems, the events of 9/11 left me feeling uneasy and strangely complicit. Hadn’t I created similarly sinister plots for my own antagonists? Truth be told, I saw nothing “unimaginable” about flying an airplane into a building as an act of terrorism. Quite the opposite: It seemed rather simplistic.

In the days and weeks after the attack, I grew annoyed when people in both the media and our government used the word “unimaginable” to describe the events. It seemed a sloppy description, as much a dodge as a denial.

What I found truly unimaginable was that the people in charge of defending against such things hadn’t done their jobs and anticipated this scenario. (In fact, some intelligence agents had, but few people believed enough in their reports to act on them.) I wanted to suggest that the soon-to-be-formed Department of Homeland Security should employ fiction writers like Tom Clancy and Robert Ludlum—any writers, really—to help them out a bit in the imagination department.

The Aurora shooting incident, however, represents the flip side of this argument. Here, a work of imaginative fiction inspired the assailant. In his twisted state of delusion, James Holmes latched onto the Joker, a character created by comic-book writers in the 1940s, and erased the border between fantasy and reality. He obsessed over the Batman mythology but followed the wrong role model.

Authorities made easy comparisons between Holmes’s behavior and Heath Ledger’s portrayal of the Joker in the film “The Dark Knight,” which earned him a posthumous Academy Award. Critics praised the depth of Ledger’s commitment to the role. Sources on the movie set expressed their astonishment at the intensity of Ledger’s investment in the character. In an interview at the time, Ledger himself described the role as “fun,” adding: “There are no real boundaries to what the Joker would say or do. Nothing intimidates him, and everything is a big joke.” (Empire magazine, January 2008)

“Well, I warned him.” According to one report (New York Daily News, January 24, 2008), this was Jack Nicholson’s response on learning about Heath Ledger’s death by overdose a short while later. (Nicholson had taken on the Joker role in Tim Burton’s 1989 film “Batman.”) Nicholson knew the emotional and psychological toll involved in playing the Joker, and Ledger’s subsequent health issues suggested that he had been haunted by the role well after the film had wrapped.

Ledger’s death raises uncomfortable questions for those in the creative arts. Are there some places in the imagination that we shouldn’t go? And even if there are, how do we prevent ourselves from going there? Just because we may refuse to imagine something, does that make it unimaginable?

• • •

“With great power there must also come—great responsibility!”

This insight, currently attributed to Stan Lee and the Spider-Man franchise but thought to originate with the French thinker Voltaire, provides a guiding light whenever I have my doubts about writing and the imaginative impulse. It reminds me that any ability, like any tool, can be used for good or bad, to create or destroy. A person can use a hammer to build a home or break a window. Radioactive material can be used to destroy cancerous cells or bring an empire to its knees.

With so much potentially at stake, it becomes tempting to sit out the debates that we need to have—with each other and with ourselves—in the wake of events such as 9/11 and Aurora and countless other acts of violence. In postponing action—or, in some cases, shrugging and offering up a feeble, defeatist “Well, what can you do?” attitude—many of our appointed leaders deny their own power and, as a result, shirk their responsibilities. They fail us, plain and simple.

In the resulting void, opportunists rise to the debate like sharks to bloody chum. Some amplify their pre-existing fears and hatreds, which in turn can incite other acts of violence. (Witness the recent shooting in Wisconsin.) Others cultivate a false sense of security or superiority with cynicism or trendy ironic detachment, a kind of toxic snark. Neither suffices for the level of serious discourse required. Both, in their own ways, risk being irresponsible.

The day before the Aurora incident, I began work on a fictional story about guns and violence inspired by Ishmael Beah’s book A Long Way Gone: Memoirs of a Boy Soldier. The day after the shooting, my progress stalled. I could imagine what might happen next in the story, but part of me simply didn’t want to anymore. The subject felt too overwhelming, and my efforts to address it seemed insignificant. Like young Peter Parker after his uncle’s murder in the Spider-Man origin story, I questioned my own powers and abilities. That superhero suit? An empty costume after all.

Slowly, the paralysis subsided. Silence was no longer an option, nor was surrendering to self-doubt or cynicism. I began to write again, and, in doing so, attempted to create the empathic connection that explores all sides of an issue, not just the easy or most comfortable side.

I could not expect a superhero to swoop in and write that next difficult scene for me, just as we cannot leave our hopes for a more just and peaceful world in the hands of some mythical caped crusader. In the real world, our daily challenges remain ours and ours alone. Luckily, we have the power to respond to those challenges in creative and positive ways. If we can imagine ways to channel that power responsibly, then together we can save the world.


Callings

Hugh Jr. and Hugh Sr. outside Gillette Stadium, one of the last photos taken of my dad

 

“You missed your calling.”

Fathers say perplexing things sometimes, and Father’s Day seems like as good a day as any to remember them. In a solemn moment, my father offered the above observation several years ago, and it haunts me the way an echo haunts a canyon—mainly because I’d been turning the same notion over and over in my mind for many years before that.

Some people are born lucky and have but one calling that speaks to them loudly and clearly all their lives. When I see these people—and they show up with a refreshing frequency on reality shows such as “American Idol” and “So You Think You Can Dance,” the latter being one of my absolute favorite television experiences of all time—I feel stirrings of both kinship and envy. I admire their devotion and dedication to their talents and consider the depth of my own past commitments to the written word. At the same time, I am reminded of how often my loyalty to literature has slipped and faltered. There too, echoes from the past resound.

I grew up in an era when “doctor” or “lawyer” were the two top attainable career goals for one’s children. Sure, “president” got mentioned fairly frequently, but I could always detect a catch in my parents’ voices when they said it. There was realism and practicality—common sense, my father frequently extolled—and then there was the dreamer’s realm of fantasy—those “pie in the sky” aspirations that might tempt us for a while but would ultimately bring us up short. As anyone who grew up playing the game of “Life” knew: you drew the largest paychecks from “doctor” and “lawyer,” and “president” wasn’t a viable option on the game board.

Alas, from an early age, I chose another option that wasn’t on the board: the dreamer’s realm of fiction writing.

When I watch the interviews on talent shows, two types of participants often reduce me to tears: contestants whose family members so fully support their goals that they’ll pack up the minivan and drive across the country together for the auditions, and contestants whose families have abandoned them and left them to follow their artistic aspirations alone (and let’s be honest, when that aspiration is something like dance, there’s more than a little homophobia at play in many of the reactions).

“You’ll never make a decent living at that,” my father warned me when I first mentioned my plans to become a fiction writer. “Do you really think someone would want to publish something like that?” he asked after reading one of my short stories. “I’d be ashamed of myself,” he added—his subjective take on the choice of a front-and-center gay narrator, perhaps.

It was the last time I would show him any of my work. Years later, after that story had been published along with several others, he summed up his feelings by unknowingly paraphrasing the final line of a James Wright poem that has haunted many an aspiring writer:

 

I lean back, as the evening darkens and comes on.

A chicken hawk floats over, looking for home.

I have wasted my life.

 (from “Lying in a Hammock at William Duffy’s Farm in Pine Island, Minnesota”)

 

My father may have been drunk when he informed me that I had wasted my life; after all, he rarely called me when he was sober. Even so, his feelings were made quite clear when he demanded that I forgo my own graduate-school commencement ceremony in order to come home and attend my sister’s graduation from medical school. At the party following that event, my father mentioned matter-of-factly to a friend that I, too, had just graduated from school, the University of Idaho. The only problem was that I had just received—in absentia, of course—my MFA degree from the University of Iowa, the premier writing program in the country.

All water under the bridge, I told myself as I headed back to my hometown a year ago to spend extended periods of time with my father in his final months. During those stays, the inevitable questions arose: “Why didn’t you become a doctor?” “Did you ever think of becoming a lawyer?”

I answered them dutifully and honestly, in order: “I can’t stand the sight of blood,” and “Yes, and I haven’t ruled it out entirely just yet.”

These questions about medicine and law were, in their own way, high compliments. He acknowledged that I had done well enough in school to pursue and excel in either one. By then, however, my father understood that neither career option reflected my true calling. He still wasn’t completely sold on the idea of writing, and I confessed that there were far too many days when I wasn’t, either.

Even so, my father had one request to make. He asked if I would write something exclusively for him: his eulogy.

My father did not ask me to do this because of some deathbed epiphany that his son was, indeed, a writer. He was harking back to his final words in that conversation from years earlier: “You missed your calling.”

At the time, we had been discussing matters of the soul and spirit, and my father was suddenly filled with the belief that I should have gone into the ministry. “That’s your true calling,” he said. “You should be writing sermons instead of short stories.”

Now, nearly a year after his death, I reply: “Dad, at their best, they are one and the same, just like us.”

Book Value

When I was young, I borrowed most of the books I read from the local library. My public high school provided dog-eared copies of the classics in my college-prep literature classes. To further satisfy my growing reading appetite, I would ride my bicycle downtown to a local hardware store that kept a few ramshackle shelves of coverless remaindered paperbacks near the front door. For forty cents each (or three for a buck), I could pedal home with a combination of Mark Twain’s Adventures of Huckleberry Finn, Ray Bradbury’s Something Wicked This Way Comes, and the not-quite-latest collection of greatest Mad magazine gags (stupid answers to stupid questions, anyone?).

To this day, I continue to haunt second-hand bookshops, yard sales, and the remainder bins at big-box stores in search of bargains. At the end of the semiannual book sale at our local library, I came home with two brown paper shopping bags full of wonderful finds, including the collected poems of Stanley Kunitz, Anne Lamott’s Traveling Mercies, Chang-Rae Lee’s The Surrendered, Tab Hunter’s Confidential, and a couple of titles by Bill O’Reilly (stupid answers to stupid questions, anyone?). I am not exaggerating when I tell you that the bookcases in which those books now reside cost substantially more than the books themselves.

In each one of the instances noted above, my reading experience provided absolutely no compensation to the authors of the books themselves.

This is why I wince a little every time fellow lovers of literature rhapsodize about the beauty and allure of used bookshops. I cringe a bit when poets, novelists, and journalists extol the value and necessity of public libraries. I wince and cringe with the heart of a co-conspirator, for I, too, value and support these institutions. And yet by patronizing both of them, readers rob the arts to a greater extent than they do by purchasing from the Amazons and Barnes & Nobles of the world. Access to free, remaindered, or used books may boost readership, but each one deprives writers of compensation for their work. Seen another way, such books literally lower the value of the written word.

I learned at an early age that writers should expect little, if anything, in the way of recognition or remuneration for their efforts. Mostly one should expect rejection—either in the form of “thank you, but not for us” letters from potential publishers or “nice try, but not for me” comments from readers. Many literary writers still subscribe to the somewhat romantic “starving artist” archetype, envisioning themselves in tiny, unheated lofts subsisting entirely on grilled cheese sandwiches, flat soda, and watered-down tomato soup as they churn out page after page of unappreciated brilliance.

One might expect that things would have improved over the past few decades, but they haven’t, at least not in the publishing industry. In fact, the situation has become even more psychologically unhealthy. These days, many magazines and journals require hopeful writers to pay a “reading fee” for the opportunity to be considered for publication. In a number of instances, specious contests have replaced the normal submission process. The literary world has come to resemble a lottery in more ways than one. What was once called the “table of contents” in some books and journals might now be better referred to as a “list of winners.”

Editors and publishers will tell you that reading fees and contests are necessary evils in order to meet their financial demands. After all, there are bills to pay: editorial staff, office assistants, designers, printers, distributors, and so on. On top of that, one must pay for electricity, water, heat, and telephone.

Now consider that for many literary journals, the person who provides the actual content, the writer, is often paid very little, if at all. Despite having potentially invested in college tuition, workshop or residency costs, and all those reading/entry fees, publication rarely means “hitting the jackpot.” A poem that took fifteen to twenty hours to write might net ten dollars. A historical piece that required months of research, writing, and revising might reward you with three or four copies of the obscure journal that accepted it.

With that in mind, you might wonder why anyone in his or her right mind would pursue a literary career in the first place. Perhaps this explains why so many writers place their full and earnest faith in the metaphorical value of writing. In this scenario, writers are akin to mystics who channel universal truths onto the page. This probably explains why some writers’ conferences can seem like cults to non-bookish outsiders—and why some attendees can come across as desperate martyrs. I’m reminded of a lighthearted quip a fellow writer once told me on sharing his latest piece: “I suffer for my art; now you can suffer, too.”

I worry that this kind of attitude—this self-indulgent elevation of “literature”—further devalues the entire creative enterprise. It fosters a disconnect between the writer on high and the lowly reader, which in turn leads to dwindling book sales and meager showings at readings. Many of us who remain committed to the written word nonetheless felt a twinge of recognition and understanding when someone posted on Facebook: “April is National Poetry Month, that time of year when we should all force ourselves to attend dozens of readings and pretend to be enjoying ourselves.”

So you see, in some ways I’m still a terrible supporter of the arts. I read the latest articles about the financial struggles in the publishing world and can’t escape the sinking feeling that, in some ways, we brought this on ourselves. For decades, writers have accepted disproportionate compensation for their work without much complaint. As readers, we became more and more accustomed to paying little—if anything—for the works we’ve consumed. And in a bizarre twist of hypocrisy, we voiced our strongest support for those institutions that hurt writers the most. With that in mind, the crumbling nature of the publishing world today shouldn’t surprise anyone.

Still, I hold onto a shred of optimism about the future. Here are a few final thoughts and suggestions, and I welcome others:

  • Don’t be deluded—and I mean that literally, not figuratively. Stop buying into harmful myths like the “starving artist” I mentioned earlier. Demand that writers be paid for their work, and if it has to be a small amount, at least ensure that it’s not an insulting or demeaning amount. If a magazine or journal can come up with a business plan that covers the costs of printing and production, it should also account for the value of the content itself—the author’s compensation.
  • Do what you can to help publishers meet these increased financial challenges by subscribing to those magazines and journals whose work you respect and admire. (I know, I know: Once again, it’s up to the writers to cough up more money in the hopes of supporting their own potential paychecks. Maybe you can get friends and family members to subscribe as well, or convince some rich benefactor to bequeath millions to the journal in question.)
  • Support your local library not only by checking out books but by getting out your checkbook. That way they can purchase more books and journal subscriptions.
  • Realize that free public readings aren’t really “free,” and support both the presenter and the host bookstore by purchasing a book or two.
  • Enjoy your bargain book from the remainder bin, but support the writer with a full-price purchase of another title if you liked his or her work.
  • Most of all, value what you do, whether you read or write. If literature does indeed have worth, then we should all be ready and willing to pay for it.

“The Normal Worldview”

Print by Catherine Rondthaler. More of her stunning work can be seen at http://catherinerondthaler.com/_MG_6169.html.

This week, two separate news stories caught my attention along with the rest of the country’s: The retraction by “This American Life” of Mike Daisey’s fabricated account of Chinese laborers making Apple products, and the conviction of Rutgers student Dharun Ravi in relation to the suicide of his gay roommate, Tyler Clementi. Both stories, it seems, ignited quick responses in those who heard/read/saw them, and yet both stories have deep, deep backstories that complicate them well beyond the comfort zones of most casual media consumers. Each has the capacity to be told as an epic novel, and yet many people have formed their opinions—some quite strong and unshakable—based on the equivalent of a short story or a Wikipedia synopsis.

(At the risk of hypocrisy, I feel obliged to provide a brief overview here for those who are unfamiliar with the cases. If you are familiar with them, my apologies; sorry to interrupt; skip ahead to the next paragraph, please. Links also appear at the bottom of this entry for those seeking more information. 1) Mike Daisey, a theatrical monologuist, had appeared on Ira Glass’s public-radio show “This American Life” to offer his supposedly first-hand accounts of the terrible working conditions at a massive Chinese factory that produced Apple’s iPad. After the show received its strongest listenership to date, the producers discovered that many details in Daisey’s account were false, at which time they retracted the story. They devoted a follow-up episode to the incident, during which Daisey offered an awkward defense of his position. 2) In the Rutgers case, Ravi was convicted of bias-motivated invasion of privacy—a hate crime, in other words—after using a Webcam to spy on his roommate, Clementi, during a romantic encounter and sharing some of the images and his thoughts with friends. Clementi later jumped from the George Washington Bridge, though his reasons remain either unknown or undisclosed. The suicide note he left behind has not, as of this writing, been released. Subsequent accounts of the story’s details were altered in order to portray Ravi as a homophobic bully who served as the catalyst for Clementi’s suicide.)

Without question, these two stories are charged. They touch nerves, raise passions, and spark debate. When you look closely, there is no easy way to “read” either one, and yet their appeal seems to stem from just that: quick and often preconceived judgments. The stories have been appropriated and exploited in many ways for a number of agendas: corporate negligence, workers’ rights, cyber-bullying, gay harassment. In some ways, they have been revised and rewritten to fit theme-based narratives, and somewhere along the way, for whatever reasons, the facts gave way to fiction.

At the same time, two essays about writing recently appeared in The New York Times: Annie Murphy Paul’s “Your Brain on Fiction” and Jhumpa Lahiri’s “My Life’s Sentences” (links below). The first draws its inspiration from recent studies of the human brain and how it responds to figurative language (not necessarily fiction, despite the emphasis in the title); the second is a more rhapsodic account of the writing process by a Pulitzer Prize-winning author. The first seeks insight and evidence through scientific research; the second finds it in mystical experience relayed through a series of often contradictory metaphors (“Sentences are the bricks as well as the mortar, the motor as well as the fuel. They are the cells, the individual stitches. Their nature is at once solitary and social.”). Both tend toward an elevation of creative writing as something beyond the realm of the ordinary or commonplace, a glorification that was subsequently approved and forwarded by my many literary Facebook friends. Both were also listed among the top ten e-mailed articles on the day of and the day after their publication.

To provide some context, I’ve been thinking about these four examples (Daisey/Ravi/Paul/Lahiri) as I read through hundreds of fiction and nonfiction manuscripts submitted as applications to the Bread Loaf Writers’ Conference, a task I’ve undertaken for well over a decade now. Much of this writing comes from published authors; some are current teachers of creative writing at prestigious colleges. There is always an implicit tension between the fiction and nonfiction camps that can best be summed up in two competing bits of wisdom. The first, penned by the poet Lord Byron in Don Juan, states, “’Tis strange—but true; for truth is always strange; Stranger than fiction.” The second, included in a moment of critical reflection within Dostoevsky’s novel The Idiot, muses, “Authors, as a rule, attempt to select and portray types rarely met with in their entirety, but these types are nevertheless more real than real life itself.”

The two editorial pieces in The New York Times would seem to support these statements. I have also heard both statements used in reference to accounts of the Daisey and Rutgers stories. In both instances, political themes apparently trumped a proper respect for the facts of the matter.

Truth can be stranger than fiction. Fiction can be more real than real life itself. These two statements create something of a logical conundrum. Can truth be stranger than truth itself? No wonder the satirical television talk-show host Stephen Colbert felt compelled to popularize the concept of “truthiness.” We should remember, however, that each of these thoughts and ideas arose within the creative realm. They are the musings of fictionalized characters created by a poet, a novelist, and a comedian. Even though many might consider them to be simple aphorisms and adages, they are often treated as gospel within the literary worldview.

The uncomfortable thing about aphorisms and adages is that society tends to be two-faced about them. We favor one when it suits us, then favor the opposing view when it seems more appropriate—or, in this context, “truer.”

Consider the following: Absence makes the heart grow fonder. Ah yes, true. But then: Out of sight, out of mind. So true. A stitch in time saves nine./Haste makes waste. Actions speak louder than words./The pen is mightier than the sword. You get the idea. You see little sayings like this posted all the time on Web sites and Facebook updates, and more often than not, a chorus response of “So true!” will follow.

A reliance on simple sayings like these—as well as simplistic sayings disguised as profound wisdom, of which there are many more examples—tends to create a rather tenuous and fragile worldview. That’s why it shocks us so deeply when someone like Mike Daisey comes along and throws rocks inside the glass house. (Sorry for that pun. I couldn’t resist.) Daisey offered up a fiction as fact, and people wanted to believe him so badly that they willingly suspended their disbelief. (Yes, that “willing suspension of disbelief for the moment, which constitutes poetic faith,” as Samuel Taylor Coleridge once wrote. More literary gospel.) Ira Glass even admitted that because the details of Daisey’s story seemed so true, the fact-checkers gave him a pass when they couldn’t corroborate his claims with the interpreter/character mentioned in his monologue (whose name he had lied about and whom he claimed could not be reached, when in fact she was easily contacted by another journalist).

Likewise, a piece of investigative journalism by Ian Parker in The New Yorker (link below) calls into question any simplistic understanding of the Ravi/Clementi case. As a piece of “literary” writing (which I’ll simplify here to mean writing that includes effective characters, setting, plot, and theme), the article surpassed nearly all of the fiction I have read in recent years. It stood as an example of why, over the past decade, my own “poetic faith” in literature has been eroding and why I have been unwilling to suspend my disbelief in most (though assuredly not all) fictional contexts.

I have, in essence, become a literary skeptic. Inasmuch as our artistic culture constantly blends the two (consider the current use of the term “creative nonfiction”), I teeter on a tightrope when I read. I look to the author to provide a steady, unswerving wire along with a pole for balance, and when I feel that I have these things, I am able to walk between the towers of fact and fiction, a feat accomplished to great effect in Colum McCann’s extraordinary novel Let the Great World Spin. (Just read the opening pages and see for yourself.) Mostly, perhaps, I look to the author for honesty and authority, two traits that don’t always play well with one another.

In searching for the most powerful voice—for a “totality” of fabricated details when the truth was more than sufficient to drive his point home—the author/writer Mike Daisey created a narrator/character, also named Mike Daisey, who twisted the facts. Then, like many contemporary authors before him, he dared us, the audience, to tell the two apart. The only problem here was that he did not tell his audience that he was playing a game of “Truth or Dare.” In fact, he never hinted that we were playing a game at all.

As the adages above demonstrated, you can’t have it both ways. Attempts to do so only demean both sides of the equation: the essential integrity of journalism (here comes more fodder for the “lamestream media” claim) and the transformative power of creative literature (here comes more fodder for the “academic elitists” claim). No doubt the dialogue between the two camps (literature and journalism, art and reality, fact and fiction) can and should continue, but it needs to be honest. It should welcome scrutiny rather than resist critical efforts. In that way, we can preserve spaces for both the brain-based analysis of figurative language as well as the mystical creation claims recently published in The New York Times. Otherwise, they risk coming across as mere parlor tricks promoted by charlatans.

Below is an excerpt from the transcript of the retraction episode of “This American Life” (link also below). I don’t include it as a way of providing closure, however. This discussion needs to continue beyond its most recent manifestations, just as it continued beyond the now historical cases of James Frey’s “creative memoir,” A Million Little Pieces, and the falsified news reports of another Glass, Stephen, in the 1990s. For me, however, the takeaway from the excerpt below is the title of this blog entry itself, “The Normal Worldview.” What great fodder for a million more little blogs…

 Ira Glass: Like, you make a nice show, people are moved by it, I was moved by it, and if it were labeled honestly, I think everybody would react differently to it.

Mike Daisey: I don’t think that label covers the totality of what it is.

Ira Glass: That label – fiction?

Mike Daisey: Yeah. We have different worldviews on some of these things. I agree with you truth is really important.

Ira Glass: I know, but I feel like I have the normal worldview. The normal worldview is when somebody stands on stage and says ‘this happened to me,’ I think it happened to them, unless it’s clearly labeled as ‘here’s a work of fiction.’

RELATED LINKS:

“This American Life” original episode (transcript):

http://www.thisamericanlife.org/radio-archives/episode/454/transcript

“This American Life” retraction episode:

http://www.thisamericanlife.org/radio-archives/episode/460/retraction

News article about the Dharun Ravi conviction:

http://www.nytimes.com/2012/03/17/nyregion/defendant-guilty-in-rutgers-case.html


“The Story of a Suicide” (Parker):

http://www.newyorker.com/reporting/2012/02/06/120206fa_fact_parker

“Your Brain on Fiction” (Paul):

http://www.nytimes.com/2012/03/18/opinion/sunday/the-neuroscience-of-your-brain-on-fiction.html?src=me&ref=general

“My Life’s Sentences” (Lahiri):

http://opinionator.blogs.nytimes.com/2012/03/17/my-lifes-sentences/?src=me&ref=general

Liar, Liar, Pants on Fire, Hang Them Up on a Telephone Wire

"2 Men Looking On" from the "Perched" series by artist Jodi Chamberlain. Her multimedia work can be found at http://www.jodichamberlain.com (just click on the painting). Copyright by Jodi Chamberlain; used with permission of the artist.

By way of an introduction, I did some Internet research on the origin of the expression “Liar, Liar, Pants on Fire, hang them up on a telephone wire” and found this reference to an 1810 poem by the English writer William Blake:

 The Liar

Deceiver, dissembler
Your trousers are alight
From what pole or gallows
Shall they dangle in the night?
 
When I asked of your career
Why did you have to kick my rear
With that stinking lie of thine
Proclaiming that you owned a mine?
 
When you asked to borrow my stallion
To visit a nearby-moored galleon
How could I ever know that you
Intended only to turn him into glue?
 
What red devil of mendacity
Grips your soul with such tenacity?
Will one you cruelly shower with lies
Put a pistol ball between your eyes?
 
What infernal serpent
Has lent you his forked tongue?
From what pit of foul deceit
Are all these whoppers sprung?
 
Deceiver, dissembler
Your trousers are alight
From what pole or gallows
Do they dangle in the night?

If you do your own Internet research (and of course you do; you’re probably verifying this on Google right now), you’ll find numerous references to this Blake poem, some of them in fairly reliable places. There’s only one small problem: William Blake didn’t write it.

Additional research has yet to turn up the actual author of the poem or its place and date of origin. I did find, however, that the word “alleged” had at one time been attached to the claims of Blake’s authorship. Some of those who later reposted the poem and its origin myth carefully edited out the word “alleged,” perhaps to boost their own authority. I’ll leave it to the Blake scholars to hash out whether this could possibly be an obscure verse from the poet who gave us “Tyger tyger, burning bright”—which, when you say it out loud, does share some strong poetic similarities with “Liar, liar, pants on fire” after all.

• • •

I am constantly amazed and dismayed by the number of people who both post and spread unproven statements and “facts” on the Internet. Quite a few of them are close friends, many of them remarkably intelligent. Even so, they often clamor to be the first ones to expose some new controversy or conspiracy via Facebook or Twitter. Maybe it’s the journalist in all of us, desperate for a scoop, but if there’s one thing I learned while staying with my ailing father and watching endless episodes of Judge Judy, Judge Jeanine, and Judge Joe Brown, it’s that hearsay is irrelevant in a court of law.

Let’s stop a moment to take a look at that word, hearsay. We hear something, then we say it. From ear to mouth in an almost direct line that bypasses the brain. It’s a key ingredient of soap operas and celebrity gossip, yet it also sadly infects the serious discussions of the day’s major issues. Perhaps in this technological age we need new words for hearsay, such as readshare or seetweet. “I read those statistics on someone’s Facebook wall, so I immediately ‘shared’ them by reposting on my wall.” “I saw a sports commentator on television talking about that big trade that might happen, and so I tweeted that it actually had because he sounded so convincing.”

Another classic example is the Reverend Martin Luther King, Jr. quote that went viral immediately after the killing of Osama bin Laden. Here’s the sentence in question: “I mourn the loss of thousands of precious lives, but I will not rejoice in the death of one, not even an enemy.” I have no problem with the quote itself, but King never said it. All credit is due to Jessica Dovey, who prefaced an actual King quote with that sentence. In its original form, she even set her own thought outside of the actual quotation, clearly distinguishing her words from his. (The teacher in me can’t help but point this out: Punctuation matters!) As the quote ricocheted around the Internet, however, the quotation marks got knocked off and the two speakers’ sentiments became one. Since the Internet never forgets, there are now thousands—if not millions—of lingering misattributions.

After this incident, I began to wonder just how many of those inspirational/motivational “posters” that people share contain actual quotes from the pictured sources. With today’s technology, anyone with a basic knowledge of photo editing can attribute just about anything to anyone and fool millions in the process. Just look at all those often-shared fake Ryan Gosling quotes. It’s all in good fun…until someone starts attributing hateful or hurtful language to him. (Chances are probably pretty high that someone already has.) Then what started as parody or satire slides uncomfortably across a blurry line into libel and slander.

I know I come across as a killjoy here, but on matters like this, I reveal my true colors as a classic skeptic. I reserve judgment until I can ascertain the facts of the matter, and I remain open to the possibility that objective reasoning may lend credence and support to multiple sides of an argument, even those that might contradict my own previously held beliefs. Like William Blake and many other artists of the Romantic era, I seek a direct link to the primary source—though, devout skeptic that I am, I question his subsequent claims to a mystical intimacy with a supremely divine being. Perhaps his position in literary history—at the crossroads of the Enlightenment (or Age of Reason) and the Romantic era (or Age of Emotion)—describes my own inner identity crisis as a writer. I would also argue that it describes the manner in which America currently teeters as a political force in the world, and our casual (not causal) relationship with the truth seems to be a reliable measure of that tension.

As an educational writer, I am often governed by a remarkably strict and lengthy set of mostly objective state standards and guidelines, even as I am asked to write fairly sentimental pieces, such as articles about developmentally challenged children overcoming adversity. Age of Reason, meet Age of Emotion. For a recent series of writing assignments, I was required to provide two credible and reliable sources for each sentence (yes, each sentence) in nonfiction articles. When and where possible, I was to cite the primary source—in other words, an eyewitness account or an original text, not some second-hand or third-hand citation or discussion. After all, elementary school standards (i.e. grades one through six) stress the significance and importance of primary sources in research. This is not “elitist, ivory tower” graduate-school-level academia stuff.

In my own college-level classrooms, I likewise impressed upon students the necessity of primary sources in validating claims, especially as we grappled with sensitive and controversial issues such as abortion, capital punishment, and religious freedom. By extension, I reminded them that primary sources were crucial in evaluating the claims made by advertisers and politicians, both of whom would often seek to twist the truth to suit their own agendas. By further extension—and hoping to avoid turning those young people from skeptics into cynics—I reminded them that they needed to evaluate the primary source itself and take into account any conflicts of interest. After all, the makers of Brand X are likely to promote studies that show that their product is more effective than Brand Z, especially when those studies have been planned and paid for by the makers of Brand X.

I had hoped that this kind of education, common in most schools and colleges, would serve the students well in their adult lives and help them to make objective, reasonable, and justifiable decisions on all matters great and small, from the election of presidents to the selection of a new toothpaste. (Is it more important to remove plaque, avoid gingivitis, eliminate bad breath, or have whiter teeth? Honestly, a skeptic can spend hours analyzing the endless varieties of toothpastes offered by each brand these days.) I wanted students to realize the great potential and power they had to sort through all of the distractions and clamor around them in order to get at the truth, even if that earned truth challenged their preconceptions and ideological dispositions. In other words, I wanted them to be open to new information and perspectives, even if this led them to change their minds. After all, the human capacity for change and adaptation is a key aspect of our forward-moving evolution. Yes, I said it— and if you don’t believe in change or evolution, enjoy the view from your rut.

In the end, however, it appears that preconceptions and ideological dispositions still get the best of most of us. On social networks, people on the left post and spread unverified stories and statistics that seem to prove their political points; people on the right post and spread unverified stories and statistics that seem to prove their political points. Some of these are easily proven to be hyperboles, misstatements, or outright lies; others are carefully twisted and redirected versions of the truth (the process we all know as “spinning”). In nearly all instances, however, the posters (and re-posters) risk looking naïve, misinformed, ignorant, or just plain stupid when the claims are analyzed and the facts made clear.

That risk—and the willingness of so many people to accept it for the most trivial of reasons—fascinates me. Checking the validity of some of these claims and stories takes less than a minute, especially if you’re clued in to sites such as snopes.com and politifact.com. (Disclaimer: These sites are good places to check first, but on their own they are not definitive arbiters of right and wrong. They can, however, provide valuable leads in a search for those coveted primary sources I mentioned earlier.) Despite that, there is still the temptation to hit that “share” button, which takes but a split second and provides that little hit of adrenaline that information junkies seem to crave. After all, if they wait too long, someone else might scoop them and ruin their status as the first (if not primary) source of breaking news. Take that, Fox News and CNN!

(Topical side note: The death of Whitney Houston on February 11 was a prime example of this. Even before reports had been confirmed, dozens of people raced to post the story as fact on their Twitter and Facebook feeds. Perhaps they all wished to be, if I may twist the New York Times’s catchphrase slightly, the “tweeters of record.”)

In many instances, perhaps, the risk of being wrong is offset by a firm belief that we are right. Even if the statement is later shown to be false, the underlying “truthiness” of it remains. (Thank you, Stephen Colbert, for having introduced that word into the public lexicon. We clearly needed it.) To explain this more plainly, the truth doesn’t need to actually be true as long as it truthfully supports what we truly believe to be true. This is why politicians can make wild claims in our legislative chambers (such as the “fact” that Planned Parenthood’s primary mission is to provide abortions) and later claim “no harm no foul” because the easily disproven “fact” was simply alluding to a basic “truth” anyway. (If I had to allow such reasoning in my classes…well, I couldn’t and wouldn’t. It undermines the entire premise of education, not to mention making a travesty of logic and reason. Civilized people simply don’t entertain this kind of idiocy.)

When it comes to believing the unbelievable (or giving credence to the incredible), it’s not just the lies that people tell that intrigue me—it’s also the lies that they hear and subsequently believe or let pass unchallenged. This takes us back to the crux of the reposting problem: people believe and spread untruths because, in some strange way, they WANT the untruths to be true somehow. To such believers, testing the claim would be evidence of weakness or a lack of confidence: If it feels or sounds right to you, it probably is—no further research necessary. In other words, if you’re a cynic, pointing out hypocrisy justifies your cynicism. If you’re paranoid, spreading conspiracy theories justifies your paranoia. If you’re (liberal/conservative), propagating the latest (conservative/liberal) scandal gives you that higher moral standing that you just know your side deserves. If you hate Company A or Team B, then that damning news about Company A or Team B simply MUST be true.

So let’s bring this full circle. Over the years, I’ve come to believe that you can learn a whole lot about a person from the lies he or she tells. Further, you can tell even more about that person from the lies he or she is eager to believe, especially at the risk of reputation and social standing. The more entrenched the belief, the less likely the person is to respond to reason. As a result, dissonance becomes an unavoidable consequence, and with that dissonance comes discomfort, anger, and sometimes violence. In all the hullabaloo, reason and logic are often the first casualties.

Perhaps a first-hand case study would help to illustrate the point. At a town-hall-style meeting a couple of years ago, Senator Bernie Sanders was taking questions from the crowd. One man, most likely a member of the Tea Party, asked him to explain why the government keeps raising his taxes. Sanders pointed out to him that President Obama had just signed into law tax cuts for the majority of wage-earners, but this man was insistent: His taxes had gone up. Sanders tried to reason with him, pointing out that he appeared to fit the demographic of people who received a tax cut, but no, there was no reasoning with the man. His taxes had gone UP. The discussion, as Sanders subsequently remarked in his typically brusque style, had reached an impasse and could not continue in that forum.

Had the man’s taxes gone up? Probably not. But here’s a pet theory I have about all of this. Had the amount of money being taken out of his weekly or monthly paycheck gone up? That seems likely, especially since many companies have been increasing the employee co-share of medical benefits as health insurance costs continue to rise. The disgruntled man might easily have lumped all of these paycheck deductions under the simple heading of “taxes,” giving him evidence, however misinterpreted, for his claim and belief. And what legislation had been introduced to try to deal with the problem of rising medical insurance costs? Why, the very legislation that members of the Tea Party are dead-set against.

The man at the town hall meeting was angry, and he wanted that anger to be heard and validated. When the facts didn’t serve that purpose, he was faced with a dilemma. He could change his mind, or he could stand his ground.

His final decision: ignore the facts. Therein lies the root of the word ignorance: someone who can be shown the truth yet insists on the lie. Someone who can be told that something is fiction and yet still believes that it’s fact. In this respect, ignorance is a worse trait than naïveté or stupidity.

Just consider the synonyms for the feeble-minded: words like dull and dim. Ignorant people are hardly dull or dim. They carry the full force of their beliefs behind them. Their faith in one unwavering, indisputable view of the world can be both violent and catastrophic. After all, when one person runs through the village with his or her pants on fire, the flames can spread to every surrounding building. That makes the practice an emergency worthy of public concern.

Luckily, we are all able firefighters. With all this “Information Age” technology around us, the truth is closer to each and every person than it ever has been in the history of mankind. And yet, the most powerful tools can often be used both to create and to destroy. A hammer can help build a hope chest for our dearest possessions, but it can also break a window or kill a man. One post can spark the overthrow of dictators and the rise of democracy, but it can also ruin careers and destroy a country’s economy.

I’ll end this section with a paraphrase of Tim Gunn from the reality show “Project Runway”: “Use the wall thoughtfully.” He was talking about shelves of fashion accessories, but he might just as well have been talking about Facebook. Whatever we do, we should do it thoughtfully, with care and consideration. To do otherwise would be, quite simply, uncivilized.

(NOTE: There is a second and perhaps even third part to this post already underway…)

American Anger, Part One

Preface: This is Part One of what I hope will be an ongoing, potentially year-long exploration of this subject. The topic seems well-suited to the “blog” format, serving more as a catalyst for conversation than as a definitive treatise. I look forward to continuing the conversation in hopes of reaching some constructive insights, conclusions, and potential remedies.

As you’ll no doubt quickly note, my take on American anger is a rather personal approach; your choices for taking on the topic will no doubt differ. Despite that, I’ll be using terms like “Americans” and the first-person-plural pronoun “we” rather liberally throughout the entries. I do this merely as shorthand, fully aware that it’s literary sleight of hand, both a contrivance and a conceit. I don’t intend to suggest that there are absolute universal truths here, especially since the insistence on universal absolutes in society tends to generate the very anger I’ll be analyzing.

As always, thanks for reading, and even more thanks to those whose responses provoke or inspire further insight.

 1. Use Your Words

American anger fascinates me.

Here we are, billing ourselves as the “best, greatest, richest, most powerful” nation in the world, and yet people all over the country claim to be angry. Watching the growth of the Tea Party movement in 2010 was like watching the now-famous scene in Sidney Lumet’s 1976 film “Network” in which unstable talk-show host Howard Beale inspires his viewers to lift up window sashes across the country and shout out into the night: “I’m as mad as hell, and I’m not going to take this anymore!” Everyone was mad as hell for different reasons, but there was a feeling that bringing all that rage together into one unifying cry might make it either coherent or effective. (Spoiler alert: it didn’t.) In many ways, it echoed a couple of the poet Walt Whitman’s famous lines from “Song of Myself”:

            I, too, am not a bit tamed, I too am untranslatable,

            I sound my barbaric YAWP over the roofs of the world.

It was not a specific word or words that Whitman called out into the night; it was not an intelligible phrase or clause. It was a sound, an utterance, savage and undomesticated, more animal than human. In a way, Whitman was suggesting, people had been making those sounds for years and would continue for many more, well beyond his own eventual death. We might never come to know who he was or what he meant, but discussion about it “shall be good health to you nevertheless.”

In this election year, 2012, we are hearing quite a few YAWPS across the political landscape, some less tamed and translatable than others.

In addition to all the contemporary social and political dissent, there is perhaps an even more powerful undercurrent of dissonance—the lack of a rational link between one’s beliefs and one’s reality, however either one is perceived. It’s the feeling we get when we pay top dollar for something only to find that it’s cheaply made or ineffectual. We vote for a candidate based on his or her promises only to find those promises later ignored. (To provide some continuity between this blog and an earlier entry on football’s “Tebow Time” phenomenon, dissonance was that sickening feeling the hyper-religious quarterback’s more fanatic fans experienced when the Denver Broncos were humiliated by the New England Patriots in a recent playoff game. For the sake of divisional fairness, it was also the sickening feeling the Green Bay Cheeseheads felt when Aaron Rodgers and the nearly-perfect Packers succumbed to the New York Giants the very next day.)

I’ll be talking much more about dissonance and its relation to anger later on, but it’s worth mentioning here just to keep the idea in mind as the discussion of anger progresses.

As Americans, we see anger glorified throughout our culture, from movies to music, sports to politics. Despite our supposed Judeo-Christian foundation, we have movements in the country that promote violence and greed over diplomacy and charity. As our young people’s generation comes to define itself (or, to put it in the passive voice, lets itself be defined by others) as “ironic,” it also grows indifferent to irony’s cousin, hypocrisy. Sarcasm provides an easy segue from skepticism to cynicism, providing many a political pundit on both ends of the political spectrum with the equivalent of sniper’s bullets.

When anger wears us down into a numbed state of depression, anger’s inward-turned doppelganger, we shrug our shoulders and try to focus our attention elsewhere. For some, this may translate into another glass of wine, another dose of Xanax, another marathon session watching the Real Housewives of Whatever County spit their venomous barbs at one another. Other folks may start in on the next level of “Angry Birds,” one of the highest-grossing games in our country. Or perhaps you want to take a virtual trip around the world—killing people and blowing things up along the way—in America’s top game of the Christmas season, Call of Duty: Modern Warfare 3. What a wonderful gift to commemorate the birth of the Prince of Peace. (See how easily the sarcasm comes?)

Many players of these games claim that such pastimes are cathartic—that they help “release tension” and “blow off steam” at the end of a stressful day. If that were truly the case, violent movies and first-person shooter games would leave viewers and players in states of blissful repose. Instead, they ramp up the emotions and boost the adrenaline. (Full disclosure: I play an occasional hour or two of “World of Warcraft” myself at the end of a busy day, so I know that to be successful as a warrior, you need to “generate rage.” It’s right there in the game manual.)

So maybe the term cathartic is a canard when we choose violence-based entertainment as a relief or release of our internal anger and frustration. I’d argue that the proper word is indulgent. Pressing further, I’d express concern that a more appropriate adjective might be catalytic. America seems to like things super-sized and hyper-accelerated, so it’s no surprise that when it comes to anger, amplification isn’t just acceptable; it’s preferable.

An admission: cathartic, indulgent, and catalytic are big words. I’m a writer, so I sometimes use big words. That’s because language, like anger, fascinates me. They’re both acts of expression that have rich, sometimes hidden, roots and origins. Example: I wrote a poem about one such instance, the word decimate. Many people think it means “to destroy completely and indiscriminately.” In fact, the word is based on the Latin root for the number ten and originally meant a methodical act of slaughter in which exactly one victim in ten was killed. (Ironic, eh?) The meanings of words may evolve over time, but the origins of their species are there for all to comprehend and appreciate.

But I digress. Let’s return to the notion of anger as a cathartic force and set forth a little thought experiment. Imagine that you’re a parent dealing with a red-faced child whose inexplicable rage has sent cereal, milk, and orange juice flying across the kitchen. To calm the child, would you—

  a. put on some soothing, New Age music and send the child into the corner for a five-minute “time out” period of self-reflection?
  b. tell the kid to march off to his/her room and go the f*ck to sleep?
  c. tell the child to imagine having an automatic weapon in his/her hands during a stressful, high-stakes combat mission whose outcome will determine the fate of all mankind?
  d. ask the child, “Why are you so angry?”

Now imagine America as a red-faced child.

Modern child-rearing gurus recommend option d. Many advise parents to respond to their children’s extreme behaviors with the expression “Use your words.” This doubles as both an encouragement of self-expression and a redirection of energy. It’s a graceful dance step that moves the child away from visceral reaction toward more cerebral creation. Emotions, meet intellect. Intellect, say hello to emotions.

To some, however, “use your words” is just so much poppycock. To quote the blogger MetroDad, a rather literate and opinionated New Yorker: “I think it’s a bullshit mantra that only helps raise the next generation of pussies.” Like it or not, that’s using your words.

In some ways, “use your words” promotes a form of therapy. It seeks to replace the outburst with what we might call the “inburst,” a breaking-and-entering of the psyche in order to see what secrets are hidden in the closets or nailed beneath the floorboards. We ask a child “what’s really bothering you?” with an expectation of stolen snacks or missing pets, but sometimes the answer shocks and surprises. I’d argue that this is true even when we as adults ask the question of ourselves.

It’s no surprise that many people view creative expression as a form of therapy. Just read the inexhaustible output of writers writing about writing, a quite profitable if overindulgent niche market. We’ve even “verbed” the word “journal.” Did you know that people who journal frequently are able to reduce their stress and manage their anger more efficiently? I could say the same thing about blogging, but then there’s that quote up above from MetroDad. (I kid MetroDad. His blog entries are actually quite amusing, entertaining, and even insightful.)

Too often these days, when it comes to using our words, people settle for quick fixes rather than deep introspection. It’s the 140-character Tweet of the daily pet peeve versus Plato’s lifetime of examination. I’m not suggesting that everyone sign up for therapy sessions, but I do ask friends and colleagues to strive for clarity and honesty in their communications. That often requires work. True expression isn’t effortless.

Even as I write this, I am surrounded by reference materials. As a writer, it often isn’t enough simply to “use your words.” As you’ve noticed, I often rely on the words of others, be they expressed in song or psalm, poetry or prose, book or blog. I would be lost without the dictionary, the thesaurus, the atlas, the encyclopedia, and the patient guidance of my editor/husband—even though all of those things can tempt me along time-consuming tangents with their fascinating insights. Likewise, I am inspired and guided by the works of scholars like Geoffrey Nunberg, whose books and NPR spots on language have both educated and entertained me. Honestly, how many of you get excited when you see an essay entitled “The Politics of Polysyndeton”? Hands? Hands? Hello?…

My own fascination with language started in second grade, when my wonderful teacher Miss Burke introduced me to bookmaking with the simplest materials, and it has grown deeper ever since. Even so, one catalytic instant stands out. (Please, if you still don’t know what catalytic means, either look it up on your iPad’s dictionary or ask your car mechanic. After all, these elite, ten-dollar words aren’t reserved for professors holed up in their ivory towers. If you truly love your country, learn the English language. Have I made my appeal clear in both liberalese and conservatese?)

Elie Wiesel, winner of the Nobel Peace Prize in 1986 and recipient of the Congressional Gold Medal, is another humanitarian hero of mine. Wiesel spent most of his life coming to terms with the violence, anger, and despair he witnessed as a concentration camp prisoner during the Holocaust. I heard him speak about his experiences shortly after he received the Nobel Prize. One of his responses during a question-and-answer session has haunted me ever since.

“Americans,” he stated matter-of-factly, “have one of the most violent languages in the world.”

The truth of that comment struck me. No…it hit me in the face. No…it blindsided me. No…it knocked me out. No…it fell on me like a ton of bricks. No…it blew my mind. No…it bowled me over.

Everywhere I went and everyone I talked to—suddenly, I was keenly aware of the insidious presence of anger and violence in everyday American language. On one occasion, I felt compelled to alert a pacifist minister to her repeated use of violent idioms and imagery in a sermon on compassion. She stood there dumbstruck (as we say), amazed by the horrible truthfulness of the comment.

For a while after hearing Elie Wiesel speak, I too felt dumbstruck, “made silent by astonishment” (to quote Webster). As a writer, I also felt aware in a way I had never felt or experienced before. The Buddhist in me smiled silently. Mindfulness, after all, is one of the key concepts of the practice, summed up simply in the popular mantra “Be here now.”

And so here I am, now, in an American culture defined (in part) by its reactionary anger toward so many things—including each other. I’m struggling to understand that anger, both in myself and in others, and to use my words to describe it. But what do we talk about when we talk about anger?

Defining anger, as I hope to demonstrate in the forthcoming part two, is no easy task, but it’s well worth the effort. Our fate as a nation, if I can ramp up the election year rhetoric, may actually depend on it.

• • •

Playlist for “American Anger”

“Music is food,” says my artist-friend James “Mayhem” Mahan, and so this post comes with a playlist for the full multimedia experience. These are songs that fed my mind as I considered this post and its upcoming parts. The playlist is also collaborative, so if you’re on Spotify, I encourage you to contribute as well as to listen. Mostly it’s for fun…testing once again how all of this interactive, interconnected technology works. Enjoy.

  1. Green Day, “American Idiot”
  2. Public Image Ltd., “Rise”
  3. Nine Inch Nails, “Terrible Lie”
  4. Kanye West, “Monster”
  5. Florence and the Machine, “Kiss with a Fist”

(You can listen to and help build this playlist on Spotify: American Anger.)

The Best of 2011…Is Yet to Come

“Opportunity for Reflection”

First of all, happy Gregorian 2012 to everyone!

In this season of endings and beginnings, I’ve been thinking instead about continuity and the hope that it offers us. After all, just a few weeks ago many of us were celebrating the winter solstice, that annual moment when Earth’s perpetual journey around the Sun begins to favor daylight over darkness. We could say with scientific certainty that brighter days were ahead. Ecologically, this is also the time when seeds stir in the earth and prepare for the upcoming growth seasons, even though their first green shoots are still a few months off. We celebrated that cycle of life along with the turning wheel of the seasons—the ongoing recurrence of natural patterns over time.

From the winter solstice, fast-forward a few weeks and the focus shifts to the close of the calendar year, a somewhat arbitrary and historically variable marker. After all, if you so desired, you could celebrate New Year’s Eve throughout the year, as long as you researched all of the lesser-known calendar-flips (Happy Gudi Padwa, everyone!) in addition to the better-established date-changers, such as Rosh Hashanah and the Chinese Spring Festival. For much of the world, however, the calendar established by Pope Gregory XIII holds sway, making us all followers of the Catholic tradition, if only for a short time. This might explain all of those confessions of guilt and penitent vows of self-renewal associated with New Year’s resolutions. (Religious history purists can make what they want of the fact that January 1 also marks the supposed anniversary of Jesus’s circumcision. Perhaps that explains the noisemaker tradition?)

In western culture, the end of the calendar year has also become a time of retrospective judgment. “Best of” lists vie with “Worst of” lists for our consideration. Many of these seem contrived solely to boost sales at the end of the fourth business quarter (or second, if your company uses the July-to-June model). It’s probably no coincidence that the holiday season segues so seamlessly into the “awards season.”

For a long time, I was a huge fan of year-end best-of lists. Reading them was like sneaking a peek at the teacher’s edition of some cultural textbook: Had I chosen the right movies to watch? Did I memorize the words to the most worthy songs? Would reading the highest-rated books provide clues to help propel my own to the top of the list some day? One of my friends, a film studies major, regularly sent out a detailed report of his top-rated movies from the previous year, and I learned a great deal about cinema while studying his reviews and rationales. For weeks afterward, I sought out the films he had mentioned—no small feat, given the obscurity of some of them and the occasional lack of comprehensible subtitles.

Then, one year during graduate school, it all went sour. A film critic published his “Best of the Year” list in the city’s newspaper. There were just a few slight problems. First of all, he hadn’t screened all of the movies released that year (but then again, who could?). Perhaps more importantly, he confessed that he hadn’t yet seen some of the films topping the box office charts or other critics’ “best of” lists. Furthermore, several of the movies he did mention were well over a year old; he admitted he had simply seen them for the first time during that calendar year. In short, his list was a sham.

A subsequent exchange of letters between the reviewer and me was quite instructive and forever changed both our minds about end-of-year pronouncements. During our conversation, we noted that a movie often takes years to produce and premiere. The film itself is, in turn, based on a screenplay that may have been written and developed for several years prior to that. Moreover, some films are based on pre-existing stories and novels (and, in more recent times, comics and board games). Those original artistic creations themselves might have required years of germination. The date stamped on such a film (or novel or musical composition) masks years of hard work and risks becoming, as dates sometimes do, misleading and meaningless.

Based on these musings, I will go out on a limb and suggest that Jane Austen did not fret over the fact that Pride and Prejudice was not named “Best Novel of 1797,” the year in which she completed the first draft of the manuscript. In fact, she would have to endure another sixteen “not-best-of” years before the book was even published. This should serve as an inspiration to all of us who labor on long-term projects like novels, child-rearing, and the deployment of particle accelerators. Some things just take time. To appropriate T.S. Eliot, those of us who craft lengthy books should measure out our lives with coffee spoons and printer cartridges, not calendar pages.

So, for many who look upon the start of a new year as a time to “take stock and start anew,” I counsel patience and perseverance instead. There is no reason to pause at this specified instant and judge what we did or did not achieve in 2011. Opportunities for reflection will no doubt come in 2012, and we can decide for ourselves which moments and contexts best serve our current endeavors.

In the meantime, here’s looking forward—perhaps far forward—to those future years in which the seeds we planted all these past years bear fruit.