Unimaginable

The 9/11 Memorial always reminded me of a double Bat-signal rising up above Gotham City. (Photo from United States Government Works, public domain)

New York City, September 11, 2001.

After an invigorating morning swim and a walk to the office under clear blue skies, I’m selecting a muffin to go with my morning coffee when a cafeteria worker rushes in from the dining area. Nearly hyperventilating, he tells everyone that a plane has just crashed into one of the World Trade Towers.

We all hustle toward the windows and look to the south. Dense smoke billows out from the irregular shape now punched into the side of the north tower. We grasp at explanations. Our collective imaginations reach this consensus: the pilot of a small plane must have suffered a heart attack and lost control of his aircraft.

Shortly thereafter, in my own department eleven stories higher, I join my coworkers along the south-facing bank of windows. Speculation continues even as a second aircraft appears to the west, coming in low across the Hudson River at high velocity. We theorize that it must be a press-related plane rushing to the scene, but it’s coming in too fast and is too close to the building and, seconds later, slams into the south tower. A fireball erupts as it crashes through the building. The visual force of it shoves us back from the window.

After so many attempts to grasp what is happening, we now face a more likely yet far more discomforting fact: We are under attack.

Throughout the morning, as we watch television broadcasts, monitor the Internet, contact friends and relatives, and eventually recoil in horror as the towers collapse one after the other, two dominant yet contradictory themes emerge:

“This is unimaginable.”

“It looks like a scene from a movie.”

Later that afternoon, in our attempts to navigate home through the paralyzed city, my husband and I crowd onto a bus heading uptown, away from the disaster. People sit and stand in silence, some marked as close witnesses by dust, soot, and ash. If we speak at all, we speak in broken whispers. Mostly, we settle into a state of numbness that will haunt us for days, weeks, generations.

When the bus stops to let passengers on and off, I look back toward the plumes of smoke rising above Manhattan. Already I’m exhausted by a sight that will dominate the view from my office window for a period beyond all expectation. I look down at the street, where I see a crumpled and dusty costume being run over repeatedly by buses, cabs, fire trucks, and police cars. I recognize the red-and-blue paneling and its black web motif. It is a Spiderman suit.

Where were the superheroes today? I ask myself.

The answer is obvious. Superheroes are make-believe. They’re fictions, fantasies. Yet somehow, even in the context of such stark realities, they seem entirely imaginable.

• • •

This is unimaginable.

It looks like a scene from a movie.

These comments echoed once again throughout the coverage of the shooting at the recent “Batman” movie premiere in Aurora, Colorado. In this instance, however, the comic-book costume in question was inhabited, come to life.

James Holmes, his hair dyed red like the villain the Joker, a gas mask concealing whatever crazed expression might have been on his face, his body encased in bulletproof armor, fired randomly and repeatedly into a smoke-filled theater and struck 70 people, killing 12 of them.

In Aurora, the police were quick to apprehend Holmes. With a mixture of luck and skill, they were also able to prevent any further damage that might have resulted from entering his booby-trapped apartment.

Even after capture, Holmes acted like someone “playing a role,” according to one police officer. In other words, he continued to engage in make-believe, pretending to be someone he wasn’t. He was acting out a fantasy.

He was imagining things.

• • •

At an early age, I decided that I could and would make a career out of imagining things. I read the fiction of Ray Bradbury, Italo Calvino, and Donald Barthelme with both appreciation and envy. In order to achieve true and lasting greatness, I would have to dream bigger things than my literary idols and spin from those dreams far greater works of fiction. To do that, I would have to create characters, settings, and situations that no one else had thought of before.

In short, I would have to imagine the unimaginable.

And so, having spent years crafting short stories, novels, and poems, I found that the events of 9/11 left me feeling uneasy and strangely complicit. Hadn’t I created similarly sinister plots for my own antagonists? Truth be told, I saw nothing “unimaginable” about flying an airplane into a building as an act of terrorism. Quite the opposite: It seemed rather simple.

In the days and weeks after the attack, I grew annoyed when people in both the media and our government used the word “unimaginable” to describe the events. It seemed a sloppy description, as much a dodge as a denial.

What I found truly unimaginable was that the people in charge of defending against such things hadn’t done their jobs and anticipated this scenario. (In fact, some intelligence agents had, but few people believed enough in their reports to act on them.) I wanted to suggest that the soon-to-be-formed Department of Homeland Security should employ fiction writers like Tom Clancy and Robert Ludlum—any writers, really—to help them out a bit in the imagination department.

The Aurora shooting incident, however, represents the flip side of this argument. Here, a work of imaginative fiction inspired the assailant. In his twisted state of delusion, James Holmes latched onto the Joker, a character created by comic-book writers in the 1940s, and erased the border between fantasy and reality. He obsessed over the Batman mythology but followed the wrong role model.

Authorities made easy comparisons between Holmes’s behavior and Heath Ledger’s portrayal of the Joker in the film “The Dark Knight,” which earned him a posthumous Academy Award. Critics praised the depth of Ledger’s commitment to the role. Sources on the movie set expressed their astonishment at the intensity of Ledger’s investment in the character. In an interview at the time, Ledger himself described the role as “fun,” adding: “There are no real boundaries to what the Joker would say or do. Nothing intimidates him, and everything is a big joke.” (Empire magazine, January 2008)

“Well, I warned him.” According to one report (New York Daily News, January 24, 2008), this was Jack Nicholson’s response on learning about Heath Ledger’s death by overdose a short while later. (Nicholson had taken on the Joker role in Tim Burton’s 1989 film “Batman.”) Nicholson knew the emotional and psychological toll involved in playing the Joker, and Ledger’s subsequent health issues suggested that he had been haunted by the role well after the film had wrapped.

Ledger’s death raises uncomfortable questions for those in the creative arts. Are there some places in the imagination that we shouldn’t go? And even if there are, how do we prevent ourselves from going there? Just because we may refuse to imagine something, does that make it unimaginable?

• • •

“With great power there must also come–great responsibility!”

This insight, currently attributed to Stan Lee and the Spiderman franchise but thought to originate with the French thinker Voltaire, provides a guiding light whenever I have my doubts about writing and the imaginative impulse. It reminds me that any ability, like any tool, can be used for good or bad, to create or destroy. A person can use a hammer to build a home or break a window. Radioactive material can be used to destroy cancerous cells or bring an empire to its knees.

With so much potentially at stake, it becomes tempting to sit out the debates that we need to have—with each other and with ourselves—in the wake of events such as 9/11 and Aurora and countless other acts of violence. In postponing action—or, in some cases, shrugging and offering up a feeble, defeatist “Well, what can you do?” attitude—many of our appointed leaders deny their own power and, as a result, shirk their responsibilities. They fail us, plain and simple.

In the resulting void, opportunists rise to the debate like sharks to bloody chum. Some amplify their pre-existing fears and hatreds, which in turn can incite other acts of violence. (Witness the recent shooting in Wisconsin.) Others cultivate a false sense of security or superiority with cynicism or trendy ironic detachment, a kind of toxic snark. Neither suffices for the level of serious discourse required. Both, in their own ways, risk being irresponsible.

The day before the Aurora incident, I began work on a fictional story about guns and violence inspired by Ishmael Beah’s book A Long Way Gone: Memoirs of a Boy Soldier. The day after the shooting, my progress stalled. I could imagine what might happen next in the story, but part of me simply didn’t want to anymore. The subject felt too overwhelming, and my efforts to address it seemed insignificant. Like young Peter Parker after his uncle’s murder in the Spiderman origin story, I questioned my own powers and abilities. That superhero suit? An empty costume after all.

Slowly, the paralysis subsided. Silence was no longer an option, nor was surrendering to self-doubt or cynicism. I began to write again, and, in doing so, attempted to create the empathic connection that explores all sides of an issue, not just the easy or most comfortable side.

I could not expect a superhero to swoop in and write that next difficult scene for me, just as we cannot leave our hopes for a more just and peaceful world in the hands of some mythical caped crusader. In the real world, our daily challenges remain ours and ours alone. Luckily, we have the power to respond to those challenges in creative and positive ways. If we can imagine ways to channel that power responsibly, then together we can save the world.

Book Value

When I was young, I borrowed most of the books I read from the local library. My public high school provided dog-eared copies of the classics in my college-prep literature classes. To further satisfy my growing reading appetite, I would ride my bicycle downtown to a local hardware store that kept a few ramshackle shelves of coverless remaindered paperbacks near the front door. For forty cents each (or three for a buck), I could pedal home with a combination of Mark Twain’s Adventures of Huckleberry Finn, Ray Bradbury’s Something Wicked This Way Comes, and the not-quite-latest collection of greatest Mad magazine gags (stupid answers to stupid questions, anyone?).

To this day, I continue to haunt second-hand bookshops, yard sales, and the remainder bins at big-box stores in search of bargains. At the end of the semiannual book sale at our local library, I came home with two brown paper shopping bags full of wonderful finds, including the collected poems of Stanley Kunitz, Anne Lamott’s Traveling Mercies, Chang-Rae Lee’s The Surrendered, Tab Hunter’s Confidential, and a couple of titles by Bill O’Reilly (stupid answers to stupid questions, anyone?). I am not exaggerating when I tell you that the bookcases in which those books now reside cost substantially more than the books themselves.

In each one of the instances noted above, my reading experience provided absolutely no compensation to the authors of the books themselves.

This is why I wince a little every time fellow lovers of literature rhapsodize about the beauty and allure of used bookshops. I cringe a bit when poets, novelists, and journalists extol the value and necessity of public libraries. I wince and cringe with the heart of a co-conspirator, for I, too, value and support these institutions. And yet by patronizing both, readers deprive the arts of more income than they ever do when buying from the Amazons and Barnes & Nobles of the world. Access to free, remaindered, or used books may boost readership, but every such copy denies the writer compensation for his or her work. Seen another way, these bargains literally lower the value of the written word.

I learned at an early age that writers should expect little, if anything, in the way of recognition or remuneration for their efforts. Mostly one should expect rejection—either in the form of “thank you, but not for us” letters from potential publishers or “nice try, but not for me” comments from readers. Many literary writers still subscribe to the somewhat romantic “starving artist” archetype, envisioning themselves in tiny, unheated lofts subsisting entirely on grilled cheese sandwiches, flat soda, and watered-down tomato soup as they churn out page after page of unappreciated brilliance.

One might expect that things would have improved over the past few decades, but they haven’t, at least not in the publishing industry. In fact, the situation has become even more psychologically unhealthy. These days, many magazines and journals require hopeful writers to pay a “reading fee” for the opportunity to be considered for publication. In a number of instances, specious contests have replaced the normal submission process. The literary world has become more like a lottery in more ways than one. What was once called the “table of contents” in some books and journals might now be better referred to as a “list of winners.”

Editors and publishers will tell you that reading fees and contests are necessary evils, the only way to meet their financial obligations. After all, there are bills to pay: editorial staff, office assistants, designers, printers, distributors, and so on. On top of that come electricity, water, heat, and telephone.

Now consider that for many literary journals, the person who provides the actual content, the writer, is often paid very little, if at all. Despite possible investments in college tuition, workshop or residency costs, and all those reading/entry fees, publication rarely means “hitting the jackpot.” A poem that took fifteen to twenty hours to write might net ten dollars. A historical piece that required months of research, writing, and revising might reward you with three or four copies of the obscure journal that accepted it.

Given all that, you might wonder why anyone in his or her right mind would pursue a literary career in the first place. Perhaps this explains why so many writers place their full and earnest faith in the metaphorical value of writing. In this scenario, writers are akin to mystics who channel universal truths onto the page. It may also explain why some writers’ conferences can seem like cults to non-bookish outsiders—and why some attendees can come across as desperate martyrs. I’m reminded of a lighthearted quip a fellow writer once offered while sharing his latest piece: “I suffer for my art; now you can suffer, too.”

I worry that this kind of attitude—this self-indulgent elevation of “literature”—further devalues the entire creative enterprise. It fosters a disconnect between the writer on high and the lowly reader, which in turn leads to dwindling book sales and meager showings at readings. Many of us who remain committed to the written word nonetheless felt a twinge of recognition and understanding when someone posted on Facebook: “April is National Poetry Month, that time of year when we should all force ourselves to attend dozens of readings and pretend to be enjoying ourselves.”

So you see, in some ways I’m still a terrible supporter of the arts. I read the latest articles about the financial struggles in the publishing world and can’t escape the sinking feeling that we brought this on ourselves. For decades, writers have accepted disproportionately small compensation for their work without much complaint. As readers, we became more and more accustomed to paying little—if anything—for the works we consumed. And in a bizarre twist of hypocrisy, we voiced our strongest support for those institutions that hurt writers the most. With that in mind, the crumbling state of the publishing world today shouldn’t surprise anyone.

Still, I hold onto a shred of optimism about the future. Here are a few final thoughts and suggestions, and I welcome others:

  • Don’t be deluded—and I mean that literally, not figuratively. Stop buying into harmful myths like the “starving artist” I mentioned earlier. Demand that writers be paid for their work, and if it has to be a small amount, at least ensure that it’s not an insulting or demeaning amount. If a magazine or journal can come up with a business plan that covers the costs of printing and production, it should also account for the value of the content itself—the author’s compensation.
  • Do what you can to help publishers meet these increased financial challenges by subscribing to those magazines and journals whose work you respect and admire. (I know, I know: Once again, it’s up to the writers to cough up more money in the hopes of supporting their own potential paychecks. Maybe you can get friends and family members to subscribe as well, or convince some rich benefactor to bequeath millions to the journal in question.)
  • Support your local library not only by checking out books but by getting out your checkbook. That way they can purchase more books and journal subscriptions.
  • Realize that free public readings aren’t really “free” and support both the presenter and the host bookstore by purchasing book(s).
  • Enjoy your bargain book from the remainder bin, but support the writer with a full-price purchase of another title if you liked his or her work.
  • Most of all, value what you do, whether you read or write. If literature does indeed have worth, then we should all be ready and willing to pay for it.

“The Normal Worldview”

Print by Catherine Rondthaler. More of her stunning work can be seen at http://catherinerondthaler.com/_MG_6169.html.

This week, two separate news stories caught my attention along with the rest of the country’s: The retraction by “This American Life” of Mike Daisey’s fabricated account of Chinese laborers making Apple products, and the conviction of Rutgers student Dharun Ravi in relation to the suicide of his gay roommate, Tyler Clementi. Both stories, it seems, ignited quick responses in those who heard/read/saw them, and yet both stories have deep, deep backstories that complicate them well beyond the comfort zones of most casual media consumers. Each has the capacity to be told as an epic novel, and yet many people have formed their opinions—some quite strong and unshakable—based on the equivalent of a short story or a Wikipedia synopsis.

(At the risk of hypocrisy, I feel obliged to provide a brief overview here for those who are unfamiliar with the cases. If you are familiar with them, my apologies; sorry to interrupt; skip ahead to the next paragraph, please. Links also appear at the bottom of this entry for those seeking more information. 1) Mike Daisey, a theatrical monologuist, had appeared on Ira Glass’s public radio show “This American Life” to offer his supposedly first-hand accounts of the terrible working conditions at a massive Chinese factory that produced Apple’s iPad. After the show received its strongest listenership to date, the producers discovered that many details in Daisey’s account were false, at which time they retracted the story. They devoted a follow-up episode to the incident, during which Daisey offered an awkward defense of his position. 2) In the Rutgers case, Ravi was convicted of bias-motivated invasion of privacy—a hate crime, in other words—after using a Webcam to spy on his roommate, Clementi, during a romantic encounter and sharing some of the images and his thoughts with friends. Clementi later jumped from the George Washington Bridge, though his reasons remain either unknown or undisclosed. The suicide note he left behind has not, as of this writing, been released. In subsequent accounts, details of the story were altered in order to portray Ravi as a homophobic bully who served as the catalyst for Clementi’s suicide.)

Without question, these two stories are charged. They touch nerves, raise passions, and spark debate. When you look closely, there is no easy way to “read” either one, and yet their appeal seems to stem from just that: quick and often preconceived judgments. The stories have been appropriated and exploited in many ways for a number of agendas: corporate negligence, workers’ rights, cyber-bullying, gay harassment. In some ways, they have been revised and rewritten to fit theme-based narratives, and somewhere along the way, for whatever reasons, the facts gave way to fiction.

At the same time, two essays about writing recently appeared in The New York Times: Annie Murphy Paul’s “Your Brain on Fiction” and Jhumpa Lahiri’s “My Life’s Sentences” (links below). The first draws its inspiration from recent studies of the human brain and how it responds to figurative language (not necessarily fiction, despite the emphasis in the title); the second is a more rhapsodic account of the writing process by a Pulitzer Prize-winning author. The first seeks insight and evidence gathered through scientific research; the second finds its insight in mystical experience relayed through a series of often contradictory metaphors (“Sentences are the bricks as well as the mortar, the motor as well as the fuel. They are the cells, the individual stitches. Their nature is at once solitary and social.”). Both tend toward an elevation of creative writing as something beyond the realm of the ordinary or commonplace, a glorification that was subsequently approved and forwarded by my many literary Facebook friends. Both were also listed among the top ten e-mailed articles on the day of and the day after their publication.

To provide some context, I’ve been thinking about these four examples (Daisey/Ravi/Paul/Lahiri) as I read through hundreds of fiction and nonfiction manuscripts submitted as applications to the Bread Loaf Writers’ Conference, a task I’ve undertaken for well over a decade now. Much of this writing comes from published authors; some applicants currently teach creative writing at prestigious colleges. There is always an implicit tension between the fiction and nonfiction camps that can best be summed up in two competing bits of wisdom. The first, as the poet Lord Byron put it in Don Juan, states, “’Tis strange—but true; for truth is always strange; Stranger than fiction.” The second, included in a moment of critical reflection within Dostoevsky’s novel The Idiot, muses, “Authors, as a rule, attempt to select and portray types rarely met with in their entirety, but these types are nevertheless more real than real life itself.”

The two editorial pieces in The New York Times would seem to support these statements. I have also heard both statements used in reference to accounts of the Daisey and Rutgers stories. In both instances, political themes apparently trumped a proper respect for the facts of the matter.

Truth can be stranger than fiction. Fiction can be more real than real life itself. These two statements create something of a logical conundrum. Can truth be stranger than truth itself? No wonder the fake television talk-show host Stephen Colbert felt compelled to popularize the concept of “truthiness.” We should remember, however, that each of these thoughts and ideas arose within the creative realm. They are the musings of fictionalized characters created by a poet, a novelist, and a comedian. Even though many might consider them simple aphorisms and adages, they are often treated as gospel within the literary worldview.

The uncomfortable thing about aphorisms and adages is that society tends to be two-faced about them. We favor one when it suits us, then favor the opposing view when it seems more appropriate—or, in this context, “truer.”

Consider the following: Absence makes the heart grow fonder. Ah yes, true. But then: Out of sight, out of mind. So true. A stitch in time saves nine./Haste makes waste. Actions speak louder than words./The pen is mightier than the sword. You get the idea. You see little sayings like this posted all the time on Web sites and Facebook updates, and more often than not, a chorus response of “So true!” will follow.

A reliance on simple sayings like these—as well as simplistic sayings disguised as profound wisdom, of which there are many more examples—tends to create a rather tenuous and fragile worldview. That’s why it shocks us so deeply when someone like Mike Daisey comes along and throws rocks inside the glass house. (Sorry for that pun. I couldn’t resist.) Daisey offered up a fiction as fact, and people wanted to believe him so badly that they willingly suspended their disbelief. (Yes, that “willing suspension of disbelief for the moment, which constitutes poetic faith,” as Samuel Taylor Coleridge once wrote. More literary gospel.) Ira Glass even admitted that because the details of Daisey’s story seemed so true, the fact-checkers gave him a pass when they couldn’t corroborate his claims with the interpreter/character mentioned in his monologue (whose name he had lied about and who he claimed could not be reached, when in fact she was easily contacted by another journalist).

Likewise, a piece of investigative journalism by Ian Parker in The New Yorker (link below) calls into question any simplistic understanding of the Ravi/Clementi case. As a piece of “literary” writing (which I’ll simplify here to mean writing that includes effective characters, setting, plot, and theme), the article surpassed nearly all of the fiction I have read in recent years. It stood as an example of why, over the past decade, my own “poetic faith” in literature has been eroding and why I have been unwilling to suspend my disbelief in most (though assuredly not all) fictional contexts.

I have, in essence, become a literary skeptic. Inasmuch as our artistic culture constantly blends the two (consider the current use of the term “creative nonfiction”), I teeter on a tightrope when I read. I look to the author to provide a steady, unswerving wire along with a pole for balance, and when I feel that I have these things, I am able to walk between the towers of fact and fiction, a feat accomplished to great effect in Colum McCann’s extraordinary novel Let the Great World Spin. (Just read the opening pages and see for yourself.) Mostly, perhaps, I look to the author for honesty and authority, two traits that don’t always play well with one another.

In searching for the most powerful voice—for a “totality” of fabricated details when the truth was more than sufficient to drive his point home—the author/writer Mike Daisey created a narrator/character, also named Mike Daisey, who twisted the facts. Then, like many contemporary authors before him, he dared us, the audience, to tell the two apart. The only problem here was that he did not tell his audience that he was playing a game of “Truth or Dare.” In fact, he never hinted that we were playing a game at all.

As the adages above demonstrate, you can’t have it both ways. Attempts to do so only demean both sides of the equation: the essential integrity of journalism (here comes more fodder for the “lamestream media” claim) and the transformative power of creative literature (here comes more fodder for the “academic elitists” claim). No doubt the dialogue between the two camps (literature and journalism, art and reality, fact and fiction) can and should continue, but it needs to be honest. It should welcome scrutiny rather than resist critical efforts. In that way, we can preserve spaces for both the brain-based analysis of figurative language and the mystical creation claims recently published in The New York Times. Otherwise, they risk coming across as mere parlor tricks promoted by charlatans.

Below is an excerpt from the transcript of the retraction episode of “This American Life” (link also below). I don’t include it as a way of providing closure, however. This discussion needs to continue beyond its most recent manifestations, just as it continued beyond the now historical cases of James Frey’s “creative memoir,” A Million Little Pieces, and the falsified news reports of another Glass, Stephen, in the 1990s. For me, however, the takeaway from the excerpt below is the title of this blog entry itself, “The Normal Worldview.” What great fodder for a million more little blogs…

 Ira Glass: Like, you make a nice show, people are moved by it, I was moved by it, and if it were labeled honestly, I think everybody would react differently to it.

Mike Daisey: I don’t think that label covers the totality of what it is.

Ira Glass: That label – fiction?

Mike Daisey: Yeah. We have different worldviews on some of these things. I agree with you truth is really important.

Ira Glass: I know, but I feel like I have the normal worldview. The normal worldview is when somebody stands on stage and says ‘this happened to me,’ I think it happened to them, unless it’s clearly labeled as ‘here’s a work of fiction.’

RELATED LINKS:

“This American Life” original episode (transcript):

http://www.thisamericanlife.org/radio-archives/episode/454/transcript

“This American Life” retraction episode:

http://www.thisamericanlife.org/radio-archives/episode/460/retraction

News article about the Dharun Ravi conviction:

http://www.nytimes.com/2012/03/17/nyregion/defendant-guilty-in-rutgers-case.html


“The Story of a Suicide” (Parker):

http://www.newyorker.com/reporting/2012/02/06/120206fa_fact_parker

“Your Brain on Fiction” (Paul):

http://www.nytimes.com/2012/03/18/opinion/sunday/the-neuroscience-of-your-brain-on-fiction.html?src=me&ref=general

“My Life’s Sentences” (Lahiri):

http://opinionator.blogs.nytimes.com/2012/03/17/my-lifes-sentences/?src=me&ref=general

Liar, Liar, Pants on Fire, Hang Them Up on a Telephone Wire

"2 Men Looking On" from the "Perched" series by artist Jodi Chamberlain. Her multimedia work can be found at http://www.jodichamberlain.com (just click on the painting). Copyright by Jodi Chamberlain; used with permission of the artist.

By way of an introduction, I did some Internet research on the origin of the expression “Liar, Liar, Pants on Fire, hang them up on a telephone wire” and found this reference to an 1810 poem by the English writer William Blake:

 The Liar

Deceiver, dissembler
Your trousers are alight
From what pole or gallows
Shall they dangle in the night?
 
When I asked of your career
Why did you have to kick my rear
With that stinking lie of thine
Proclaiming that you owned a mine?
 
When you asked to borrow my stallion
To visit a nearby-moored galleon
How could I ever know that you
Intended only to turn him into glue?
 
What red devil of mendacity
Grips your soul with such tenacity?
Will one you cruelly shower with lies
Put a pistol ball between your eyes?
 
What infernal serpent
Has lent you his forked tongue?
From what pit of foul deceit
Are all these whoppers sprung?
 
Deceiver, dissembler
Your trousers are alight
From what pole or gallows
Do they dangle in the night?

If you do your own Internet research (and of course you do; you’re probably verifying this on Google right now), you’ll find numerous references to this Blake poem, some of them in fairly reliable places. There’s only one small problem: William Blake didn’t write it.

Additional research has yet to turn up the actual author of the poem or its place and date of origin. I did find, however, that the word “alleged” had at one time been attached to the claims of Blake’s authorship. Some of those who later reposted the poem and its origin myth carefully edited out the word “alleged,” perhaps to boost their own authority. I’ll leave it to the Blake scholars to hash out why this could or could not possibly be an obscure verse from the poet who gave us “Tyger tyger, burning bright”—which, when you say it out loud, does share some strong poetic similarities with “Liar, liar, pants on fire” after all.

• • •

I am constantly amazed and dismayed by the number of people who both post and spread unproven statements and “facts” on the Internet. Quite a few of these people are close friends, many of them remarkably intelligent people. Even so, they often clamor to be the first ones to expose some new controversy or conspiracy via Facebook or Twitter. Maybe it’s the journalist in all of us, desperate for a scoop, but if there’s one thing I learned while staying with my ailing father and watching endless episodes of Judge Judy, Judge Jeanine, and Judge Joe Brown, it’s that hearsay is irrelevant in a court of law.

Let’s stop a moment to take a look at that word, hearsay. We hear something, then we say it. From ear to mouth in an almost direct line that bypasses the brain. It’s a key ingredient of soap operas and celebrity gossip, yet it also sadly infects the serious discussions of the day’s major issues. Perhaps in this technological age we need new words for hearsay, such as readshare or seetweet. “I read those statistics on someone’s Facebook wall, so I immediately ‘shared’ them by reposting on my wall.” “I saw a sports commentator on television talking about that big trade that might happen, and so I tweeted that it actually had because he sounded so convincing.”

Another classic example is the Reverend Martin Luther King, Jr. quote that went viral immediately after the killing of Osama bin Laden. Here’s the sentence in question: “I mourn the loss of thousands of precious lives, but I will not rejoice in the death of one, not even an enemy.” I have no problem with the quote itself, but King never said it. All credit is due to Jessica Dovey, who prefaced an actual King quote with that sentence. In its original form, she even set her own thought outside of the actual quotation, clearly distinguishing her words from his. (The teacher in me can’t help but point this out: Punctuation matters!) As the quote ricocheted around the Internet, however, the quotation marks got knocked off and the two speakers’ sentiments became one. Since the Internet never forgets, there are now thousands—if not millions—of lingering misattributions.

After this incident, I began to wonder just how many of those inspirational/motivational “posters” that people share contain actual quotes from the pictured sources. With today’s technology, anyone with a basic knowledge of photo editing can attribute just about anything to anyone and fool millions in the process. Just look at all those often-shared fake Ryan Gosling quotes. It’s all in good fun…until someone starts attributing hateful or hurtful language to him. (Chances are probably pretty high that someone already has.) Then what started as parody or satire slides uncomfortably across a blurry line into libel and slander.

I know I come across as a killjoy here, but on matters like this, I reveal my true colors as a classic skeptic. I reserve judgment until I can ascertain the facts of the matter, and I remain open to the possibility that objective reasoning may lend credence and support to multiple sides of an argument, even those that might contradict my own previously held beliefs. Like William Blake and many other artists of the Romantic era, I seek a direct link to the primary source—though, devout skeptic that I am, I question his subsequent claims to a mystical intimacy with a supremely divine being. Perhaps his position in literary history—at the crossroads of the Enlightenment (or Age of Reason) and the Romantic era (or Age of Emotion)—describes my own inner identity crisis as a writer. I would also argue that it describes the manner in which America currently teeters as a political force in the world, and our casual (not causal) relationship with the truth seems to be a reliable measure of that tension.

As an educational writer, I am often governed by a remarkably strict and lengthy set of mostly objective state standards and guidelines, even as I am asked to write fairly sentimental pieces, such as articles about developmentally challenged children overcoming adversity. Age of Reason, meet Age of Emotion. For a recent series of writing assignments, I was required to provide two credible and reliable sources for each sentence (yes, each sentence) in nonfiction articles. When and where possible, I was to cite the primary source—in other words, an eyewitness account or an original text, not some second-hand or third-hand citation or discussion. After all, elementary school standards (i.e. grades one through six) stress the significance and importance of primary sources in research. This is not “elitist, ivory tower” graduate-school-level academia stuff.

In my own college-level classrooms, I likewise impressed upon students the necessity of primary sources in validating claims, especially as we grappled with sensitive and controversial issues such as abortion, capital punishment, and religious freedom. By extension, I reminded them that primary sources were crucial in evaluating the claims made by advertisers and politicians, both of which would often seek to twist the truth to suit their own agendas. By further extension—and hoping to avoid turning those young people from skeptics into cynics—I reminded them that they needed to evaluate the primary source itself and take into account any conflicts of interest. After all, the makers of Brand X are likely to promote studies that show that their product is more effective than Brand Z, especially when those studies have been planned and paid for by the makers of Brand X.

I had hoped that this kind of education, common in most schools and colleges, would serve the students well in their adult lives and help them to make objective, reasonable, and justifiable decisions on all matters great and small, from the election of presidents to the selection of a new toothpaste. (Is it more important to remove plaque, avoid gingivitis, eliminate bad breath, or have whiter teeth? Honestly, a skeptic can spend hours analyzing the endless varieties of toothpastes offered by each brand these days.) I wanted students to realize the great potential and power they had to sort through all of the distractions and clamor around them in order to get at the truth, even if that earned truth challenged their preconceptions and ideological dispositions. In other words, I wanted them to be open to new information and perspectives, even if this led them to change their minds. After all, the human capacity for change and adaptation is a key aspect of our forward-moving evolution. Yes, I said it— and if you don’t believe in change or evolution, enjoy the view from your rut.

In the end, however, it appears that preconceptions and ideological dispositions still get the best of most of us. On social networks, people on the left post and spread unverified stories and statistics that seem to prove their political points; people on the right post and spread unverified stories and statistics that seem to prove their political points. Some of these are easily proven to be hyperboles, misstatements, or outright lies; others are carefully twisted and redirected versions of the truth (the process we all know as “spinning”). In nearly all instances, however, the posters (and re-posters) risk looking naïve, misinformed, ignorant, or just plain stupid when the claims are analyzed and the facts made clear.

That risk—and the willingness of so many people to accept it for the most trivial of reasons—fascinates me. Checking the validity of some of these claims and stories takes less than a minute, especially if you’re clued in to sites such as snopes.com and politifact.com. (Disclaimer: These sites are good places to check first, but on their own they are not definitive arbiters of right and wrong. They can, however, provide valuable leads in a search for those coveted primary sources I mentioned earlier.) Despite that, there is still the temptation to hit that “share” button, which takes but a split second and provides that little hit of adrenaline that information junkies seem to crave. After all, if they wait too long, someone else might scoop them and ruin their status as the first (if not primary) source of breaking news. Take that, Fox News and CNN!

(Topical side note: The death of Whitney Houston on February 11 was a prime example of this. Even before reports had been confirmed, dozens of people raced to post the story as fact on their Twitter and Facebook feeds. Perhaps they all wished to be, if I may twist the New York Times’s catchphrase slightly, the “tweeters of record.”)

In many instances, perhaps, the risk of being wrong is offset by a firm belief that we are right. Even if the statement is later shown to be false, the underlying “truthiness” of it remains. (Thank you, Stephen Colbert, for having introduced that word into the public lexicon. We clearly needed it.) To explain this more plainly, the truth doesn’t need to actually be true as long as it truthfully supports what we truly believe to be true. This is why politicians can make wild claims in our legislative chambers (such as the “fact” that Planned Parenthood’s primary mission is to provide abortions) and later claim “no harm, no foul” because the easily disproven “fact” was simply alluding to a basic “truth” anyway. (If I had to allow such reasoning in my classes…well, I couldn’t and wouldn’t. It undermines the entire premise of education, not to mention making a travesty of logic and reason. Civilized people simply don’t entertain this kind of idiocy.)

When it comes to believing the unbelievable (or giving credence to the incredible), it’s not just the lies that people tell that intrigue me—it’s also the lies that they hear and subsequently believe or let pass unchallenged. This takes us back to the crux of the reposting problem: people believe and spread untruths because, in some strange way, they WANT the untruths to be true somehow. In this mindset, testing such a claim would be evidence of weakness or a lack of confidence: If it feels or sounds right to you, it probably is—no further research necessary. In other words, if you’re a cynic, pointing out hypocrisy justifies your cynicism. If you’re paranoid, spreading conspiracy theories justifies your paranoia. If you’re (liberal/conservative), propagating the latest (conservative/liberal) scandal gives you that higher moral standing that you just know your side deserves. If you hate Company A or Team B, then that damning news about Company A or Team B simply MUST be true.

So let’s bring this full circle. Over the years, I’ve come to believe that you can learn a whole lot about a person from the lies he or she tells. Further, you can tell even more about that person from the lies he or she is eager to believe, especially at the risk of reputation and social standing. The more entrenched the belief, the less likely the person is to respond to reason. As a result, dissonance becomes an unavoidable consequence, and with that dissonance comes discomfort, anger, and sometimes violence. In all the hullabaloo, reason and logic are often the first casualties.

Perhaps a first-hand case study would help to illustrate the point. At a town-hall-style meeting a couple of years ago, Senator Bernie Sanders was taking questions from the crowd. One man, most likely a member of the Tea Party, asked him to explain why the government kept raising his taxes. Sanders pointed out to him that President Obama had just signed into law tax cuts for the majority of wage-earners, but this man was insistent: His taxes had gone up. Sanders tried to reason with him, pointing out that he appeared to fit the demographic of people who received a tax cut, but no, there was no reasoning with the man. His taxes had gone UP. The discussion, as Sanders subsequently remarked in his typically brusque style, had reached an impasse and could not continue in that forum.

Had the man’s taxes gone up? Probably not. But here’s a pet theory I have about all of this. Had the amount of money being taken out of his weekly or monthly paycheck gone up? That seems likely, especially since many companies have been increasing the employee co-share of medical benefits as health insurance costs continue to rise. The disgruntled man might easily have lumped all of these paycheck deductions under the simple heading of “taxes,” giving him evidence, however misinterpreted, for his claim and belief. And what legislation had been introduced to try to deal with the problem of rising medical insurance costs? Why, the very legislation that members of the Tea Party are dead-set against.

The man at the town hall meeting was angry, and he wanted that anger to be heard and validated. When the facts didn’t serve that purpose, he was faced with a dilemma. He could change his mind, or he could stand his ground.

His final decision: ignore the facts. Therein lies the root of the word ignorance: someone who can be shown the truth yet insist on the lie. Someone who can be told that something is fiction and yet still believe that it’s fact. In this respect, ignorance is a worse trait than naïveté or stupidity.

Just consider the synonyms for the feeble-minded: words like dull and dim. Ignorant people are hardly dull or dim. They carry the full force of their beliefs behind them. Their faith in one unwavering, indisputable view of the world can turn both violent and catastrophic. After all, when one person runs through the village with his or her pants on fire, he or she can spread the flames to every surrounding building. That makes the practice an emergency worthy of public concern.

Luckily, we are all able firefighters. With all this “Information Age” technology around us, the truth is closer to each and every person than it ever has been in the history of mankind. And yet, the most powerful tools can often be used both to create and to destroy. A hammer can help build a hope chest for our dearest possessions, but it can also break a window or kill a man. One post can spark the overthrow of dictators and the rise of democracy, but it can also ruin careers and destroy a country’s economy.

I’ll end this section with a paraphrase of Tim Gunn from the reality show “Project Runway”: “Use the wall thoughtfully.” He was talking about shelves of fashion accessories, but he might just as well have been talking about Facebook. Whatever we do, we should do it thoughtfully, with care and consideration. To do otherwise would be, quite simply, uncivilized.

(NOTE: There is a second and perhaps even third part to this post already underway…)

Saying Goodbye to Santa Claus

Spoiler Alert: Santa’s “Big Secret” revealed in this blog entry.

Exhibit One: Me with Santa Claus at home in the 1960s, proof positive that he exists.

Now that Santa has flown in, tucked gifts under trees both hither and yon, and headed back to the North Pole for some well-deserved R&R, I feel it’s time to take a look at one of America’s biggest myths and think about how it may have affected us as a nation…or not.

But first, in the spirit of the holiday season, I offer a nostalgic visit to my hometown in Massachusetts circa 1970. Picture plastic candles in each street-facing window and a lacquered pinecone wreath adorned with a festive red felt bow on the front door. If you peer in through the spray-on snow frosting the windows, you can see me carefully filling a plastic garbage bag with dozens of gifts. My parents watch, slightly puzzled but mostly silent, as I pull on my snow boots and mittens, then leave the house, bag slung over my shoulder.

A week or so earlier, I had learned a shocking truth that rocked my little world—a secret that had been kept by nearly every adult I had ever met. They had lied to me, these adults. People whom I had trusted entirely, including the local minister and my own parents, had taken part in an international conspiracy and perpetrated a myth, a fantasy, a fiction. The story included a conveniently distant setting, a saintly protagonist (whom I had met in person on several occasions), and a desirable plotline that evoked grand themes of peace, good will, and generosity. To cover their tracks, my parents had even planted evidence: sleigh bells jangled as sound effects in the wee hours of Christmas morning; cookie crumbs and half-drunk tumblers of milk left on the metal TV table set up alongside the chimney.

All of this was an elaborate scheme that blurred the lines between fiction and nonfiction, between fantasy and reality. Young and gullible, I was easily duped. Of course there was a Santa. Of course reindeer flew. Why even question the physics of how, in one single night, a rather rotund man could pilot a craft to every single household around the world and leave presents for all the good boys and girls—and still have time to toss back some cookies and sip some milk in each abode?

In asking me to believe in such fantastic things, my parents taught me an important lesson that would be vital to my budding literary ambitions: how to suspend disbelief. In doing so, however, they taught a corollary lesson: how to suspend belief. In other words, in order to suspend my disbelief in Santa Claus, I also had to suspend my belief in many of the lessons learned in grade school (science, geography, math, etc.).

In some ways, then, the revelation that Santa Claus was a fabrication probably came as something of a relief to me. The dissonance between fantasy and fact, between what I was being told to believe and what I was learning to be true, lessened. That psychological summary may be a bit too deep to ascribe to an eight-year-old’s consciousness, so let me state it another way: Santa or no, the presents were still there on Christmas morning, and so all was well with the world.

Luckily for me, my parents didn’t serve up the revelation about Santa Claus with a simple “Sorry, kid, but that’s just the way it is.” They discussed the importance of symbolism and how this extended to the Santa myth, claiming that while Santa himself may not be real, the spirit of giving that he represents lives on in the hearts and souls of all those who have heard his story. Any fan of the famous “Yes, Virginia, There Is a Santa Claus” newspaper editorial might have accused my parents of plagiarism, but I could tell they were sincere.

Still: Such power in a fictional tale! Suddenly, my dream of one day becoming a fiction writer became a vastly more important, almost religious endeavor. Here was proof that the power of a story, even a fictional story like Santa’s, could have a great positive effect on the real human world!

And so I headed off into the night with my makeshift Santa sack. Inside I had placed carefully wrapped toys and books for the kids and, by the second and third winters of my visits, some ribbon candy for the adults in each household. I carried on the tradition until, one year, something unexpected happened. Some families had wrapped and readied gifts and treats for me. Somewhat embarrassed by their assumption that I expected something in return, I ended the Christmas Eve tradition that same year.

For years, I forgot about this bit of personal history. I was recently reminded of it by an article about a Vermont teacher accused of being unprofessional and irresponsible for spilling the beans about Santa in a fifth-grade classroom. The teacher had asked students to list names of famous people in American history. In order to keep the lesson focused on facts, the teacher felt compelled to leave figures such as Winnie the Pooh, Harry Potter, and Santa Claus off the list. (I could not tell from the article if she allowed the also-mentioned Jeff Foxworthy and Justin Bieber to remain on the list, but that’s another discussion for another time.)

(The full article is here:

http://www.reformer.com/ci_19576402?IADID=Search-www.reformer.com-www.reformer.com)

The mother who raised the “unprofessional” and “irresponsible” charges against the teacher went on to say that teaching about Santa Claus was like teaching about religion: the topic is best set aside with recommendations to ask one’s parents about such things. That seemed fair enough…until I thought about the goals of education in general.

Since a good part of my day job (writing and editing educational materials) relies on the various state standards developed by school boards (many of them quite conservative) around the country, I know that “learning to distinguish between fantasy and reality” is a pretty important benchmark in the lower grades. (Keep in mind that the instance noted above took place in a fifth-grade classroom.) In other words, children are required to differentiate between nonfiction and fiction (fairy tales, myths, legends, and the like). Teachers are required to provide students with the skills and strategies to do this. By fifth grade, then, your average American student should have the reasoning skills to figure out the Santa thing on her/his own. Any parent who disagrees risks spotlighting their children as slow learners—perhaps along with themselves.

According to research done by psychiatrists at Ithaca College and Cornell University in the 1990s, the average American child learns the truth about Santa at age 7 1/2. However, after interviewing 500 elementary-school children, they discovered that “Many children kept up the charade after they knew the truth…because they did not want to disappoint their parents.”

Parents, take a moment to reflect on the meaning of that last clause (no pun intended). Your kids may be duping you into believing that they still believe in Santa. I think back on my own behavior as a pseudo-Santa and wonder if that was, in some warped way, an effort to turn the lies my parents had told me into truths…ergo, my parents had not lied to me after all.

Further, Dr. John Condry, one of the authors of the Ithaca/Cornell study, reported, “Not a single child told us they were unhappy or upset by their parents having lied about Santa Claus. The most common response to finding out the truth was that they felt older and more mature. They now knew something that the younger kids didn’t.”

(You can read more about the study here:
http://www.nytimes.com/1991/11/21/garden/parent-child.html?pagewanted=2&src=pm)

This finding surprised me. “Not a single child”? Parents, take another moment to think about telling your child that he or she cannot have a toy or candy bar that he or she has already selected while you were shopping at the grocery store. When you took the item away, was your child calm and well-mannered about it? Or was the response similar to those submitted for a recent Jimmy Kimmel spot in which the talk show host asked parents to tell their children, “Hey, sorry, I ate all your Halloween candy.” (Permissions permitting, the videotaped results of this rather non-academic study are here: http://www.huffingtonpost.com/2011/11/03/jimmy-kimmels-ate-halloween-candy-challenge_n_1074334.html)

In a 2006 opinion piece in The New York Times, Jacqueline Woolley wrote, “Children do a great job of scientifically evaluating Santa. And adults do a great job of duping them. As we gradually withdraw our support for the myth, and children piece together the truth, their view of Santa aligns with ours. Perhaps it is this kinship with the adult world that prevents children from feeling anger over having been misled.” What is this “kinship with the adult world” of which Woolley writes? Is it the tacit understanding that adults lie, and that it is OK for them to lie (or “support a myth”) on a grand scale?

(The link to the Woolley article is here:
http://www.nytimes.com/2006/12/23/opinion/23woolley.html?_r=1&oref=slogin)

Surely someone sees this Santa thing differently. For balance, I turned to a group whose opposition to myths and distortions is part and parcel of its identity: the Objectivists. This group is huge these days with Republicans and the Tea Party, both of which have renewed a fervent interest in the writings of Ayn Rand, particularly as they apply to self-determination and self-interest. Surely, the somewhat socialist “give liberally to the poor children of the world” Santa myth (I base that description on the story’s historical roots in relation to Saint Nicholas, who, by the way, was also the patron saint of pawnbrokers) would be anathema to such a group. And it is.

According to Andrew Bernstein, a senior writer at the Ayn Rand Institute, “Santa Claus is, in literal terms, the anti-Christ. He is about joy, justice, and material gain, not suffering, forgiveness, and denial.” Another quote from the article: “The commercialism of Christmas, its emphasis on ingenuity, pleasure, and gift buying, is the holiday’s best aspect—because it is a celebration, the achievement of life.”

(You can read the full piece, a celebration of the commercialism of Christmas, here:
http://www.aynrand.org/site/News2?news_iv_ctrl=1263&page=NewsArticle&id=7632)

All of this leaves me as puzzled about Santa Claus as I was when I learned the dark secret of his nonexistence. To this day, I give presents that have “From Santa” scrawled on the tag, and I try to mask my own handwriting despite the fact that the recipients know they’re from me. Likewise, I love surprise presents: gifts that appear out of the blue from anonymous sources, those random acts of kindness that rekindle our faith in human generosity. (Special kudos to Ben and Jerry’s for a coupon they once published that granted a free ice cream cone to the person in line behind you at one of their scoop shops. Brilliant.)

The spirit of Santa lives on and is no lie. It survives despite the greed and entitlement running rampant through our society today, malignant growths that challenge and threaten human compassion and generosity. I’d even argue that the spirit of Santa, despite its secularization over the decades, also maintains its ties to the spirits of nearly every religion, even those that claim independence from mythology or dogma.

In the years ahead, perhaps we can pull that spirit back from fiction and establish it fully as year-round fact. After all, nearly every child longs for Santa to be more than a seasonal fantasy. Maybe it is up to the child within us adults to make it so.

Postscript: I dedicate this blog entry to my father (pictured above as Santa) who passed away in 2011 and was very dearly missed this Christmas season. His many gifts to me continue to resonate throughout my life.

Touching the Nerve: Taking on “Tebow Time”

[Music cue:] Sound effect from “127 Hours,” just as Aron Ralston is about to slice the blade of his knife across the exposed nerve on his self-severed arm.

   Over the years, I’ve learned that the best subjects to write about are the ones that touch nerves, and nothing seems to be touching nerves these days quite like the “miraculous” comebacks staged by the Denver Broncos. Strong feelings, pro and con, have led to strongly worded commentaries in the media and threatened to fracture friendships across the country. Obviously, it’s a topic worth taking on.
   For those not familiar with football, the Broncos are the current leaders in the AFC West after having been dismissed by the pigskin pundits as a longshot for the playoffs. That was before second-year quarterback Tim Tebow took over the starting position and led the team to win seven of its last eight games in clutch situations. Tebow is most widely known for his overt religious beliefs, which have led him to inscribe Bible verse references under his eyes during games, appear in an anti-abortion advertisement last year, and kneel down in prayer frequently during games to beg Jesus Christ for assistance. Tebow is also known for his rather mediocre NFL passing statistics coupled with a strong preference for holding on to the ball and running to make plays by himself.
   As a person, Tim Tebow appears to be a natural leader. He has done an extraordinary amount of charitable work in his life to date, far more, most likely, than either his most vocal critics or his advocates have. His performance as a college quarterback made him a worthy recipient of numerous awards. In short, he seems like a pretty decent guy, especially in the company of his peers. He’s not working on a criminal rap sheet, not being caught in scandalous romances, and not making weekly headlines for trash-talking about his opponents.
   So why all the hullaballoo? It’s not who Tim Tebow is; it’s what he stands for. With that in mind, let me make one thing clear at the outset: I’m talking about “Tebow Time” here, not Tim Tebow himself. This is a classic case of a public schism between man and myth, self and symbol, fact and fiction, and, in some ways, between reality and fantasy. No wonder it’s got so many people ticked off.
   Americans, like many folks around the world, love their myths and legends. People are willing to twist history into a desired narrative in order to explain the currently inexplicable, maintain the established order, or justify unprovable beliefs. Over the weekend, I noticed a number of commentators using the words “script” and “narrative” to describe how, yet again, the Denver Broncos were trailing in the final minutes and somehow, miraculously (there’s that word again), won the game. According to an account I read this morning, Tim Tebow won the game against the Bears with two crucial field goals: one in the fourth quarter and one in overtime.
   Of course, a writer/editor craves accuracy and specifics, so I must stop here to point out that Tim Tebow did not kick the tying or the winning field goal. He’s a quarterback, remember? But here’s the first thing about “Tebow Time”: For many people, it’s all about Tim Tebow. People desperately want it to be all about Tim Tebow. Why? Because to a great extent, American myths and legends are about individuals, not collectives. How many times have you heard this line in a movie: “You’re the only one who can save us.” From “The Matrix” to “Avatar,” this messianic streak is alive and well in American culture. You could argue that it matches the political tenor of the times: rooting for a collective team (of Muppets, let’s say, just to cite a most recent critique) seems, at its core, suspiciously socialist. So, let’s foreground Tebow and background the Broncos for the creation of this particular gridiron myth.
   This creates an immediate problem. In most games, Tebow’s performance as a quarterback has been sub-par. Just look at the statistics and compare them to those of any other quarterback playing the game today, rookie or not. Few people seem to be suggesting, however, that God or Jesus Christ or the Holy Ghost is motivating Broncos kicker Matt Prater, who scores so many of the “miraculous” winning points. In most accounts of the Broncos phenomenon, I’m sorry to say, Matt’s been a footnote. (Though I must point out, the poet in me craves a “Pray for Prater” campaign.)
   So at the outset, a little bit of dissonance creeps into the picture, but let’s just ignore that for now. (If this is going to be a truly American myth we’re creating here, we’ll have to ignore the inconvenient facts for a while.) Instead, let’s consider that the Broncos are always a come-from-behind team, which makes them the underdogs in nearly every game. Despite their country’s current standing as the #1 nation in the world, Americans love to consider themselves outsiders and underdogs. Who knows why; they just do. More on that in some future blog entry.
   As come-from-behinders, Tebow and the Broncos always appear to be beating the odds. The “lamestream” media can make all the predictions they want; “Tebow Time” is all about pulling it out in the clutch. And if this is God’s plan, as Tebow fanatics would have us believe, it leaves a few uncomfortable questions. First of all, why does God let so many of Tim Tebow’s passes miss the mark? Why do the Broncos fall behind in nearly every game? Why is God such a tease? Why does God’s will always seem to necessitate an instance of dumb luck? If Tim Tebow is truly representative of divine forces on Earth, why did God not anoint a better quarterback? Why, for example, is there no halo around Aaron Rodgers’s head? If there’s any argument to be made for true grace and strength in football these days, the holy land would be in Green Bay, Wisconsin (even if, technically, the Broncos’ Mile High Stadium is closer to heaven).
   One of the likely appeals of “Tebow Time,” however, is that Tim Tebow looks like a regular joe. Aaron Rodgers used to have that look, but these days too much has been made of his swagger and confidence for him to satisfy the casting call for common-man hero. Instead, we have the narrative of the previously down-and-out Denver Broncos being led to victory by a backup quarterback without any glitzy or glamorous advertising contracts (yet). The fact that some people question his skills and abilities just makes him more like us. After all, it was no miracle that Eli Manning scored two touchdowns against the favored Cowboys to win in the final minutes of the Sunday night game. You expect that from a Manning. But a Tebow? Nah, he’s not one of the “elite.” He’s one of us. His wins are all the more “miraculous” because of that. We can relate to that. If it were you or me on that field on Sunday, we’d need a miracle to pull off a win as well.
   By definition, a miracle is something out of the ordinary, something unexpected or unprecedented. When people rhapsodize about “Tebow Time,” they often suggest that they’ve never seen anything like this before. But let’s again look back on American mythology. We have seen this before. In fact, in hard economic times, we see it time and time again. Consider James Braddock, the supposedly down-and-out boxer of the 1930s whose inspirational rise to the championship became a national fixation during the Great Depression. Or Seabiscuit, the odds-against underhorse who likewise inspired hope in the odds-against masses of the Depression. In more recent times, I’d even mention the post-9/11 New England Patriots with their own fresh-from-the-bench backup in the lead, Tom Brady. In all of these cases, America latched on to an underdog, finding hope in those who rose despite serious adversity.
   But wait a moment. I may have gone a step too far here. Oh, those beloved Patriots of old, who refused to be introduced as individuals in Super Bowl XXXVI, instead staying true to their claims of being “a team.” They were up against the clearly favored St. Louis Rams, led by a man as God-fearing then as Tim Tebow is today: Kurt Warner. And lo, the Rams lost. In the final moments. By a field goal. Dear God.
   I’m not writing this to sing the praises of kickers like Adam Vinatieri, though I certainly could after that clutch kick. I’m writing this because writers like narratives, and the current Denver Broncos story is one of today’s most talked-about examples. But like many writers, I question narratives that distract from the central questions, whether those questions be of the narratives themselves or the contexts in which those narratives occur. And with “Tebow Time,” the central question seems to be about the positive influence of Christianity in major-league athletics.
   My argument is simple: It’s not that simple. It’s never that simple. People wouldn’t be risking careers and friendships if it were that simple. On any given Sunday (great movie, by the way), professional athletes praise their Almighty and point to the sky all the time after a great play. Keep in mind that a fair number of darn good pro football players are Muslims. Oh, how this narrative would play out differently if Tim Tebow were praising Allah and facing Mecca after every victory.
   But something in America, nearly all of America, craves a new hero these days. We’re looking for someone like us, facing oppressive challenges and persevering despite dominant adversaries. With the political and economic outlooks both bleak, we want a light in the darkness. We want reassurance, an optimistic narrative, an uplifting myth. Some may turn to movies and music, others to fiction and poetry. Others will look to the stadium on winter’s Saturdays and Sundays.
   Even so, some Americans don’t want that dream to have a religious prerequisite. They don’t want it to have political or financial prerequisites, either. We’d prefer that it take place on that mythological “level playing field,” especially as so many other myths seem to be crumbling around us.
   This seems to get at the heart of the “Tebow Time” narrative. With “Tebow Time,” there is no level playing field. If we take that myth at face value, then no amount of skill, talent, spirit, or grace will help you win in the end. After all, Tim is the Chosen One. He is the only one who can save us all. The health and survival of America’s entire professional sports conglomerate depends on that one person.
   If you mistake that myth for reality, then God help us, every one.