Signal and Noise: A Response to the Newtown Tragedy


As the nation continues to reel from the news and aftershock of yet another mass shooting, many of us are wrestling with our own thoughts and emotions. We review the facts and details as they become available, and from there we try to create a rational explanation for a monstrous and irrational act. It’s a fool’s errand, really, like trying to negotiate with a madman, and yet we make the attempts. And when we fail, as reasonable people are most likely to do, we look for ways out of the conundrum.

Mostly, we focus on ways to fit this communal tragedy into our own personal understanding of the world, however short-sighted or narrow-minded that understanding may be. In the libraries of our minds, we try to shelve this latest tale of terror under psychology or religion or politics or feminism. “Experts” from all of these disciplines have appeared on a range of talk shows and news broadcasts lately, variously depicting the shooter as a godless gun nut, an insecure young white male, an autistic videogame addict.

I’m normally drawn to literature for explorations of human madness, but this time around, I’m finding more appropriate parallels and metaphors in music, specifically in the science of sound. Maybe this is due to the title of Nate Silver’s recent book, The Signal and the Noise, which looks to mathematics and statistics as they relate to pundits and would-be prognosticators. More specifically, Silver discusses attempts to predict future events, such as elections, and wonders why most of these attempts fail.

At this moment, here’s the future-oriented question most of us struggle to answer definitively: How can we keep something like the Newtown killing from ever happening again?

Such a complex question inevitably inspires a great deal of “noise,” the static and chatter that surrounds and distorts a sensible discussion despite having only tangential relevance to it. For example, we’ve heard many conversations about “mental illness” despite the lack of any confirmed diagnosis in the case of the Newtown shooter. Others have attributed his motives to celebrity-seeking, without any evidence to that effect, and thereby condemned the media. One person ironically hijacked the persona of a media celebrity (Morgan Freeman) in order to make his own anti-media point more popular in the world of social media.

All of this is noise. When we listen to it and perpetuate it, we lose track of the signal: the clear message buried inside.

Signal and noise have been core components of electronic music for decades now. I grew up during a time in which music was transformed by the appearance of the synthesizer. These electronic devices use oscillators to generate a source tone (a discrete signal like a sine wave, or a random one like white noise) that is subsequently shaped by a series of filters and envelopes to create a final note or sound (helicopter rotors for a Pink Floyd album, for example). The resulting signal can then be amplified and heard through headphones or stereo speakers.
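By way of illustration only—and assuming nothing more than Python with NumPy, a choice of mine rather than anything borrowed from the synthesizer world—here is a minimal sketch of the two kinds of source signal mentioned above: a pure sine wave and white noise, mixed into the sort of raw material a synthesizer (or an essayist) has to work with.

```python
import numpy as np

SAMPLE_RATE = 44_100  # samples per second, a common audio rate
DURATION = 1.0        # one second of sound

# A time axis: one sample every 1/44,100th of a second.
t = np.linspace(0, DURATION, int(SAMPLE_RATE * DURATION), endpoint=False)

# A "discrete" source tone: a pure 440 Hz sine wave (concert A).
sine_tone = np.sin(2 * np.pi * 440 * t)

# A "random" source signal: white noise, roughly equal energy at every frequency.
white_noise = np.random.uniform(-1.0, 1.0, size=t.shape)

# A raw signal is often a mix of the two: a clear tone buried in static.
mixed = sine_tone + 0.5 * white_noise
```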

The science of synthesizers may be complicated, but I want to focus on two concepts related to signal and noise: filters and amplifiers.

The goal of a filter is to block and withhold unwanted material, such as a particular range of sound. An equalizing filter, for example, allows you to lessen (or conversely boost) the amount of treble or bass you might hear through your stereo system. Some physical filters, such as a coffee filter, are quite refined and prevent small particles from passing through. In a weird way, a bulletproof vest is a crude sort of filter; its function is to block the passage of ammunition through to the wearer.
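To carry that sketch one step further—again, a toy illustration under the same assumptions, not a model of any real synthesizer or equalizer—a filter and an amplifier can each be written in a few lines. The filter averages neighboring samples so that slow, steady content (the signal) passes through while rapid jitter (the noise) is damped; the amplifier simply multiplies everything by a gain.

```python
import numpy as np

def moving_average_filter(signal: np.ndarray, window: int = 25) -> np.ndarray:
    """A crude low-pass filter: each output sample is the mean of its neighbors.
    Smooth, low-frequency content survives; rapid, noisy jitter is attenuated."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

def amplify(signal: np.ndarray, gain: float = 2.0) -> np.ndarray:
    """An amplifier at its simplest: multiply every sample by a gain."""
    return gain * signal

# Used with the `mixed` signal from the previous sketch:
#   filtered = moving_average_filter(mixed)   # the buried sine wave re-emerges
#   loud = amplify(filtered)                  # ...and is boosted to an audible level
```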

Throughout our lives, we develop multiple filters to help us process and understand the world around us. Otherwise, we might feel completely overwhelmed 24/7/365—as if some of us don’t feel that way already. Some of these filters come prepackaged—in academic or religious instruction, for example. We construct and employ others through our own experience, the lessons learned in life. Like the bulletproof vest, these filters shield and protect us; they give us the strength and courage to venture into potentially dangerous situations, both physically and mentally. They help us face up to monstrosities.

At this point, perhaps a line of poetry is in order:

The heart knows no filter.

If you’re like me, the initial news of the Newtown massacre came as a shock to the system. My first responses were all raw emotion: grief, rage, fear. Heartache, of course, because the heart, not being bulletproof, was wounded.

I might add disbelief to the list, but I can’t. For starters, it would be a lie; Columbine and Aurora and all the previous mass shootings should have already prepared us for this. On top of that, disbelief is a secondary, filtered response. The prefix “dis” serves as the filter, processing the initial state of belief. When we said we couldn’t believe the news of the shooting, what we were really saying was that we didn’t want to believe. The filter acted to block the truth, to deny its passage and place in our world.

Pressing further, I would argue that, due to the intensity of our emotional responses, many of us were rendered temporarily insane by news of the Newtown shooting. This would explain our disbelief, our denial of reality, our futile grasp at adjectives such as “unthinkable,” “unimaginable,” and “inconceivable.” By the very definition of insanity, we “lacked reasonable thought.”

And so began the noise, the procession of unreasonable responses to the tragedy. Despite a nearly total lack of evidence, analysts began offering possible motives and explanations for the violence, many of them tailored to fit their particular areas of expertise or concern. Conversations about gun control gave way to discussions of mental illness and health care, all based on conjecture. One viral essay, a blog post from the mother of a difficult and sometimes violent child, claimed such a deep level of understanding that she equated herself with the shooter’s mother. It was an absurd reduction based on an unknown and complex situation. Such extreme filtration of the facts resulted in one of the most illogical and self-centered responses to the shooting, and yet it continues to dominate some discussions today.

Many of these filtered responses were amplified via the media and social networking sites like Facebook and Twitter. Immediately after the event, many of us reached out to family members and friends via phone calls, e-mails, and status updates and expressed our deepest, most sincere thoughts and emotions. Subsequently, we began to pass along the thoughts and feelings of others, mostly those that corresponded closely to our own beliefs. In other words, we filtered out the rest. We began to process the signal and, in some of the more extreme cases, distorted and completely lost it.

I thank President Barack Obama for providing such a relatively measured and balanced response in the wake of this tragedy. In doing so, he reminded me of the kind of leader the country needs at a time like this, someone more like Mr. Spock than the Incredible Hulk. He did not indulge the vigilante superhero fantasies that preoccupy so much of the American mindset and perhaps contribute to this sort of violence in the first place. (I urge you to read Steven Pinker’s book The Better Angels of Our Nature: Why Violence Has Declined for a much more thorough exploration of this.)

As Nate Silver reminds us, logic is a tough, impartial judge. Like the rules of law, facts and statistics are indifferent to your or my personal opinions or beliefs. As filters go, logical analysis is our most reliable option, difficult though it may be. That’s why, in the wake of extreme tragedies like Newtown, we must continue to educate one another and ourselves in the facts of the matter. It’s also why we need to place a renewed emphasis on logic in the education of our children. No tool will be more useful to them in the course of their lives.

With that in mind and by way of example, I offer some attempts at a logical analysis of various reactions to the Newtown massacre below, keeping in mind the question: How do we keep something like this from ever happening again?

We need more, not fewer, weapons. Pro-gun people often claim that an armed teacher or principal (or mall shopkeeper, movie theater attendant, member of the clergy, bystander…all of us, really, I guess) could take down hostile trespassers and prevent deaths in situations like this. Studies have shown otherwise. A more fully armed populace might prevent deaths, but it wouldn’t prevent the situations themselves, so this is at best only a partial solution to the question above. It might also have the opposite effect of increasing the number of such incidents, since more weapons would be in circulation and potentially available to would-be shooters.

Let’s say we pass a law (and some politicians have proposed such laws) that mandates weapons for school officials and teachers (and, by extension, shopkeepers and movie theater attendants and so on and so on). By that reasoning, the willingness to carry a weapon and undergo extensive training in its use would become a prerequisite for anyone applying for these jobs. You would need to feel ready and able to shoot and kill another human being when called upon. Pacifists, whether on religious or just plain moral grounds, no longer need apply, at least not until all the discrimination lawsuits have been settled.

Also, if we truly want to prevent any future bloodshed by arming ourselves even more, we have to trust in our ability to act fast and first—probably without time to fully assess a situation, and certainly without time to attempt a diplomatic or talk-down resolution—which means responding with immediate gunfire to any threat, real or perceived. This is the rationale behind the “stand your ground” laws in some states, which, as we have seen in the case of Trayvon Martin, can sometimes result in the death of innocent children.

These pro-gun arguments also conveniently ignore (filter out) the fact that more and more of today’s mass shooters wear bulletproof vests or combat armor on their sprees. Unless we’ve all been trained to be reliable sharpshooters, our chances at taking down an armed maniac before he takes us down are therefore slim. Shooting at him becomes more like kicking at a hornet’s nest.

In a “good-versus-evil” shooting match, there’s also a high probability that innocent bystanders will be wounded and/or killed by “friendly fire,” especially in dark or smoke-filled spaces such as the Aurora movie theater. I suppose that in order for all of us to feel completely safe from potential gunfire, we should be wearing bulletproof armor at all times. Consequently, our country begins to sound less and less like “the land of the free and the home of the brave.”

We need to restore God in the classroom. For starters, this argument always seems to focus first on the victims, not the shooter himself. Secondly, it baits an argument about whose God we should all bow down to. Thirdly, this argument conveniently neglects (i.e., filters out) the large number of violent historical events, some described in gruesome detail in the Christian Bible, in which God supposedly commands others to kill on his behalf. These people, some of them respected figures in religious history, dutifully obeyed the voices they heard in their heads. Chew on this: What if the all-knowing God spoke to the Newtown gunman and told him, “There is one and only one way to wake up Americans to the need for gun control. If you go into a kindergarten and kill dozens of small children, I guarantee you that gun laws in your country will change and that thousands more lives will subsequently be saved.” Religious philosophers, discuss.

We need to prevent the mentally ill from owning guns. For starters, the buyer and owner of the guns used in the Newtown killing was the shooter’s mother, and she wasn’t mentally ill. By all accounts, her son hadn’t been officially diagnosed as “mentally ill,” either. Therefore, this restriction would not have prevented the slaughter in Newtown.

Restrictions that focus on mental illnesses also seem to assume that such conditions are both evident and permanent, that they manifest themselves from birth and remain consistently visible throughout a person’s lifetime. Anyone who has the slightest inkling of knowledge about psychology knows this is hogwash. Do you feel the same amount of mental stability each and every hour of each and every day? Did you feel 100% mentally stable when you first heard the news about the Newtown killing?

If we feel that we must prevent the “mentally ill” from owning guns, then we must establish an “acceptable” level of illness when it comes to owning guns—that level at which a person is at risk of harming either the self or another living creature. One could argue that nearly everyone feels capable of harming the self or another living creature at some point in his or her life, either during a case of severe depression or in a moment of stress-related rage. With that in mind, the “no guns for the mentally ill” proposal could mean, in its most preventative application, no guns for anyone.

Speaking of acceptable levels of mental illness, let’s consider all those people who heard about Newtown and felt compelled to rush out to buy more guns. Some felt sure that they needed these weapons to protect themselves and/or their loved ones from future incidents. Others were afraid that their personal gun rights were about to be stripped away, so it was best to stock up now and hope for a grandfather clause that would protect their stash. Such extreme and self-centered responses to this national tragedy suggest a form of delusional paranoia, itself a mental illness, yet these are the supposedly reasonable folks intent on protecting us all from the irrational gunmen.

Here’s another inconvenient fact related to mental health: In many gun-related incidents (though not, it seems, Newtown), a major factor in the shooting is the consumption of alcohol. If we expand our scope to consider all alcohol-related fatalities, we find many, many more deaths of innocent people and children caused by drunk drivers each year. Grouping these deaths together sends a clear signal; spacing them apart renders them more like noise. That may explain why we see few people either suggesting or rallying around a proposal to ban alcohol in order to prevent deaths, either by gun or by car. Another argument for another time, perhaps.

Stricter background checks will prevent these killings. This is an argument of hindsight, which, as we all know, is 20/20. The purchase of a gun will always precede a criminal’s first armed robbery, rendering a background check ineffective. What we really need is a foreground check. Absent psychic powers of prognostication or the development of software that aggregates all your personal data and predicts whether or not you’ll become a mass murderer, we’re still a few miles shy of the finish line with this proposed solution.

Ban assault weapons. This one makes sense, though once again it’s a partial solution. Even if it can’t stop shooters from doing harm, it can at least limit the amount of damage done. In other words, it saves lives.

Think of it this way: an assault weapon is an amplified version of a single-shot gun. It makes a weak signal stronger. It allows one lonely, unstable voice to send out a disproportionately loud message. Even though that message may be unclear or irrational, it echoes throughout the culture, filtered and amplified by our own voices as we try to make sense of it, voices that are in turn amplified by the many forms of media at our disposal.

What we’re left with is chaos and cacophony. In the world of music, it’s called feedback.

Put more simply, it’s noise.


Callings

Hugh Jr. and Hugh Sr. outside Gillette Stadium, one of the last photos taken of my dad

 

“You missed your calling.”

Fathers say perplexing things sometimes, and Father’s Day seems like as good a day as any to remember them. In a solemn moment, my father offered the above observation several years ago, and it haunts me the way an echo haunts a canyon—mainly because I’d been turning the same notion over and over in my mind for many years before that.

Some people are born lucky and have but one calling that speaks to them loudly and clearly all their lives. When I see these people—and they show up with a refreshing frequency on reality shows such as “American Idol” and “So You Think You Can Dance,” the latter being one of my absolute favorite television experiences of all time—I feel stirrings of both kinship and envy. I admire their devotion and dedication to their talents and consider the depth of my own past commitments to the written word. At the same time, I am reminded of how often my loyalty to literature has slipped and faltered. There too, echoes from the past resound.

I grew up in an era when “doctor” and “lawyer” were the two top attainable career goals for one’s children. Sure, “president” got mentioned fairly frequently, but I could always detect a catch in my parents’ voices when they said it. There was realism and practicality—the common sense my father frequently extolled—and then there was the dreamer’s realm of fantasy—those “pie in the sky” aspirations that might tempt us for a while but would ultimately bring us up short. As anyone who grew up playing the game of “Life” knew: you drew the largest paychecks from “doctor” and “lawyer,” and “president” wasn’t a viable option on the game board.

Alas, from an early age, I chose another option that wasn’t on the board: the dreamer’s realm of fiction writing.

When I watch the interviews on talent shows, two types of participants often reduce me to tears: contestants whose family members so fully support their goals that they’ll pack up the minivan and drive across the country together for the auditions, and contestants whose families have abandoned them and left them to follow their artistic aspirations alone (and let’s be honest, when that aspiration is something like dance, there’s more than a little homophobia at play in many of the reactions).

“You’ll never make a decent living at that,” my father warned me when I first mentioned my plans to become a fiction writer. “Do you really think someone would want to publish something like that?” he asked after reading one of my short stories. “I’d be ashamed of myself,” he added—his subjective take on the choice of a front-and-center gay narrator, perhaps.

It was the last time I would show him any of my work. Years later, after that story had been published along with several others, he summed up his feelings by unknowingly paraphrasing the final line of a James Wright poem that has haunted many an aspiring writer:

 

I lean back, as the evening darkens and comes on.

A chicken hawk floats over, looking for home.

I have wasted my life.

 (from “Lying in a Hammock at William Duffy’s Farm in Pine Island, Minnesota”)

 

My father may have been drunk when he informed me that I had wasted my life; after all, he rarely called me when he was sober. Even so, his feelings were made quite clear when he demanded that I forgo my own graduate-school commencement ceremony in order to come home and attend my sister’s graduation from medical school. At the party following that event, my father mentioned matter-of-factly to a friend that I, too, had just graduated from school, the University of Idaho. The only problem was that I had just received—in absentia, of course—my MFA degree from the University of Iowa, the premier writing program in the country.

All water under the bridge, I told myself as I headed back to my hometown a year ago to spend extended periods of time with my father in his final months. During those stays, the inevitable questions arose: “Why didn’t you become a doctor?” “Did you ever think of becoming a lawyer?”

I answered them dutifully and honestly, in order: “I can’t stand the sight of blood,” and “Yes, and I haven’t ruled it out entirely just yet.”

These questions about medicine and law were, in their own way, high compliments. He acknowledged that I had done well enough in school to pursue and excel in either one. By then, however, my father understood that neither career option reflected my true calling. He still wasn’t completely sold on the idea of writing, and I confessed that there were far too many days when I wasn’t, either.

Even so, my father had one request to make. He asked if I would write something exclusively for him: his eulogy.

My father did not ask me to do this because of some deathbed epiphany that his son was, indeed, a writer. He was harking back to his final words in that conversation from years earlier: “You have missed your calling.”

At the time, we had been discussing matters of the soul and spirit, and my father was suddenly filled with the belief that I should have gone into the ministry. “That’s your true calling,” he said. “You should be writing sermons instead of short stories.”

Now, nearly a year after his death, I reply: “Dad, at their best, they are one and the same, just like us.”

Liar, Liar, Pants on Fire, Hang Them Up on a Telephone Wire

"2 Men Looking On" from the "Perched" series by artist Jodi Chamberlain. Her multimedia work can be found at http://www.jodichamberlain.com (just click on the painting). Copyright by Jodi Chamberlain; used with permission of the artist.

By way of an introduction, I did some Internet research on the origin of the expression “Liar, Liar, Pants on Fire, hang them up on a telephone wire” and found this reference to an 1810 poem by the English writer William Blake:

 The Liar

Deceiver, dissembler
Your trousers are alight
From what pole or gallows
Shall they dangle in the night?
 
When I asked of your career
Why did you have to kick my rear
With that stinking lie of thine
Proclaiming that you owned a mine?
 
When you asked to borrow my stallion
To visit a nearby-moored galleon
How could I ever know that you
Intended only to turn him into glue?
 
What red devil of mendacity
Grips your soul with such tenacity?
Will one you cruelly shower with lies
Put a pistol ball between your eyes?
 
What infernal serpent
Has lent you his forked tongue?
From what pit of foul deceit
Are all these whoppers sprung?
 
Deceiver, dissembler
Your trousers are alight
From what pole or gallows
Do they dangle in the night?

If you do your own Internet research (and of course you do; you’re probably verifying this on Google right now), you’ll find numerous references to this Blake poem, some of them in fairly reliable places. There’s only one small problem: William Blake didn’t write it.

Additional research has yet to turn up the actual author of the poem or its place and date of origin. I did find, however, that the word “alleged” had at one time been attached to the claims of Blake’s authorship. Some of those who later reposted the poem and its origin myth carefully edited out the word “alleged,” perhaps to boost their own authority. I’ll leave it to the Blake scholars to hash out why this could or could not possibly be an obscure verse from the poet who gave us “Tyger tyger, burning bright”—which, when you say it loud, does share some strong poetic similarities with “Liar, liar, pants on fire” after all.

• • •

I am constantly amazed and dismayed by the number of people who both post and spread unproven statements and “facts” on the Internet. Quite a few of these people are close friends, many of them remarkably intelligent people. Even so, they often clamor to be the first ones to expose some new controversy or conspiracy via Facebook or Twitter. Maybe it’s the journalist in all of us, desperate for a scoop, but if there’s one thing I learned while staying with my ailing father and watching endless episodes of Judge Judy, Judge Jeanine, and Judge Joe Brown, it’s that hearsay is irrelevant in a court of law.

Let’s stop a moment to take a look at that word, hearsay. We hear something, then we say it. From ear to mouth in an almost direct line that bypasses the brain. It’s a key ingredient of soap operas and celebrity gossip, yet it also sadly infects the serious discussions of the day’s major issues. Perhaps in this technological age we need new words for hearsay, such as readshare or seetweet. “I read those statistics on someone’s Facebook wall, so I immediately ‘shared’ them by reposting on my wall.” “I saw a sports commentator on television talking about that big trade that might happen, and so I tweeted that it actually had because he sounded so convincing.”

Another classic example is the Reverend Martin Luther King, Jr. quote that went viral immediately after the killing of Osama bin Laden. Here’s the sentence in question: “I mourn the loss of thousands of precious lives, but I will not rejoice in the death of one, not even an enemy.” I have no problem with the quote itself, but King never said it. All credit is due to Jessica Dovey, who prefaced an actual King quote with that sentence. In its original form, she even set her own thought outside of the actual quotation, clearly distinguishing her words from his. (The teacher in me can’t help but point this out: Punctuation matters!) As the quote ricocheted around the Internet, however, the quotation marks got knocked off and the two speakers’ sentiments became one. Since the Internet never forgets, there are now thousands—if not millions—of lingering misattributions.

After this incident, I began to wonder just how many of those inspirational/motivational “posters” that people share contain actual quotes from the pictured sources. With today’s technology, anyone with a basic knowledge of photo editing can attribute just about anything to anyone and fool millions in the process. Just look at all those often-shared fake Ryan Gosling quotes. It’s all in good fun…until someone starts attributing hateful or hurtful language to him. (Chances are probably pretty high that someone already has.) Then what started as parody or satire slides uncomfortably across a blurry line into libel and slander.

I know I come across as a killjoy here, but on matters like this, I reveal my true colors as a classic skeptic. I reserve judgment until I can ascertain the facts of the matter, and I remain open to the possibility that objective reasoning may lend credence and support to multiple sides of an argument, even those that might contradict my own previously held beliefs. Like William Blake and many other artists of the Romantic era, I seek a direct link to the primary source—though, devout skeptic that I am, I question his subsequent claims to a mystical intimacy with a supremely divine being. Perhaps his position in literary history—at the crossroads of the Enlightenment (or Age of Reason) and the Romantic era (or Age of Emotion)—describes my own inner identity crisis as a writer. I would also argue that it describes the manner in which America currently teeters as a political force in the world, and our casual (not causal) relationship with the truth seems to be a reliable measure of that tension.

As an educational writer, I am often governed by a remarkably strict and lengthy set of mostly objective state standards and guidelines, even as I am asked to write fairly sentimental pieces, such as articles about developmentally challenged children overcoming adversity. Age of Reason, meet Age of Emotion. For a recent series of writing assignments, I was required to provide two credible and reliable sources for each sentence (yes, each sentence) in nonfiction articles. When and where possible, I was to cite the primary source—in other words, an eyewitness account or an original text, not some second-hand or third-hand citation or discussion. After all, elementary school standards (i.e. grades one through six) stress the significance and importance of primary sources in research. This is not “elitist, ivory tower” graduate-school-level academia stuff.

In my own college-level classrooms, I likewise impressed upon students the necessity of primary sources in validating claims, especially as we grappled with sensitive and controversial issues such as abortion, capital punishment, and religious freedom. By extension, I reminded them that primary sources were crucial in evaluating the claims made by advertisers and politicians, both of which would often seek to twist the truth to suit their own agendas. By further extension—and hoping to avoid turning those young people from skeptics into cynics—I reminded them that they needed to evaluate the primary source itself and take into account any conflicts of interest. After all, the makers of Brand X are likely to promote studies that show that their product is more effective than Brand Z, especially when those studies have been planned and paid for by the makers of Brand X.

I had hoped that this kind of education, common in most schools and colleges, would serve the students well in their adult lives and help them to make objective, reasonable, and justifiable decisions on all matters great and small, from the election of presidents to the selection of a new toothpaste. (Is it more important to remove plaque, avoid gingivitis, eliminate bad breath, or have whiter teeth? Honestly, a skeptic can spend hours analyzing the endless varieties of toothpastes offered by each brand these days.) I wanted students to realize the great potential and power they had to sort through all of the distractions and clamor around them in order to get at the truth, even if that earned truth challenged their preconceptions and ideological dispositions. In other words, I wanted them to be open to new information and perspectives, even if this led them to change their minds. After all, the human capacity for change and adaptation is a key aspect of our forward-moving evolution. Yes, I said it— and if you don’t believe in change or evolution, enjoy the view from your rut.

In the end, however, it appears that preconceptions and ideological dispositions still get the best of most of us. On social networks, people on the left post and spread unverified stories and statistics that seem to prove their political points; people on the right post and spread unverified stories and statistics that seem to prove their political points. Some of these are easily proven to be hyperboles, misstatements, or outright lies; others are carefully twisted and redirected versions of the truth (the process we all know as “spinning”). In nearly all instances, however, the posters (and re-posters) risk looking naïve, misinformed, ignorant, or just plain stupid when the claims are analyzed and the facts made clear.

That risk—and the willingness of so many people to accept it for the most trivial of reasons—fascinates me. Checking the validity of some of these claims and stories takes less than a minute, especially if you’re clued in to sites such as snopes.com and politifact.com. (Disclaimer: These sites are good places to check first, but on their own they are not definitive arbiters of right and wrong. They can, however, provide valuable leads in a search for those coveted primary sources I mentioned earlier.) Despite that, there is still the temptation to hit that “share” button, which takes but a split second and provides that little hit of adrenaline that information junkies seem to crave. After all, if they wait too long, someone else might scoop them and ruin their status as the first (if not primary) source of breaking news. Take that, Fox News and CNN!

(Topical side note: The death of Whitney Houston on February 11 was a prime example of this. Even before reports had been confirmed, dozens of people raced to post the story as fact on their Twitter and Facebook feeds. Perhaps they all wished to be, if I may twist the New York Times’s catchphrase slightly, the “tweeters of record.”)

In many instances, perhaps, the risk of being wrong is offset by a firm belief that we are right. Even if the statement is later shown to be false, the underlying “truthiness” of it remains. (Thank you, Stephen Colbert, for having introduced that word into the public lexicon. We clearly needed it.) To explain this more plainly, the truth doesn’t need to actually be true as long as it truthfully supports what we truly believe to be true. This is why politicians can make wild claims in our legislative chambers (such as the “fact” that Planned Parenthood’s primary mission is to provide abortions) and later claim “no harm, no foul” because the easily disproven “fact” was simply alluding to a basic “truth” anyway. (If I had to allow such reasoning in my classes…well, I couldn’t and wouldn’t. It undermines the entire premise of education, not to mention making a travesty of logic and reason. Civilized people simply don’t entertain this kind of idiocy.)

When it comes to believing the unbelievable (or giving credence to the incredible), it’s not just the lies that people tell that intrigue me—it’s also the lies that they hear and subsequently believe or let pass unchallenged. This takes us back to the crux of the reposting problem: people believe and spread untruths because, in some strange way, they WANT the untruths to be true somehow. Testing such a claim would be seen as evidence of weakness or a lack of confidence: If it feels or sounds right to you, it probably is—no further research necessary. In other words, if you’re a cynic, pointing out hypocrisy justifies your cynicism. If you’re paranoid, spreading conspiracy theories justifies your paranoia. If you’re (liberal/conservative), propagating the latest (conservative/liberal) scandal gives you that higher moral standing that you just know your side deserves. If you hate Company A or Team B, then that damning news about Company A or Team B simply MUST be true.

So let’s bring this full circle. Over the years, I’ve come to believe that you can learn a whole lot about a person from the lies he or she tells. Further, you can tell even more about that person from the lies he or she is eager to believe, especially at the risk of reputation and social standing. The more entrenched the belief, the less likely the person is to respond to reason. As a result, dissonance becomes an unavoidable consequence, and with that dissonance comes discomfort, anger, and sometimes violence. In all the hullabaloo, reason and logic are often the first casualties.

Perhaps a first-hand case study would help to illustrate the point. At a town-hall-style meeting a couple of years ago, Senator Bernie Sanders was taking questions from the crowd. One man, most likely a member of the Tea Party, asked him to explain why the government keeps raising his taxes. Sanders pointed out to him that President Obama had just passed tax cuts for the majority of wage-earners, but this man was insistent: His taxes had gone up. Sanders tried to reason with him, pointing out that he appeared to fit the demographic of people who received a tax cut, but no, there was no reasoning with the man. His taxes had gone UP. The discussion, as Sanders subsequently remarked in his typically brusque style, had reached an impasse and could not continue in that forum.

Had the man’s taxes gone up? Probably not. But here’s a pet theory I have about all of this. Had the amount of money being taken out of his weekly or monthly paycheck gone up? That seems likely, especially since many companies have been increasing the employee co-share of medical benefits as health insurance costs continue to rise. The disgruntled man might easily have lumped all of these paycheck deductions under the simple heading of “taxes,” giving him evidence, however misinterpreted, for his claim and belief. And what legislation had been introduced to try to deal with the problem of rising medical insurance costs? Why, the very legislation that members of the Tea Party are dead-set against.
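For what it’s worth, that pet theory can be made concrete with a small back-of-the-envelope calculation. The Python below uses entirely invented numbers—a hypothetical $1,000 paycheck and hypothetical rates—simply to show how taxes can go down while total deductions still go up.

```python
# Hypothetical paycheck arithmetic; every number here is invented for illustration.
gross_pay = 1000.00

# Before the tax cut: 15% withholding, $60 health-insurance co-share per check.
old_tax = 0.15 * gross_pay              # 150.00
old_health = 60.00
old_deductions = old_tax + old_health   # 210.00

# After the tax cut: withholding drops to 14%, but the employee's share
# of health premiums rises to $85 per check.
new_tax = 0.14 * gross_pay              # 140.00
new_health = 85.00
new_deductions = new_tax + new_health   # 225.00

print(f"Change in taxes:            {new_tax - old_tax:+.2f}")               # -10.00
print(f"Change in total deductions: {new_deductions - old_deductions:+.2f}") # +15.00
```

The paycheck shrinks even as the tax bill falls, which is exactly the kind of gap between belief and arithmetic that fuels the impasse described above.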

The man at the town hall meeting was angry, and he wanted that anger to be heard and validated. When the facts didn’t serve that purpose, he was faced with a dilemma. He could change his mind, or he could stand his ground.

His final decision: ignore the facts. Therein lies the root of the word ignorance: someone who can be shown the truth yet insist on the lie. Someone who can be told that something is fiction and yet still believe that it’s fact. In this respect, ignorance is a worse trait than naïveté or stupidity.

Just consider the synonyms for the feeble-minded: words like dull and dim. Ignorant people are hardly dull or dim. They carry the full force of their beliefs behind them. Their faith in one unwavering, indisputable view of the world can be both violent and catastrophic. After all, when one person runs through the village with his or her pants on fire, the flames can spread to every surrounding building. That makes the practice an emergency worthy of public concern.

Luckily, we are all able firefighters. With all this “Information Age” technology around us, the truth is closer to each and every person than it ever has been in the history of mankind. And yet, the most powerful tools can often be used both to create and to destroy. A hammer can help build a hope chest for our dearest possessions, but it can also break a window or kill a man. One post can spark the overthrow of dictators and the rise of democracy, but it can also ruin careers and destroy a country’s economy.

I’ll end this section with a paraphrase of Tim Gunn from the reality show “Project Runway”: “Use the wall thoughtfully.” He was talking about shelves of fashion accessories, but he might just as well have been talking about Facebook. Whatever we do, we should do it thoughtfully, with care and consideration. To do otherwise would be, quite simply, uncivilized.

(NOTE: There is a second and perhaps even third part to this post already underway…)

The Protestant Work-Study Ethic


Recently, Republican presidential candidate Newt Gingrich has been getting a lot of attention—and a standing ovation at a recent debate in South Carolina—for his insistence that students in low-income schools be given jobs to encourage their work ethic and to discourage them from lives of crime. Here is part of the original quote:

 “You have a very poor neighborhood. You have students that are required to go to school. They have no money, no habit of work. What if you paid them in the afternoon to work in the clerical office or as the assistant librarian? And let me get into the janitor thing. What if they became assistant janitors, and their job was to mop the floor and clean the bathroom?”

Now, even though Gingrich’s most recent religious affiliation is Catholicism (he has changed denominations over the years), his proposal reflects the Protestant Work Ethic as first described by the German sociologist and political economist Max Weber in 1904. Yes, despite its later association with the Puritans who came to America, the idea is a relatively recent European import. Simply put, the Protestant Work Ethic connects hard work with earthly gains that reaffirm one’s spiritual salvation. It is also, in Weber’s understanding, the foundational core of capitalism.

Gingrich’s remarks led me to recall several experiences from my own past that shaped how I would subsequently perceive issues such as labor and class. Since many of these relate to the idea of work in school, or “work-study,” I thought they might be useful in exploring the topic from another angle. (Note that these are primarily anecdotal responses; they are not meant to be definitive studies of the sociological or economic theories at play here.)

The hard work paid off...

I was relatively naïve about issues of class (among other things) all the way into my late teens. The people in my neighborhood, a somewhat distant suburb of Boston, all seemed fairly equal in terms of family income and social status, as did most of the students in my public school. In my limited experience, “private school” meant “Catholic school,” and so seemed more a religion-based option than something based on economic opportunity or educational ideology.

That all changed when I went off to college. I had worked hard in school, and my diligence paid off in the form of scholarships and financial aid awards. The prospects of having to work while I studied were nothing new; I had worked part-time as a dishwasher and record store sales clerk while attending high school. I understood the character-building significance—and financial reward—of earning my way, if only partially. After all, college wasn’t cheap, and I had my heart set on Middlebury College, one of the more expensive ones.

While at Middlebury, I held down various jobs: dining hall food-slinger (and later shift manager), professorial assistant, music librarian, data processing keypuncher, residential house director, janitorial helper, student tour guide, reunion host, and term paper typist (to pay for the typewriter I had purchased for school). As you might imagine, many of these were simultaneous, so my schedule was a patchwork of classes and work commitments. Thank God I wasn’t on a sports team, because there wasn’t much time left in the day or on weekends for practices and games. At the end of most days, while many of my peers headed downtown to drink or off to parties on campus, I finally had a chance to hit the books and do my homework. Around midnight, it was not uncommon for a last-minute “rush” typing job to come in, and so I could often be found at 2 or 3 in the morning, typing another student’s paper while they slept. We had no Red Bull back then, and I hadn’t yet begun my coffee addiction. For me, the work ethic boiled down to this: Work hard; then work some more.

Before you conclude that this is a “woe is me” tale, let me state this: I loved being so busy. Sure, I complained about some of the long hours and overcommitments, but I also met some of my best friends (both students and staff members) on the job, learned how to prioritize and multi-task, and got an insider’s education on how an American college operated. Many of these new friends were from backgrounds completely unlike my own, which broadened my perspectives in all directions. Plus, there was the pride of the paycheck at the end of each two-week period, even though most of it went right back to the college. (I guess this truly distinguished me from the “elites,” who, according to Gingrich the other day, are the only ones who “despise earning money.” Perhaps he meant that they prefer inheriting it? Like many others, I’m still scratching my head over that comment.)

At first, I was under the impression that everyone worked this hard at college. (Yes, Mr. Gingrich, the Protestant Work Ethic values had been instilled in me from an early age.) Then I learned what a “legacy” was: someone who got into college based, in no small part, on previous family ties. I learned what a “trust fund child” was: someone who had seemingly unlimited income from their parents, none of it earned through hard work. I came to understand more about private schools and how they functioned, to some extent, as a feeder system for the Ivy League and other top-quality colleges—if one’s family could afford private school tuition in the first place. This was all new to me, and so it took a while to adjust to the idea that friends of mine had multiple homes, routinely flew to exotic locations, or had jobs waiting for them the moment they stepped out of college.

That said, I often look back and wonder how much better I could have done in college if I hadn’t been scooping out mashed potatoes and ice cream for three to four shifts a week. I wonder how many more networking connections with influential people I could have made if I had been hanging out at the local bar or fraternity at night rather than typing up those fellow Midd Kids’ term papers. Even though I managed a fair number of extracurricular activities, I wonder how many more I could have undertaken were my schedule not so riddled with the responsibilities of the jobs I held. I also wonder what post-graduation prospects a final, senior-year semester might have yielded. (I graduated a semester early to save some money on tuition.) And when Mr. Gingrich suggests that low-income students should be saddled with similar responsibilities while simultaneously trying to get an education, I wonder if he realizes what an unlevel playing field he is proposing, and the extent to which such a field favors the already-rich.

This was never more apparent to me than the day I was asked to sit down with Middlebury’s financial aid director to address a problem with my assistance package. Her reason for seeing me was short and sweet: I had to stop working. I had reached the upper cap for my work-study aid, and so could no longer receive any paychecks from the college. Slightly confused, I asked if I could continue working outside of the auspices of the financial aid office, as some other students did. The answer: No. Why? Because I was on financial aid.

“So let me get this straight,” I said to her, struggling to comprehend the conundrum. “Because I don’t have much money, I’m limited in the amount of money I can make. But if I did have money and didn’t need financial aid, I could make as much as I want.”

That was correct. But, of course, she pointed out, if I didn’t need financial aid, why would I work in the first place?

Hmmm. As Spock, the intergalactic king of logic, would say, “Fascinating.”

(As it turned out, the director’s paycheck was, in some ways, reliant on my own. Remember that job I had in data processing? Part of it was the computer entry of faculty and staff time sheets and salaries so that their checks could be printed on schedule. As my supervisor in the computer center pointed out to the financial aid director, I couldn’t be let go without jeopardizing the entire college payroll system. Somehow, over the course of the next few days, an “adjustment” was made to my financial aid work-study limit. Interesting side note: one of the other student keypunchers, likewise entrusted with some rather hefty responsibilities as part of her work-study award package, still works in that department to this day.)

Since that encounter, I have remained keenly aware of how work-study programs affect the college community. It didn’t take long to witness a casualty of the program.

Based on how well my work-study experience had qualified me for post-college positions (another plus of the system), I was hired as something of an apprentice dean at Middlebury. I was mainly in charge of student housing, but my job description also required that I serve as an academic adviser to members of the incoming freshman class.

That was when I met Rachel (not her real name), a smart and vibrant first-year student with the instant likeability of a budding movie star. She was also a financial aid recipient and holding down a work-study job while meeting the demands of a rigorous academic schedule. Rachel had been incredibly nervous about starting out at Middlebury. She had also been accepted to a state school, and her family felt certain that they could afford four years there. Middlebury, however, with its higher prestige and much higher tuition, was a gamble, especially since financial aid at the time was awarded on a year-to-year basis. Who knew what assistance, if any, Rachel might receive in years two, three, and four? (I liken this, in some regard, to all those corporations demanding that tax rates and government regulations remain the same for years on end so that they can engage in long-term planning. Seen from this perspective, you can understand their point.)

Fast-forward three years from my one-year position at Middlebury. Back in Massachusetts, a friend and I had just seen a movie and were settling into a booth at a fast-food restaurant for post-film discussion. Who should come up to take our order but Rachel.

I was incredulous. This was the middle of the school year; why wasn’t she up in Vermont attending classes? As Rachel told me, she hadn’t received nearly enough financial aid assistance to afford a second year of Middlebury. Her grade-point average had slipped due to her work-study overload, which in turn made her ineligible for a merit-based scholarship. As a result, she had dropped out of school completely. Her family didn’t have enough money left over from the first year at Middlebury to cover the tuition at the state school. Now Rachel was living back home, working multiple jobs, and trying to make enough money as quickly as possible so that her Middlebury credits wouldn’t expire before they could be transferred to another college. Simply put, her earlier financial gamble hadn’t paid off.

As you might imagine, the entire college financial aid process has changed a great deal over the years. Some of it has no doubt improved upon the earlier models, while other changes (such as the relationship to need-blind admissions policies) have led to greater uncertainty and unpredictability.

Even so, Newt Gingrich’s comments reminded me of another, subsequent episode at Middlebury College. This time, I had returned to the campus to become administrative director of the Bread Loaf School of English, a graduate program of the school. Based on my earlier experience as a “baby dean,” I was invited to join the college’s Diversity Committee. At one of those meetings, I raised the matter of work-study and how it reinforced some rather negative stereotypes and expectations on campus in regards to minority students. One of my former colleagues from the Dean of Students office disagreed with my observations, and so I invited her to join me on a quick tour of the campus.

Together, we walked into the college snack bar and the main library. I asked her to look around and tell me what she saw. It took only a moment for the shock to appear on her face. Behind the counters and scrambling about the stacks, foreign and minority students were working hard, taking orders and reshelving books. Their “customers” were mostly white and affluent-looking students. There on campus, the intersection of diversity initiatives and work-study programs had created a microcosm of the American service industry. White, privileged students enjoyed the luxury of free time between academic and/or athletic commitments, while nonwhite students labored to meet those same demands in addition to burdensome economic challenges.

This, too, has probably changed over time, or at least I hope that it has. Mr. Gingrich’s recent comments, however, raised concerns that some of those lessons remain unlearned.

College costs continue to rise; student debt grows more difficult to manage. As our country’s income inequality expands, the very idea of a level playing field for all, at least in the educational context, remains at risk. With it, some of our nation’s best and brightest—perhaps those most capable of envisioning and implementing solutions—may never rise to their full potential. For them, despite what politicians like Gingrich believe, the selective application of the Protestant Work Ethic just compounds the problem—and in the end, doesn’t really work well at all.

American Anger, Part One

Preface: This is Part One of what I hope will be an ongoing, potentially year-long exploration of this subject. The topic seems well suited to the “blog” format, serving more as a catalyst for conversation than as a definitive treatise. I look forward to continuing the conversation in hopes of reaching some constructive insights, conclusions, and potential remedies.

As you’ll no doubt quickly note, my take on American anger is a rather personal approach; your choices for taking on the topic will no doubt differ. Despite that, I’ll be using terms like “Americans” and the first-person-plural pronoun “we” rather liberally throughout the entries. I do this merely as shorthand, fully aware that it’s literary sleight of hand, both a contrivance and a conceit. I don’t intend to suggest that there are absolute universal truths here, especially since the insistence on universal absolutes in society tends to generate the very anger I’ll be analyzing.

As always, thanks for reading, and even more thanks to those whose responses provoke or inspire further insight.

 1. Use Your Words

American anger fascinates me.

Here we are, billing ourselves as the “best, greatest, richest, most powerful” nation in the world, and yet people all over the country claim to be angry. Watching the growth of the Tea Party movement in 2010 was like watching the now-famous scene in Sidney Lumet’s 1976 film “Network” in which unstable talk-show host Howard Beale inspires his viewers to lift up window sashes across the country and shout out into the night: “I’m mad as hell, and I’m not going to take it any more!” Everyone was mad as hell for different reasons, but there was a feeling that bringing all that rage together into one unifying cry might make it either coherent or effective. (Spoiler alert: it didn’t.) In many ways, it echoed a couple of the poet Walt Whitman’s famous lines from “Song of Myself”:

            I, too, am not a bit tamed, I too am untranslatable,

            I sound my barbaric YAWP over the roofs of the world.

It was not a specific word or words that Whitman called out into the night; it was not an intelligible phrase or clause. It was a sound, an utterance, savage and undomesticated, more animal than human. In a way, Whitman was suggesting, people had been making those sounds for years and would continue for many more, well beyond his own eventual death. We might never come to know who he was or what he meant, but discussion about it “shall be good health to you nevertheless.”

In this election year, 2012, we are hearing quite a few YAWPS across the political landscape, some less tamed and translatable than others.

In addition to all the contemporary social and political dissent, there is perhaps an even more powerful undercurrent of dissonance—the lack of a rational link between one’s beliefs and one’s reality, however either one is perceived. It’s the feeling we get when we pay top dollar for something only to find that it’s cheaply made or ineffectual. We vote for a candidate based on his or her promises only to find those promises later ignored. (To provide some continuity between this blog and an earlier entry on football’s “Tebow Time” phenomenon, dissonance was that sickening feeling the hyper-religious quarterback’s more fanatical fans experienced when the Denver Broncos were humiliated by the New England Patriots in a recent playoff game. For the sake of divisional fairness, it was also the sickening feeling the Green Bay Cheeseheads felt when Aaron Rodgers and the nearly-perfect Packers succumbed to the New York Giants the very next day.)

I’ll be talking much more about dissonance and its relation to anger later on, but it’s worth mentioning here just to keep the idea in mind as the discussion of anger progresses.

As Americans, we see anger glorified throughout our culture, from movies to music, sports to politics. Despite our supposed Judeo-Christian foundation, we have movements in the country that promote violence and greed over diplomacy and charity. As our young people’s generation comes to define itself (or, to put it in the passive voice, lets itself be defined by others) as “ironic,” it also grows indifferent to irony’s cousin, hypocrisy. Sarcasm provides an easy segue from skepticism to cynicism, arming many a political pundit on both ends of the political spectrum with the equivalent of sniper’s bullets.

When anger wears us down into a numbed state of depression, anger’s inward-turned doppelganger, we shrug our shoulders and try to focus our attention elsewhere. For some, this may translate into another glass of wine, another dose of Xanax, another marathon session watching the Real Housewives of Whatever County spit their venomous barbs at one another. Other folks may start in on the next level of “Angry Birds,” one of the highest-grossing games in our country. Or perhaps you want to take a virtual trip around the world—killing people and blowing things up along the way—in America’s top game of the Christmas season, Call of Duty: Modern Warfare 3. What a wonderful gift to commemorate the birth of the Prince of Peace. (See how easily the sarcasm comes?)

Many players of these games claim that such pastimes are cathartic—that they help “release tension” and “blow off steam” at the end of a stressful day. If that were truly the case, violent movies and first-person shooter games would leave viewers and players in states of blissful repose. Instead, they ramp up the emotions and boost the adrenalin. (Full disclosure: I play an occasional hour or two of “World of Warcraft” myself at the end of a busy day, so I know that to be successful as a warrior, you need to “generate rage.” It’s right there in the game manual.)

So maybe the term cathartic is a canard when we choose violence-based entertainment as a relief or release of our internal anger and frustration. I’d argue that the proper word is indulgent. Pressing further, I’d express concern that a more appropriate adjective might be catalytic. America seems to like things super-sized and hyper-accelerated, so it’s no surprise that when it comes to anger, amplification isn’t just acceptable; it’s preferable.

An admission: cathartic, indulgent, and catalytic are big words. I’m a writer, so I sometimes use big words. That’s because language, like anger, fascinates me. They’re both acts of expression that have rich, sometimes hidden, roots and origins. Example: I wrote a poem about one such instance, the word decimate. Many people think it means “to destroy completely and indiscriminately.” In fact, the word is based on the Latin root for the number ten and originally meant a methodical act of slaughter in which exactly one victim in ten was killed. (Ironic, eh?) The meanings of words may evolve over time, but the origins of their species are there for all to comprehend and appreciate.

But I digress. Let’s return to the notion of anger as a cathartic force and set forth a little thought experiment. Imagine that you’re a parent dealing with a red-faced child whose inexplicable rage has sent cereal, milk, and orange juice flying across the kitchen. To calm the child, would you—

  a. put on some soothing, New Age music and send the child into the corner for a five-minute “time out” period of self-reflection?
  b. tell the kid to march off to his/her room and go the f*ck to sleep?
  c. tell the child to imagine having an automatic weapon in his/her hands during a stressful, high-stakes combat mission whose outcome will determine the fate of all mankind?
  d. ask the child, “Why are you so angry?”

Now imagine America as a red-faced child.

Modern child-rearing gurus recommend option d. Many advise parents to respond to their children’s extreme behaviors with the expression “Use your words.” This doubles as both an encouragement of self-expression and a redirection of energy. It’s a graceful dance step that moves the child away from visceral reaction toward more cerebral creation. Emotions, meet intellect. Intellect, say hello to emotions.

To some, however, “use your words” is just so much poppycock. To quote the blogger MetroDad, a rather literate and opinionated New Yorker: “I think it’s a bullshit mantra that only helps raise the next generation of pussies.” Like it or not, that’s using your words.

In some ways, “use your words” promotes a form of therapy. It seeks to replace the outburst with what we might call the “inburst,” a breaking-and-entering of the psyche to see what secrets are hidden in the closets or nailed beneath the floorboards. We ask a child “What’s really bothering you?” expecting to hear about stolen snacks or missing pets, but sometimes the answer shocks and surprises. I’d argue that this is true even when we as adults ask the question of ourselves.

It’s no surprise that many people view creative expression as a form of therapy. Just read the inexhaustible output of writers writing about writing, a quite profitable if overindulgent niche market. We’ve even “verbed” the word “journal.” Did you know that people who journal regularly are able to reduce their stress and manage their anger more effectively? I could say the same thing about blogging, but then there’s that quote up above from MetroDad. (I kid MetroDad. His blog entries are actually quite amusing, entertaining, and even insightful.)

Too often these days, when it comes to using our words, people settle for quick fixes rather than deep introspection. It’s the 140-character Tweet of the daily pet peeve versus Plato’s lifetime of examination. I’m not suggesting that everyone sign up for therapy sessions, but I do ask friends and colleagues to strive for clarity and honesty in their communications. That often requires work. True expression isn’t effortless.

Even as I write this, I am surrounded by reference materials. As a writer, it often isn’t enough simply to “use your words.” As you’ve noticed, I often rely on the words of others, be they expressed in song or psalm, poetry or prose, book or blog. I would be lost without the dictionary, the thesaurus, the atlas, the encyclopedia, and the patient guidance of my editor/husband—even though all of those things can tempt me along time-consuming tangents with their fascinating insights. Likewise, I am inspired and guided by the works of scholars like Geoffrey Nunberg, whose books and NPR spots on language have both educated and entertained me. Honestly, how many of you get excited when you see an essay entitled “The Politics of Polysyndeton”? Hands? Hands? Hello?…

My own fascination with language started in second grade, when my wonderful teacher Miss Burke introduced me to bookmaking with the simplest materials, and it has grown deeper ever since. Even so, one catalytic instant stands out. (Please, if you still don’t know what catalytic means, either look it up on your iPad’s dictionary or ask your car mechanic. After all, these elite, ten-dollar words aren’t reserved for professors holed up in their ivory towers. If you truly love your country, learn the English language. Have I made my appeal clear in both liberalese and conservatese?)

Elie Wiesel, winner of the Nobel Peace Prize in 1986 and recipient of the Congressional Gold Medal, is another humanitarian hero of mine. Wiesel spent most of his life coming to terms with the violence, anger, and despair he witnessed as a concentration camp prisoner during the Holocaust. I heard him speak about his experiences shortly after he received the Nobel Prize. One of his responses during a question-and-answer session has haunted me ever since.

“Americans,” he stated matter-of-factly, “have one of the most violent languages in the world.”

The truth of that comment struck me. No…it hit me in the face. No…it blindsided me. No…it knocked me out. No…it fell on me like a ton of bricks. No…it blew my mind. No…it bowled me over.

Everywhere I went and everyone I talked to—suddenly, I was keenly aware of the insidious presence of anger and violence in everyday American language. On one occasion, I felt compelled to alert a pacifist minister to her repeated use of violent idioms and imagery in a sermon on compassion. She stood there dumbstruck (as we say), amazed by the horrible truthfulness of the comment.

For a while after hearing Elie Wiesel speak, I too felt dumbstruck, “made silent by astonishment” (to quote Webster). As a writer, I also felt aware in a way I had never felt or experienced before. The Buddhist in me smiled silently. Mindfulness, after all, is one of the key concepts of the practice, summed up simply in the popular mantra “Be here now.”

And so here I am, now, in an American culture defined (in part) by its reactionary anger toward so many things—including each other. I’m struggling to understand that anger, both in myself and in others, and to use my words to describe it. But what do we talk about when we talk about anger?

Defining anger, as I hope to demonstrate in the forthcoming part two, is no easy task, but it’s well worth the effort. Our fate as a nation, if I can ramp up the election year rhetoric, may actually depend on it.

• • •

Playlist for “American Anger”

“Music is food,” says my artist-friend James “Mayhem” Mahan, and so this post comes with a playlist for the full multi-media experience. These are songs that fed my mind as I considered this post and its upcoming parts. It’s also collaborative, so if you’re on Spotify, I encourage you to contribute as well as to listen. Mostly it’s for fun…testing once again how all of this interactive interconnected technology works. Enjoy.

  1. Green Day, “American Idiot”
  2. Public Image Ltd., “Rise”
  3. Nine Inch Nails, “Terrible Lie”
  4. Kanye West, “Monster”
  5. Florence and the Machine, “Kiss with a Fist”

(You can listen to and help build this playlist on Spotify here:
American Anger)

Saying Goodbye to Santa Claus

Spoiler Alert: Santa’s “Big Secret” revealed in this blog entry.

Exhibit One: Me with Santa Claus at home in the 1960s, proof positive that he exists.

Now that Santa has flown in, tucked gifts under trees both hither and yon, and headed back to the North Pole for some well-deserved R&R, I feel it’s time to take a look at one of America’s biggest myths and think about how it may have affected us as a nation…or not.

But first, in the spirit of the holiday season, I offer a nostalgic visit to my hometown in Massachusetts circa 1970. Picture plastic candles in each street-facing window and a lacquered pinecone wreath adorned with a festive red felt bow on the front door. If you peer in through the spray-on snow frosting the windows, you can see me carefully filling a plastic garbage bag with dozens of gifts. My parents watch, slightly puzzled but mostly silent, as I pull on my snow boots and mittens, then leave the house, bag slung over my shoulder.

A week or so earlier, I had learned a shocking truth that rocked my little world—a secret that had been kept by nearly every adult I had ever met. They had lied to me, these adults. People whom I had trusted entirely, including the local minister and my own parents, had taken part in an international conspiracy and perpetrated a myth, a fantasy, a fiction. The story included a conveniently distant setting, a saintly protagonist (whom I had met in person on several occasions), and a desirable plotline that evoked grand themes of peace, good will, and generosity. To cover their tracks, my parents had even planted evidence: sleigh bells jangled as sound effects in the wee hours of Christmas morning; cookie crumbs and half-drunk tumblers of milk left on the metal TV table set up alongside the chimney.

All of this was an elaborate scheme that blurred the lines between fiction and nonfiction, between fantasy and reality. Young and gullible, I was easily duped. Of course there was a Santa. Of course reindeer flew. Why even question the physics of how, in one single night, a rather rotund man could pilot a craft to every single household around the world and leave presents for all the good boys and girls—and still have time to toss back some cookies and sip some milk in each abode?

In asking me to believe in such fantastic things, my parents taught me an important lesson that would be vital to my budding literary ambitions: how to suspend disbelief. In doing so, however, they taught a corollary lesson: how to suspend belief. In other words, in order to suspend my disbelief in Santa Claus, I also had to suspend my belief in many of the lessons learned in grade school (science, geography, math, etc.).

In some ways, then, the revelation that Santa Claus was a fabrication probably came as something of a relief to me. The dissonance between fantasy and fact, between what I was being told to believe and what I was learning to be true, lessened. That psychological summary may be a bit too deep to ascribe to an eight-year-old’s consciousness, so let me state it another way: Santa or no, the presents were still there on Christmas morning, and so all was well with the world.

Luckily for me, my parents didn’t serve up the revelation about Santa Claus with a simple “Sorry, kid, but that’s just the way it is.” They discussed the importance of symbolism and how this extended to the Santa myth, claiming that while Santa himself may not be real, the spirit of giving that he represents lives on in the hearts and souls of all those who have heard his story. Any fan of the famous “Yes, Virginia, There Is a Santa Claus” newspaper editorial might have accused my parents of plagiarism, but I could tell they were sincere.

Still: Such power in a fictional tale! Suddenly, my dream of one day becoming a fiction writer felt like a vastly more important, almost religious endeavor. Here was proof that a story, even a fictional story like Santa Claus, could have a great and positive effect on the real human world!

And so I headed off into the night with my makeshift Santa sack. Inside I had placed carefully wrapped toys and books for the kids, and on the second and third winters’ visits, some ribbon candy for the adults in each household. I carried on the tradition until, one year, something unexpected happened. Some families had wrapped and readied gifts and treats for me. Somewhat embarrassed by their assumption that I expected something in return, I ended the Christmas Eve tradition that same year.

For years, I forgot about this bit of personal history. I was recently reminded of it by an article about a Vermont teacher accused of being unprofessional and irresponsible for spilling the beans about Santa in a fifth-grade classroom. The teacher had asked students to list names of famous people in American history. In order to keep the lesson focused on facts, the teacher felt compelled to leave figures such as Winnie the Pooh, Harry Potter, and Santa Claus off the list. (I could not tell from the article if she allowed the also-mentioned Jeff Foxworthy and Justin Bieber to remain on the list, but that’s another discussion for another time.)

(The full article is here:
http://www.reformer.com/ci_19576402?IADID=Search-www.reformer.com-www.reformer.com)

The mother who raised the “unprofessional” and “irresponsible” charges against the teacher went on to say that teaching about Santa Claus was like teaching about religion: the topic is best set aside with recommendations to ask one’s parents about such things. That seemed fair enough…until I thought about the goals of education in general.

Since a good part of my day job (writing and editing educational materials) relies on the various state standards developed by school boards (many of them quite conservative) around the country, I know that “learning to distinguish between fantasy and reality” is a pretty important benchmark in the lower grades. (Keep in mind that the instance noted above took place in a fifth-grade classroom.) In other words, children are required to differentiate between nonfiction and fiction (fairy tales, myths, legends, and the like). Teachers are required to provide students with the skills and strategies to do this. By fifth grade, then, your average American student should have the reasoning skills to figure out the Santa thing on her/his own. Any parent who disagrees risks spotlighting their children as slow learners—perhaps along with themselves.

According to research done by researchers at Ithaca College and Cornell University in the early 1990s, the average American child learns the truth about Santa at age 7 1/2. After interviewing some 500 elementary-school children, the researchers also discovered that “Many children kept up the charade after they knew the truth…because they did not want to disappoint their parents.”

Parents, take a moment to reflect on the meaning of that last clause (no pun intended). Your kids may be duping you into believing that they still believe in Santa. I think back on my own behavior as a pseudo-Santa and wonder if that was, in some warped way, an effort to turn the lies my parents had told me into truths…ergo, my parents had not lied to me after all.

Further, Dr. John Condry, one of the authors of the Ithaca/Cornell study, reported, “Not a single child told us they were unhappy or upset by their parents having lied about Santa Claus. The most common response to finding out the truth was that they felt older and more mature. They now knew something that the younger kids didn’t.”

(You can read more about the study here:
http://www.nytimes.com/1991/11/21/garden/parent-child.html?pagewanted=2&src=pm)

This finding surprised me. “Not a single child”? Parents, take another moment to recall telling your child at the grocery store that he or she cannot have the toy or candy bar already plucked from the shelf. When you took the item away, was your child calm and well-mannered about it? Or was the response more like those captured in a recent Jimmy Kimmel spot in which the talk show host asked parents to tell their children, “Hey, sorry, I ate all your Halloween candy”? (Permissions permitting, the videotaped results of this rather non-academic study are here: http://www.huffingtonpost.com/2011/11/03/jimmy-kimmels-ate-halloween-candy-challenge_n_1074334.html)

In a 2006 opinion piece in the New York Times, Jacqueline Woolley wrote, “Children do a great job of scientifically evaluating Santa. And adults do a great job of duping them. As we gradually withdraw our support for the myth, and children piece together the truth, their view of Santa aligns with ours. Perhaps it is this kinship with the adult world that prevents children from feeling anger over having been misled.” What is this “kinship with the adult world” of which Woolley writes? Is it the tacit understanding that adults lie, and that it is OK for them to lie (or “support a myth”) on a grand scale?

(The link to the Woolley article is here:
http://www.nytimes.com/2006/12/23/opinion/23woolley.html?_r=1&oref=slogin)

Surely someone sees this Santa thing differently. For balance, I turned to a group whose opposition to myths and distortions is part and parcel of their identity: the Objectivists. This group is huge these days with Republicans and the Tea Party, both of which have renewed a fervent interest in the writings of Ayn Rand, particularly as they apply to self-determination and self-interest. Surely, the somewhat socialist “give liberally to the poor children of the world” Santa myth (I base that description on the story’s historical roots in relation to Saint Nicholas, who, by the way, was also the patron saint of pawnbrokers) would be anathema to such a group. And it is.

According to Andrew Bernstein, a senior writer at the Ayn Rand Institute, “Santa Claus is, in literal terms, the anti-Christ. He is about joy, justice, and material gain, not suffering, forgiveness, and denial.” Another quote from the article: “The commercialism of Christmas, its emphasis on ingenuity, pleasure, and gift buying, is the holiday’s best aspect—because it is a celebration, the achievement of life.”

(You can read the full piece, a celebration of the commercialism of Christmas, here:
http://www.aynrand.org/site/News2?news_iv_ctrl=1263&page=NewsArticle&id=7632)

All of this leaves me as puzzled about Santa Claus as I was when I learned the dark secret of his nonexistence. To this day, I give presents that have “From Santa” scrawled on the tag, and I try to mask my own handwriting despite the fact that the recipients know they’re from me. Likewise, I love surprise presents: gifts that appear out of the blue from anonymous sources, those random acts of kindness that rekindle our faith in human generosity. (Special kudos to Ben and Jerry’s for a coupon they once published that granted a free ice cream cone to the person in line behind you at one of their scoop shops. Brilliant.)

The spirit of Santa lives on and is no lie. It survives despite the greed and entitlement running rampant through our society today, malignant growths that threaten human compassion and generosity. I’d even argue that the spirit of Santa, despite its secularization over the decades, also maintains its ties to the spirits of nearly every religion, even those that claim independence from mythology or dogma.

In the years ahead, perhaps we can pull that spirit back from fiction and establish it fully as year-round fact. After all, nearly every child longs for Santa to be more than a seasonal fantasy. Maybe it is up to the child within us adults to make it so.

Postscript: I dedicate this blog entry to my father (pictured above as Santa) who passed away in 2011 and was very dearly missed this Christmas season. His many gifts to me continue to resonate throughout my life.

Cursing the Darkness

"Better than Cursing" - a self-portrait from sometime back in the 1980s, when I was experimenting with my first SLR camera.

“It’s better to light one candle than to curse the darkness.”

Around this solstice, when the nights are at their longest, the quote above offers a rich opportunity for reflection. Derived from a Chinese proverb, it became the motto of The Christophers, a group founded in 1945 whose mission statement directs it “to encourage people of all ages, and from all walks of life, to use their God-given talents to make a positive difference in the world.” Years later, Peter Benenson, the founder of Amnesty International, alluded to the proverb during a Human Rights Day ceremony in 1961; the saying most likely inspired Amnesty’s current logo, a single candle entwined in barbed wire. Most famously, perhaps, the proverb was paraphrased by Adlai Stevenson, the U.S. Ambassador to the United Nations, in a tribute to Eleanor Roosevelt shortly after her death in 1962. “I have lost more than a beloved friend,” Stevenson said. “I have lost an inspiration. She would rather light a candle than curse the darkness, and her glow has warmed the world.”

We’ve heard many references to light, warmth, and “that special glow” during the holidays. They figure prominently in nearly every secular and religious celebration of the season. We’ve also endured a fair share of quips and criticisms of those same sentiments. These aren’t entirely without merit, especially given the escalation of consumerism (and its twin sibling, capitalism) in American culture. You could argue throughout the entire long night over whether to call December 25 Christmas, Xmas, or ¢hri$tma$. Some people seem to make good money doing just that on the cable news networks. For them, it seems, Shakespeare was a prescient pundit when he wrote the line “Now is the winter of our discontent.”

For many Americans, this is indeed a winter of discontent. Some find themselves in dire circumstances unexpectedly and unwillingly; others seem determined to foster ill will and contempt, as if to spite those in search of a spark in the darkness. Cynicism has infected the populace, manifested most conspicuously by vocal members of the Tea Party and Occupy Wall Street movements. Our political leaders, present and possibly future, bicker and argue, preferring to snipe at one another in an effort to score points rather than work together on real solutions to the day’s problems. Even literature and music, once places of refuge and sanctuary for me, have become overloaded with sass and snark. (More on this phenomenon in subsequent blog entries.)

In short, we find ourselves in dark times. What I wish for most of all this holiday season is a cultural solstice of sorts, a return to what truly matters in our lives. For me, this requires action and education on all our parts, the opposites of laziness and ignorance. We should be fostering community and passionate discourse, not demonizing one another and trading cheap shots in some petty game of one-upmanship. We should, in short, illuminate one another’s lives and reflect the light cast on us by others in the spirit of giving back and returning the favor.

In closing, I offer up this quote from the New Testament of the Christian Bible. I’d prefer to change the word “armour” to “vestments,” but who am I to edit the work of countless translators and editors before me? I chanced across the line while doing research for this post. Then, when I went to verify the wording in my own Bible, the book opened directly to the proper page. Make of that what you will.
The night is far spent, the day is at hand: let us therefore cast off the works of darkness, and let us put on the armour of light. (Romans 13:12)